Compare commits


8 Commits

Author SHA1 Message Date
DarkSky
52f4461fff chore: bump deps 2026-03-11 15:50:07 +08:00
renovate[bot]
d7d67841b8 chore: bump up file-type version to v21.3.1 [SECURITY] (#14625)
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [file-type](https://redirect.github.com/sindresorhus/file-type) | [`21.3.0` → `21.3.1`](https://renovatebot.com/diffs/npm/file-type/21.3.0/21.3.1) | ![age](https://developer.mend.io/api/mc/badges/age/npm/file-type/21.3.1?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/npm/file-type/21.3.0/21.3.1?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2026-31808](https://redirect.github.com/sindresorhus/file-type/security/advisories/GHSA-5v7r-6r5c-r473)

### Impact
A denial of service vulnerability exists in the ASF (WMV/WMA) file type
detection parser. When parsing a crafted input where an ASF sub-header
has a `size` field of zero, the parser enters an infinite loop. The
`payload` value becomes negative (-24), causing
`tokenizer.ignore(payload)` to move the read position backwards, so the
same sub-header is read repeatedly forever.

Any application that uses `file-type` to detect the type of
untrusted/attacker-controlled input is affected. An attacker can stall
the Node.js event loop with a 55-byte payload.
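The loop mechanics can be sketched in a few lines. This is an illustration of the arithmetic described above, not file-type's actual code; only the 24-byte sub-header size and the resulting `-24` payload come from the advisory:

```javascript
// An ASF sub-header is 24 bytes, and after reading it the parser
// skips `payload = size - 24` bytes. With a crafted size of 0 the
// payload is -24, so ignoring that many bytes rewound the tokenizer
// to the start of the same sub-header -- an infinite loop. The names
// below are illustrative.
const ASF_SUBHEADER_BYTES = 24;

function positionAfterSubHeader(position, declaredSize) {
  const payload = declaredSize - ASF_SUBHEADER_BYTES;
  if (payload < 0) {
    // A patched parser rejects such input instead of looping.
    throw new Error('malformed ASF sub-header');
  }
  // With declaredSize = 0 this would return `position` unchanged,
  // re-reading the same sub-header forever.
  return position + ASF_SUBHEADER_BYTES + payload;
}
```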

### Patches
Fixed in version 21.3.1. Users should upgrade to >= 21.3.1.

### Workarounds
Validate or limit the size of input buffers before passing them to
`file-type`, or run file type detection in a worker thread with a
timeout.

### References
- Fix commit: 319abf871b50ba2fa221b4a7050059f1ae096f4f

### Reporter

crnkovic@lokvica.com

---

### Release Notes

<details>
<summary>sindresorhus/file-type (file-type)</summary>

### [`v21.3.1`](https://redirect.github.com/sindresorhus/file-type/releases/tag/v21.3.1)

[Compare
Source](https://redirect.github.com/sindresorhus/file-type/compare/v21.3.0...v21.3.1)

- Fix infinite loop in ASF parser on malformed input
[`319abf8`](https://redirect.github.com/sindresorhus/file-type/commit/319abf8)

***

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-11 13:58:31 +08:00
DarkSky
29a27b561b feat(server): migrate copilot to native (#14620)
#### PR Dependency Tree


* **PR #14620** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Native LLM workflows: structured outputs, embeddings, and reranking
plus richer multimodal attachments (images, audio, files) and improved
remote-attachment inlining.

* **Refactor**
* Tooling API unified behind a local tool-definition helper;
provider/adapters reorganized to route through native dispatch paths.

* **Chores**
* Dependency updates, removed legacy Google SDK integrations, and
increased front memory allocation.

* **Tests**
* Expanded end-to-end and streaming tests exercising native provider
flows, attachments, and rerank/structured scenarios.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-11 13:55:35 +08:00
renovate[bot]
02744cec00 chore: bump up apple/swift-collections version to from: "1.4.0" (#14616)
This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [apple/swift-collections](https://redirect.github.com/apple/swift-collections) | minor | `from: "1.3.0"` → `from: "1.4.0"` |

---

### Release Notes

<details>
<summary>apple/swift-collections (apple/swift-collections)</summary>

### [`v1.4.0`](https://redirect.github.com/apple/swift-collections/releases/tag/1.4.0): Swift Collections 1.4.0

[Compare
Source](https://redirect.github.com/apple/swift-collections/compare/1.3.0...1.4.0)

This feature release supports Swift toolchain versions 6.0, 6.1 and 6.2.
It includes a variety of bug fixes, and ships the following new
features:

##### New ownership-aware ring buffer and hashed container
implementations

In the `DequeModule` module, we have two new source-stable types that
provide ownership-aware ring buffer implementations:

- [`struct UniqueDeque<Element>`][UniqueDeque] is a uniquely held,
dynamically resizing, noncopyable deque.
- [`struct RigidDeque<Element>`][RigidDeque] is a fixed-capacity deque
implementation.

`RigidDeque`/`UniqueDeque` are to `Deque` what
`RigidArray`/`UniqueArray` are to `Array` -- they provide noncopyable
embodiments of the same basic data structure, with many of the same
operations.

[UniqueDeque]:
https://swiftpackageindex.com/apple/swift-collections/documentation/dequemodule/uniquedeque

[RigidDeque]:
https://swiftpackageindex.com/apple/swift-collections/documentation/dequemodule/rigiddeque

In the `BasicContainers` module, this release adds previews of four new
types, implementing ownership-aware hashed containers:

- [`struct UniqueSet<Element>`][UniqueSet] is a uniquely held,
dynamically resizing set.
- [`struct RigidSet<Element>`][RigidSet] is a fixed-capacity set.
- [`struct UniqueDictionary<Key, Value>`][UniqueDictionary] is a
uniquely held, dynamically resizing dictionary.
- [`struct RigidDictionary<Key, Value>`][RigidDictionary] is a
fixed-capacity dictionary.

[RigidSet]:
https://redirect.github.com/apple/swift-collections/tree/main/Sources/BasicContainers/RigidSet

[UniqueSet]:
https://redirect.github.com/apple/swift-collections/tree/main/Sources/BasicContainers/UniqueSet

[RigidDictionary]:
https://redirect.github.com/apple/swift-collections/tree/main/Sources/BasicContainers/RigidDictionary

[UniqueDictionary]:
https://redirect.github.com/apple/swift-collections/tree/main/Sources/BasicContainers/UniqueDictionary

These are direct analogues of the standard `Set` and `Dictionary` types.
These types are built on top of the `Equatable` and `Hashable` protocol
generalizations that were proposed in [SE-0499]; as that proposal is not
yet implemented in any shipping toolchain, these new types are shipping
as source-unstable previews, conditional on a new
`UnstableHashedContainers` package trait. The final API of these types
will also deeply depend on the `struct Borrow` and `struct Inout`
proposals (and potentially other language/stdlib improvements) that are
currently working their way through the Swift Evolution process.
Accordingly, we may need to make source-breaking changes to the
interfaces of these types -- they are not ready to be blessed as Public
API. However, we encourage intrepid engineers to try them on for size,
and report pain points. (Of which we expect there will be many in this
first preview.)

[SE-0499]:
https://redirect.github.com/swiftlang/swift-evolution/blob/main/proposals/0499-support-non-copyable-simple-protocols.md

We continue the pattern of `Rigid-` and `Unique-` naming prefixes with
these new types:

- The `Unique` types (`UniqueArray`, `UniqueDeque`, `UniqueSet`,
`UniqueDictionary` etc.) are dynamically self-sizing containers that
automatically reallocate their storage as needed to best accommodate
their contents; the `Unique` prefix was chosen to highlight that these
types are always uniquely held, avoiding the complications of mutating
shared copies.
- The `Rigid` types remove dynamic sizing, and they operate strictly
within an explicitly configured capacity. Dynamic sizing is not always
appropriate -- when targeting space- or time-constrained environments
(think embedded use cases or real-time work), it is preferable to avoid
implicit reallocations, and to instead choose to have explicit control
over when (and if) storage is reallocated, and to what size. This is
where the `Rigid` types come in: their instances are created with a
specific capacity and it is a runtime error to exceed that. This makes
them quite inflexible (hence the "rigid" qualifier), but in exchange,
their operations provide far stricter complexity guarantees: they
exhibit no random runtime latency spikes, and they can trivially fit in
strict memory budgets.

##### Early drafts of borrowing sequence, generative iteration and
container protocols

This release includes highly experimental but *working* implementations
of new protocols supplying ownership-aware alternatives to the classic
`Sequence`/`Collection` protocol hierarchy. These protocols and the
generic operations built on top of them can be turned on by enabling the
`UnstableContainersPreview` package trait.

- [`protocol BorrowingSequence<Element>`][BorrowingSequence] models
borrowing sequences with ephemeral lifetimes. (This is already
progressing through Swift Evolution.)
- [`protocol Container<Element>`][Container] models constructs that
physically store their contents, and can expose stable spans over them.
- [`protocol Producer<Element, ProducerError>`][Producer] models a
generative iterator -- a construct that generates items on demand.
- [`protocol Drain<Element>`][Drain] refines `Producer` to model
in-place consumption of elements -- primarily for use around container
types.

[BorrowingSequence]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/BorrowingSequence.swift

[BorrowingIteratorProtocol]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/BorrowingIteratorProtocol.swift

[Container]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Container.swift

[Producer]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Producer.swift

[Drain]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Protocols/Drain.swift

In this version, the package has developed these protocols just enough
to implement basic generic operations for moving data between containers
like `UniqueArray` and `RigidDeque`. As we gain experience using these,
future releases may start adding basic generic algorithms, more
protocols (bidirectional, random-access, (per)mutable, range-replaceable
containers, etc.), convenience adapters, and other features -- or we may
end up entirely overhauling or simply discarding some/all of them.
Accordingly, the experimental interfaces enabled by
`UnstableContainersPreview` are not source stable, and they are not
intended for production use. We expect the eventual production version
of these (or whatever designs they evolve into) to ship in the Swift
Standard Library. We do highly recommend that interested folks try
playing with these, to get a feel for the strange problems of Ownership.

Besides these protocols, the package also defines rudimentary
substitutes of some basic primitives that belong in the Standard
Library:

- [`struct InputSpan<Element>`][InputSpan] is the dual of `OutputSpan` --
while `OutputSpan` is primarily for moving items *into* somebody else's
storage, `InputSpan` enables safely moving items *out of* storage.
- [`struct Borrow<Target>`][Borrow] represents a borrowing reference to
an item. (This package models this with a pointer, which is an
ill-fitting substitute for the real implementation in the stdlib.)
- [`struct Inout<Target>`][Inout] represents a mutating reference to an
item.

[InputSpan]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Types/InputSpan.swift

[Borrow]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Types/Borrow.swift

[Inout]:
https://redirect.github.com/apple/swift-collections/blob/main/Sources/ContainersPreview/Types/Inout.swift

##### A formal way to access `SortedSet` and `SortedDictionary` types

The `SortedCollections` module contains (preexisting) early drafts of
two sorted collection types `SortedSet` and `SortedDictionary`, built on
top of an in-memory B-tree implementation. This release defines an
`UnstableSortedCollections` package trait that can be used to enable
building these types for experimentation without manually modifying the
package. Like in previous releases, these implementations remain
unfinished in this release, with known API issues; accordingly, these
types remain unstable. (Issue
[#&#8203;1](https://redirect.github.com/apple/swift-collections/issues/1)
remains open.) Future package releases may change their interface in
ways that break source compatibility, or they may remove these types
altogether.

##### Minor interface-level changes

- The `Collections` module no longer uses the unstable `@_exported
import` feature. Instead, it publishes public typealiases of every type
that it previously reexported from `DequeModule`, `OrderedCollections`,
`BitCollections`, `HeapModule` and `HashTreeCollections`.

- We renamed some `RigidArray`/`UniqueArray` operations to improve their
clarity at the point of use. The old names are still available, but
deprecated.

| Old name | New name |
| --- | --- |
| `append(count:initializingWith:)` | `append(addingCount:initializingWith:)` |
| `insert(count:at:initializingWith:)` | `insert(addingCount:at:initializingWith:)` |
| `replaceSubrange(_:newCount:initializingWith:)` | `replace(removing:addingCount:initializingWith:)` |
| `replaceSubrange(_:moving:)` | `replace(removing:moving:)` |
| `replaceSubrange(_:copying:)` | `replace(removing:copying:)` |
| `copy()` | `clone()` |
| `copy(capacity:)` | `clone(capacity:)` |

- We have now defined a complete set of `OutputSpan`/`InputSpan`-based
`append`/`insert`/`replace`/`consume` primitives, fully generalized to
be implementable by piecewise contiguous containers. These operations
pave the way for a `Container`-based analogue of the classic
`RangeReplaceableCollection` protocol, with most of the user-facing
operations becoming standard generic algorithms built on top of these
primitives:

  ```
  mutating func append<E: Error>(
      addingCount newItemCount: Int,
      initializingWith initializer: (inout OutputSpan<Element>) throws(E) -> Void
  ) throws(E)

  mutating func insert<E: Error>(
      addingCount newItemCount: Int,
      at index: Int,
      initializingWith initializer: (inout OutputSpan<Element>) throws(E) -> Void
  ) throws(E)

  mutating func replace<E: Error>(
      removing subrange: Range<Int>,
      consumingWith consumer: (inout InputSpan<Element>) -> Void,
      addingCount newItemCount: Int,
      initializingWith initializer: (inout OutputSpan<Element>) throws(E) -> Void
  ) throws(E)

  mutating func consume(
      _ subrange: Range<Int>,
      consumingWith consumer: (inout InputSpan<Element>) -> Void
  )
  ```

- The package no longer uses the code generation tool `gyb`.

#### What's Changed

- Fix links in GitHub templates by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;527](https://redirect.github.com/apple/swift-collections/pull/527)
- Adopt `package` access modifier and get rid of gybbing by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;526](https://redirect.github.com/apple/swift-collections/pull/526)
- \[Doc] Fix links in landing page by
[@&#8203;Azoy](https://redirect.github.com/Azoy) in
[#&#8203;531](https://redirect.github.com/apple/swift-collections/pull/531)
- \[BigString] Refactor \_Chunk to be its own managed buffer of UTF8 by
[@&#8203;Azoy](https://redirect.github.com/Azoy) in
[#&#8203;488](https://redirect.github.com/apple/swift-collections/pull/488)
- Add new package trait UnstableSortedCollections by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;533](https://redirect.github.com/apple/swift-collections/pull/533)
- \[RopeModule] Fix warnings by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;534](https://redirect.github.com/apple/swift-collections/pull/534)
- Fix ability to build & test BigString with Xcode & CMake by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;537](https://redirect.github.com/apple/swift-collections/pull/537)
- \[BigString] Bring back Index.\_isUTF16TrailingSurrogate by
[@&#8203;Azoy](https://redirect.github.com/Azoy) in
[#&#8203;539](https://redirect.github.com/apple/swift-collections/pull/539)
- chore: restrict GitHub workflow permissions - future-proof by
[@&#8203;incertum](https://redirect.github.com/incertum) in
[#&#8203;540](https://redirect.github.com/apple/swift-collections/pull/540)
- \[BitCollections] Add missing imports for InternalCollectionsUtilities
by [@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;554](https://redirect.github.com/apple/swift-collections/pull/554)
- Compare self.value to other, not itself by
[@&#8203;SiliconA-Z](https://redirect.github.com/SiliconA-Z) in
[#&#8203;553](https://redirect.github.com/apple/swift-collections/pull/553)
- Change useFloyd heuristic to match comment by
[@&#8203;SiliconA-Z](https://redirect.github.com/SiliconA-Z) in
[#&#8203;551](https://redirect.github.com/apple/swift-collections/pull/551)
- Typo: symmetric difference should be the xor, not intersection by
[@&#8203;SiliconA-Z](https://redirect.github.com/SiliconA-Z) in
[#&#8203;550](https://redirect.github.com/apple/swift-collections/pull/550)
- first should get the Initialized elements by
[@&#8203;SiliconA-Z](https://redirect.github.com/SiliconA-Z) in
[#&#8203;549](https://redirect.github.com/apple/swift-collections/pull/549)
- Replace Container with a far less powerful (but more universal)
Iterable construct by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;543](https://redirect.github.com/apple/swift-collections/pull/543)
- Temporarily stop testing RigidArray & UniqueArray on release/6.3
snapshots on Linux by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;562](https://redirect.github.com/apple/swift-collections/pull/562)
- \[RigidArray, HashTrees] Mark deinitializers inlinable by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;560](https://redirect.github.com/apple/swift-collections/pull/560)
- GHA: Add weekly dependabot by
[@&#8203;bkhouri](https://redirect.github.com/bkhouri) in
[#&#8203;563](https://redirect.github.com/apple/swift-collections/pull/563)
- Work around temporary issue with current 6.3 snapshots by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;565](https://redirect.github.com/apple/swift-collections/pull/565)
- Add `RigidDeque` and `UniqueDeque` by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;557](https://redirect.github.com/apple/swift-collections/pull/557)
- \[Collections module] Stop using `@_exported import` by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;566](https://redirect.github.com/apple/swift-collections/pull/566)
- Delete stray benchmark results files by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;567](https://redirect.github.com/apple/swift-collections/pull/567)
- Assorted `RigidArray`/`UniqueArray` updates by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;569](https://redirect.github.com/apple/swift-collections/pull/569)
- `RigidArray`/`UniqueArray`: Add new copying span initializers by
[@&#8203;Azoy](https://redirect.github.com/Azoy) in
[#&#8203;572](https://redirect.github.com/apple/swift-collections/pull/572)
- `RigidDeque`/`UniqueDeque`: Add some top-level documentation by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;571](https://redirect.github.com/apple/swift-collections/pull/571)
- Update docs for Container.nextSpan(after:maximumCount:) by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;574](https://redirect.github.com/apple/swift-collections/pull/574)
- Remove workaround for bug in OutputSpan.wUMBP by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;570](https://redirect.github.com/apple/swift-collections/pull/570)
- \[RigidArray, RigidDeque].nextSpan: Validate `maximumCount` by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;575](https://redirect.github.com/apple/swift-collections/pull/575)
- Bump swiftlang/github-workflows/.github/workflows/soundness.yml from
0.0.6 to 0.0.7 by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;577](https://redirect.github.com/apple/swift-collections/pull/577)
- give constant folding an opportunity to select a much faster code path
for empty dictionary (and set) literals by
[@&#8203;tayloraswift](https://redirect.github.com/tayloraswift) in
[#&#8203;578](https://redirect.github.com/apple/swift-collections/pull/578)
- Bump
swiftlang/github-workflows/.github/workflows/swift\_package\_test.yml
from 0.0.6 to 0.0.7 by
[@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot] in
[#&#8203;576](https://redirect.github.com/apple/swift-collections/pull/576)
- Ownership-aware Set and Dictionary variants by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;573](https://redirect.github.com/apple/swift-collections/pull/573)
- \[Prerelease] Check API for consistency, fill holes, patch
incoherencies by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;581](https://redirect.github.com/apple/swift-collections/pull/581)
- \[BitSet] Amend return value of `update(with:)` method by
[@&#8203;benrimmington](https://redirect.github.com/benrimmington) in
[#&#8203;538](https://redirect.github.com/apple/swift-collections/pull/538)
- \[BasicContainers] Fix spelling of a source file by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;585](https://redirect.github.com/apple/swift-collections/pull/585)
- Include notes about index mutation in `span(after/before:)` (+ other
doc fixes) by
[@&#8203;natecook1000](https://redirect.github.com/natecook1000) in
[#&#8203;541](https://redirect.github.com/apple/swift-collections/pull/541)
- \[BasicContainers] Finalize requirements for hashed containers by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;586](https://redirect.github.com/apple/swift-collections/pull/586)
- Update README for 1.4.0 by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;587](https://redirect.github.com/apple/swift-collections/pull/587)
- Working towards the 1.4.0 tag by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;588](https://redirect.github.com/apple/swift-collections/pull/588)
- \[BasicContainers] Avoid defining set/dictionary types unless
UnstableHashedContainers is enabled by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;589](https://redirect.github.com/apple/swift-collections/pull/589)
- \[BasicContainers] RigidArray: Correct spelling of replacement for
deprecated method by
[@&#8203;lorentey](https://redirect.github.com/lorentey) in
[#&#8203;590](https://redirect.github.com/apple/swift-collections/pull/590)

#### New Contributors

- [@&#8203;incertum](https://redirect.github.com/incertum) made their
first contribution in
[#&#8203;540](https://redirect.github.com/apple/swift-collections/pull/540)
- [@&#8203;SiliconA-Z](https://redirect.github.com/SiliconA-Z) made
their first contribution in
[#&#8203;553](https://redirect.github.com/apple/swift-collections/pull/553)
- [@&#8203;bkhouri](https://redirect.github.com/bkhouri) made their
first contribution in
[#&#8203;563](https://redirect.github.com/apple/swift-collections/pull/563)
- [@&#8203;dependabot](https://redirect.github.com/dependabot)\[bot]
made their first contribution in
[#&#8203;577](https://redirect.github.com/apple/swift-collections/pull/577)
- [@&#8203;tayloraswift](https://redirect.github.com/tayloraswift) made
their first contribution in
[#&#8203;578](https://redirect.github.com/apple/swift-collections/pull/578)
- [@&#8203;benrimmington](https://redirect.github.com/benrimmington)
made their first contribution in
[#&#8203;538](https://redirect.github.com/apple/swift-collections/pull/538)

**Full Changelog**:
<https://github.com/apple/swift-collections/compare/1.3.0...1.4.0>

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-09 12:31:54 +00:00
renovate[bot]
6d710f3bdc chore: bump up Node.js to v22.22.1 (#14598)
> ℹ️ **Note**
> 
> This PR body was truncated due to platform limits.

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [node](https://nodejs.org) ([source](https://redirect.github.com/nodejs/node)) | patch | `22.22.0` → `22.22.1` |

---

### Release Notes

<details>
<summary>nodejs/node (node)</summary>

### [`v22.22.1`](https://redirect.github.com/nodejs/node/releases/tag/v22.22.1): 2026-03-05, Version 22.22.1 'Jod' (LTS)

[Compare
Source](https://redirect.github.com/nodejs/node/compare/v22.22.0...v22.22.1)

##### Notable Changes

-
\[[`7b93a65f27`](https://redirect.github.com/nodejs/node/commit/7b93a65f27)]
- **build**: test on Python 3.14 (Christian Clauss)
[#&#8203;59983](https://redirect.github.com/nodejs/node/pull/59983)
-
\[[`6063d888fe`](https://redirect.github.com/nodejs/node/commit/6063d888fe)]
- **cli**: mark `--heapsnapshot-near-heap-limit` as stable (Joyee
Cheung)
[#&#8203;60956](https://redirect.github.com/nodejs/node/pull/60956)
-
\[[`d950b151a2`](https://redirect.github.com/nodejs/node/commit/d950b151a2)]
- **crypto**: update root certificates to NSS 3.119 (Node.js GitHub Bot)
[#&#8203;61419](https://redirect.github.com/nodejs/node/pull/61419)
-
\[[`4f42f8c428`](https://redirect.github.com/nodejs/node/commit/4f42f8c428)]
- **crypto**: update root certificates to NSS 3.117 (Node.js GitHub Bot)
[#&#8203;60741](https://redirect.github.com/nodejs/node/pull/60741)
-
\[[`b6ebf2cd53`](https://redirect.github.com/nodejs/node/commit/b6ebf2cd53)]
- **doc**: add avivkeller to collaborators (Aviv Keller)
[#&#8203;61115](https://redirect.github.com/nodejs/node/pull/61115)
-
\[[`35854f424d`](https://redirect.github.com/nodejs/node/commit/35854f424d)]
- **doc**: add gurgunday to collaborators (Gürgün Dayıoğlu)
[#&#8203;61094](https://redirect.github.com/nodejs/node/pull/61094)
-
\[[`5c6a076e5d`](https://redirect.github.com/nodejs/node/commit/5c6a076e5d)]
- **meta**: add Renegade334 to collaborators (Renegade334)
[#&#8203;60714](https://redirect.github.com/nodejs/node/pull/60714)

##### Commits

-
\[[`5f773488c2`](https://redirect.github.com/nodejs/node/commit/5f773488c2)]
- **assert**: use a set instead of an array for faster lookup (Ruben
Bridgewater)
[#&#8203;61076](https://redirect.github.com/nodejs/node/pull/61076)
-
\[[`feecbb0eab`](https://redirect.github.com/nodejs/node/commit/feecbb0eab)]
- **assert,util**: fix deep comparison for sets and maps with mixed
types (Ruben Bridgewater)
[#&#8203;61388](https://redirect.github.com/nodejs/node/pull/61388)
-
\[[`096095b127`](https://redirect.github.com/nodejs/node/commit/096095b127)]
- **benchmark**: add SQLite benchmarks (Guilherme Araújo)
[#&#8203;61401](https://redirect.github.com/nodejs/node/pull/61401)
-
\[[`b5fe481415`](https://redirect.github.com/nodejs/node/commit/b5fe481415)]
- **benchmark**: use boolean options in benchmark tests (SeokhunEom)
[#&#8203;60129](https://redirect.github.com/nodejs/node/pull/60129)
-
\[[`fa9faacacb`](https://redirect.github.com/nodejs/node/commit/fa9faacacb)]
- **benchmark**: allow boolean option values (SeokhunEom)
[#&#8203;60129](https://redirect.github.com/nodejs/node/pull/60129)
-
\[[`ba8714ac21`](https://redirect.github.com/nodejs/node/commit/ba8714ac21)]
- **benchmark**: fix incorrect base64 input in byteLength benchmark
(semimikoh)
[#&#8203;60841](https://redirect.github.com/nodejs/node/pull/60841)
-
\[[`53596de876`](https://redirect.github.com/nodejs/node/commit/53596de876)]
- **benchmark**: use typescript for import cjs benchmark (Joyee Cheung)
[#&#8203;60663](https://redirect.github.com/nodejs/node/pull/60663)
-
\[[`e8930e9d7c`](https://redirect.github.com/nodejs/node/commit/e8930e9d7c)]
- **benchmark**: focus on import.meta intialization in import-meta
benchmark (Joyee Cheung)
[#&#8203;60603](https://redirect.github.com/nodejs/node/pull/60603)
-
\[[`1155e412b1`](https://redirect.github.com/nodejs/node/commit/1155e412b1)]
- **benchmark**: add per-suite setup option (Joyee Cheung)
[#&#8203;60574](https://redirect.github.com/nodejs/node/pull/60574)
-
\[[`e01903d304`](https://redirect.github.com/nodejs/node/commit/e01903d304)]
- **benchmark**: improve cpu.sh for safety and usability (Nam Yooseong)
[#&#8203;60162](https://redirect.github.com/nodejs/node/pull/60162)
-
\[[`623a405747`](https://redirect.github.com/nodejs/node/commit/623a405747)]
- **benchmark**: add benchmark for leaf source text modules (Joyee
Cheung)
[#&#8203;60205](https://redirect.github.com/nodejs/node/pull/60205)
-
\[[`7f5e7b9f7f`](https://redirect.github.com/nodejs/node/commit/7f5e7b9f7f)]
- **benchmark**: add microbench on isInsideNodeModules (Chengzhong Wu)
[#&#8203;60991](https://redirect.github.com/nodejs/node/pull/60991)
-
\[[`db132b85a8`](https://redirect.github.com/nodejs/node/commit/db132b85a8)]
- **bootstrap**: initialize http proxy after user module loader setup
(Joyee Cheung)
[#&#8203;58938](https://redirect.github.com/nodejs/node/pull/58938)
-
\[[`66aab9f987`](https://redirect.github.com/nodejs/node/commit/66aab9f987)]
- **buffer**: let Buffer.of use heap (Сковорода Никита Андреевич)
[#&#8203;60503](https://redirect.github.com/nodejs/node/pull/60503)
-
\[[`c3cf00c671`](https://redirect.github.com/nodejs/node/commit/c3cf00c671)]
- **buffer**: speed up concat via TypedArray#set (Gürgün Dayıoğlu)
[#&#8203;60399](https://redirect.github.com/nodejs/node/pull/60399)
-
\[[`f6fad231e9`](https://redirect.github.com/nodejs/node/commit/f6fad231e9)]
- **build**: skip sscache action on non-main branches (Joyee Cheung)
[#&#8203;61790](https://redirect.github.com/nodejs/node/pull/61790)
-
\[[`2145f91f6b`](https://redirect.github.com/nodejs/node/commit/2145f91f6b)]
- **build**: update android-patches/trap-handler.h.patch (Mo Luo)
[#&#8203;60369](https://redirect.github.com/nodejs/node/pull/60369)
-
\[[`5b49759dd8`](https://redirect.github.com/nodejs/node/commit/5b49759dd8)]
- **build**: update devcontainer.json to use paired nix env (Joyee
Cheung)
[#&#8203;61414](https://redirect.github.com/nodejs/node/pull/61414)
-
\[[`24724cde40`](https://redirect.github.com/nodejs/node/commit/24724cde40)]
- **build**: fix misplaced comma in ldflags (hqzing)
[#&#8203;61294](https://redirect.github.com/nodejs/node/pull/61294)
-
\[[`c57a19934e`](https://redirect.github.com/nodejs/node/commit/c57a19934e)]
- **build**: fix crate vendor file checksums on windows (Chengzhong Wu)
[#&#8203;61329](https://redirect.github.com/nodejs/node/pull/61329)
-
\[[`8659d7cd07`](https://redirect.github.com/nodejs/node/commit/8659d7cd07)]
- **build**: fix inconsistent quoting in `Makefile` (Antoine du Hamel)
[#&#8203;60511](https://redirect.github.com/nodejs/node/pull/60511)
-
\[[`44f339b315`](https://redirect.github.com/nodejs/node/commit/44f339b315)]
- **build**: remove temporal updater (Chengzhong Wu)
[#&#8203;61151](https://redirect.github.com/nodejs/node/pull/61151)
-
\[[`d60a6cebd5`](https://redirect.github.com/nodejs/node/commit/d60a6cebd5)]
- **build**: update test-wpt-report to use NODE instead of OUT\_NODE
(Filip Skokan)
[#&#8203;61024](https://redirect.github.com/nodejs/node/pull/61024)
-
\[[`34ccf187f5`](https://redirect.github.com/nodejs/node/commit/34ccf187f5)]
- **build**: skip build-ci on actions with a separate test step
(Chengzhong Wu)
[#&#8203;61073](https://redirect.github.com/nodejs/node/pull/61073)
-
\[[`7b19e101a2`](https://redirect.github.com/nodejs/node/commit/7b19e101a2)]
- **build**: run embedtest with node\_g when BUILDTYPE=Debug (Chengzhong
Wu) [#&#8203;60850](https://redirect.github.com/nodejs/node/pull/60850)
-
\[[`9408c4459f`](https://redirect.github.com/nodejs/node/commit/9408c4459f)]
- **build**: upgrade Python linter ruff, add rules ASYNC,PERF (Christian
Clauss)
[#&#8203;59984](https://redirect.github.com/nodejs/node/pull/59984)
-
\[[`2166ec7f0f`](https://redirect.github.com/nodejs/node/commit/2166ec7f0f)]
- **build**: use call command when calling python configure (Jacob
Nichols)
[#&#8203;60098](https://redirect.github.com/nodejs/node/pull/60098)
-
\[[`73ef70145d`](https://redirect.github.com/nodejs/node/commit/73ef70145d)]
- **build**: remove V8\_COMPRESS\_POINTERS\_IN\_ISOLATE\_CAGE defs
(Joyee Cheung)
[#&#8203;60296](https://redirect.github.com/nodejs/node/pull/60296)
-
\[[`7b93a65f27`](https://redirect.github.com/nodejs/node/commit/7b93a65f27)]
- **build**: test on Python 3.14 (Christian Clauss)
[#&#8203;59983](https://redirect.github.com/nodejs/node/pull/59983)
-
\[[`508ce6ec6c`](https://redirect.github.com/nodejs/node/commit/508ce6ec6c)]
- **build, src**: fix include paths for vtune files (Rahul)
[#&#8203;59999](https://redirect.github.com/nodejs/node/pull/59999)
-
\[[`c89d3cd570`](https://redirect.github.com/nodejs/node/commit/c89d3cd570)]
- **build,tools**: fix addon build deadlock on errors (Vladimir Morozov)
[#&#8203;61321](https://redirect.github.com/nodejs/node/pull/61321)
-
\[[`40904a0591`](https://redirect.github.com/nodejs/node/commit/40904a0591)]
- **build,win**: update WinGet configurations to Python 3.14 (Mike
McCready)
[#&#8203;61431](https://redirect.github.com/nodejs/node/pull/61431)
-
\[[`6d6742e7db`](https://redirect.github.com/nodejs/node/commit/6d6742e7db)]
- **child\_process**: treat ipc length header as unsigned uint32 (Ryuhei
Shima)
[#&#8203;61344](https://redirect.github.com/nodejs/node/pull/61344)
-
\[[`6063d888fe`](https://redirect.github.com/nodejs/node/commit/6063d888fe)]
- **cli**: mark --heapsnapshot-near-heap-limit as stable (Joyee Cheung)
[#&#8203;60956](https://redirect.github.com/nodejs/node/pull/60956)
-
\[[`3d324a0f88`](https://redirect.github.com/nodejs/node/commit/3d324a0f88)]
- **cluster**: fix port reuse between cluster (Ryuhei Shima)
[#&#8203;60141](https://redirect.github.com/nodejs/node/pull/60141)
-
\[[`40a58709b4`](https://redirect.github.com/nodejs/node/commit/40a58709b4)]
- **console**: optimize single-string logging (Gürgün Dayıoğlu)
[#&#8203;60422](https://redirect.github.com/nodejs/node/pull/60422)
-
\[[`d950b151a2`](https://redirect.github.com/nodejs/node/commit/d950b151a2)]
- **crypto**: update root certificates to NSS 3.119 (Node.js GitHub Bot)
[#&#8203;61419](https://redirect.github.com/nodejs/node/pull/61419)
-
\[[`4f42f8c428`](https://redirect.github.com/nodejs/node/commit/4f42f8c428)]
- **crypto**: update root certificates to NSS 3.117 (Node.js GitHub Bot)
[#&#8203;60741](https://redirect.github.com/nodejs/node/pull/60741)
-
\[[`a87499ae25`](https://redirect.github.com/nodejs/node/commit/a87499ae25)]
- **crypto**: ensure documented RSA-PSS saltLength default is used
(Filip Skokan)
[#&#8203;60662](https://redirect.github.com/nodejs/node/pull/60662)
-
\[[`8c65cc11e2`](https://redirect.github.com/nodejs/node/commit/8c65cc11e2)]
- **crypto**: update root certificates to NSS 3.116 (Node.js GitHub Bot)
[#&#8203;59956](https://redirect.github.com/nodejs/node/pull/59956)
-
\[[`91dc00a2c1`](https://redirect.github.com/nodejs/node/commit/91dc00a2c1)]
- **debugger**: fix event listener leak in the run command (Joyee
Cheung)
[#&#8203;60464](https://redirect.github.com/nodejs/node/pull/60464)
-
\[[`0781bd3764`](https://redirect.github.com/nodejs/node/commit/0781bd3764)]
- **deps**: V8: backport
[`6a0a25a`](https://redirect.github.com/nodejs/node/commit/6a0a25abaed3)
(Vivian Wang)
[#&#8203;61688](https://redirect.github.com/nodejs/node/pull/61688)
-
\[[`0cf1f9c3e9`](https://redirect.github.com/nodejs/node/commit/0cf1f9c3e9)]
- **deps**: update googletest to
[`8508785`](https://redirect.github.com/nodejs/node/commit/85087857ad)
(Node.js GitHub Bot)
[#&#8203;61417](https://redirect.github.com/nodejs/node/pull/61417)
-
\[[`521b4b1f07`](https://redirect.github.com/nodejs/node/commit/521b4b1f07)]
- **deps**: update sqlite to 3.51.2 (Node.js GitHub Bot)
[#&#8203;61339](https://redirect.github.com/nodejs/node/pull/61339)
-
\[[`58b9d219a3`](https://redirect.github.com/nodejs/node/commit/58b9d219a3)]
- **deps**: update icu to 78.2 (Node.js GitHub Bot)
[#&#8203;60523](https://redirect.github.com/nodejs/node/pull/60523)
-
\[[`cbc1e4306d`](https://redirect.github.com/nodejs/node/commit/cbc1e4306d)]
- **deps**: update zlib to 1.3.1-e00f703 (Node.js GitHub Bot)
[#&#8203;61135](https://redirect.github.com/nodejs/node/pull/61135)
-
\[[`db59c35ed8`](https://redirect.github.com/nodejs/node/commit/db59c35ed8)]
- **deps**: update cjs-module-lexer to 2.2.0 (Node.js GitHub Bot)
[#&#8203;61271](https://redirect.github.com/nodejs/node/pull/61271)
-
\[[`c18518ee3c`](https://redirect.github.com/nodejs/node/commit/c18518ee3c)]
- **deps**: update nbytes to 0.1.2 (Node.js GitHub Bot)
[#&#8203;61270](https://redirect.github.com/nodejs/node/pull/61270)
-
\[[`376df62d63`](https://redirect.github.com/nodejs/node/commit/376df62d63)]
- **deps**: update timezone to 2025c (Node.js GitHub Bot)
[#&#8203;61138](https://redirect.github.com/nodejs/node/pull/61138)
-
\[[`993e905302`](https://redirect.github.com/nodejs/node/commit/993e905302)]
- **deps**: update simdjson to 4.2.4 (Node.js GitHub Bot)
[#&#8203;61056](https://redirect.github.com/nodejs/node/pull/61056)
-
\[[`b72fd2a5d3`](https://redirect.github.com/nodejs/node/commit/b72fd2a5d3)]
- **deps**: update googletest to
[`065127f`](https://redirect.github.com/nodejs/node/commit/065127f1e4)
(Node.js GitHub Bot)
[#&#8203;61055](https://redirect.github.com/nodejs/node/pull/61055)
-
\[[`d765147405`](https://redirect.github.com/nodejs/node/commit/d765147405)]
- **deps**: update sqlite to 3.51.1 (Node.js GitHub Bot)
[#&#8203;60899](https://redirect.github.com/nodejs/node/pull/60899)
-
\[[`37abe2a7d2`](https://redirect.github.com/nodejs/node/commit/37abe2a7d2)]
- **deps**: update zlib to 1.3.1-63d7e16 (Node.js GitHub Bot)
[#&#8203;60898](https://redirect.github.com/nodejs/node/pull/60898)
-
\[[`97241fcb86`](https://redirect.github.com/nodejs/node/commit/97241fcb86)]
- **deps**: update sqlite to 3.51.0 (Node.js GitHub Bot)
[#&#8203;60614](https://redirect.github.com/nodejs/node/pull/60614)
-
\[[`3669c7b4f4`](https://redirect.github.com/nodejs/node/commit/3669c7b4f4)]
- **deps**: update simdjson to 4.2.2 (Node.js GitHub Bot)
[#&#8203;60740](https://redirect.github.com/nodejs/node/pull/60740)
-
\[[`9a056ec89c`](https://redirect.github.com/nodejs/node/commit/9a056ec89c)]
- **deps**: update googletest to
[`1b96fa1`](https://redirect.github.com/nodejs/node/commit/1b96fa13f5)
(Node.js GitHub Bot)
[#&#8203;60739](https://redirect.github.com/nodejs/node/pull/60739)
-
\[[`b5803b3ea0`](https://redirect.github.com/nodejs/node/commit/b5803b3ea0)]
- **deps**: update minimatch to 10.1.1 (Node.js GitHub Bot)
[#&#8203;60543](https://redirect.github.com/nodejs/node/pull/60543)
-
\[[`5bf99f3d46`](https://redirect.github.com/nodejs/node/commit/5bf99f3d46)]
- **deps**: update cjs-module-lexer to 2.1.1 (Node.js GitHub Bot)
[#&#8203;60646](https://redirect.github.com/nodejs/node/pull/60646)
-
\[[`801f187357`](https://redirect.github.com/nodejs/node/commit/801f187357)]
- **deps**: update simdjson to 4.2.1 (Node.js GitHub Bot)
[#&#8203;60644](https://redirect.github.com/nodejs/node/pull/60644)
-
\[[`03c16e5a4c`](https://redirect.github.com/nodejs/node/commit/03c16e5a4c)]
- **deps**: update simdjson to 4.1.0 (Node.js GitHub Bot)
[#&#8203;60542](https://redirect.github.com/nodejs/node/pull/60542)
-
\[[`2ebfc2ca56`](https://redirect.github.com/nodejs/node/commit/2ebfc2ca56)]
- **deps**: update amaro to 1.1.5 (Node.js GitHub Bot)
[#&#8203;60541](https://redirect.github.com/nodejs/node/pull/60541)
-
\[[`d24ba4fed6`](https://redirect.github.com/nodejs/node/commit/d24ba4fed6)]
- **deps**: update simdjson to 4.0.7 (Node.js GitHub Bot)
[#&#8203;59883](https://redirect.github.com/nodejs/node/pull/59883)
-
\[[`9480a139bf`](https://redirect.github.com/nodejs/node/commit/9480a139bf)]
- **deps**: update googletest to
[`279f847`](https://redirect.github.com/nodejs/node/commit/279f847)
(Node.js GitHub Bot)
[#&#8203;60219](https://redirect.github.com/nodejs/node/pull/60219)
-
\[[`635e67379e`](https://redirect.github.com/nodejs/node/commit/635e67379e)]
- **deps**: update archs files for openssl-3.5.5 (Node.js GitHub Bot)
[#&#8203;61547](https://redirect.github.com/nodejs/node/pull/61547)
-
\[[`c7b774047d`](https://redirect.github.com/nodejs/node/commit/c7b774047d)]
- **deps**: upgrade openssl sources to openssl-3.5.5 (Node.js GitHub
Bot) [#&#8203;61547](https://redirect.github.com/nodejs/node/pull/61547)
-
\[[`5b324d7d7f`](https://redirect.github.com/nodejs/node/commit/5b324d7d7f)]
- **deps**: update corepack to 0.34.6 (Node.js GitHub Bot)
[#&#8203;61510](https://redirect.github.com/nodejs/node/pull/61510)
-
\[[`eef8ba0667`](https://redirect.github.com/nodejs/node/commit/eef8ba0667)]
- **deps**: update corepack to 0.34.5 (Node.js GitHub Bot)
[#&#8203;60842](https://redirect.github.com/nodejs/node/pull/60842)
-
\[[`490f7c7fb1`](https://redirect.github.com/nodejs/node/commit/490f7c7fb1)]
- **deps**: update corepack to 0.34.4 (Node.js GitHub Bot)
[#&#8203;60643](https://redirect.github.com/nodejs/node/pull/60643)
-
\[[`66903ea3b3`](https://redirect.github.com/nodejs/node/commit/66903ea3b3)]
- **deps**: update corepack to 0.34.2 (Node.js GitHub Bot)
[#&#8203;60550](https://redirect.github.com/nodejs/node/pull/60550)
-
\[[`a2f0b69282`](https://redirect.github.com/nodejs/node/commit/a2f0b69282)]
- **deps**: update corepack to 0.34.1 (Node.js GitHub Bot)
[#&#8203;60314](https://redirect.github.com/nodejs/node/pull/60314)
-
\[[`c8044a48a6`](https://redirect.github.com/nodejs/node/commit/c8044a48a6)]
- **deps**: V8: backport
[`2e4c5cf`](https://redirect.github.com/nodejs/node/commit/2e4c5cf9b112)
(Michaël Zasso)
[#&#8203;60654](https://redirect.github.com/nodejs/node/pull/60654)
-
\[[`642f518198`](https://redirect.github.com/nodejs/node/commit/642f518198)]
- **doc**: supported toolchain with Visual Studio 2022 only (Mike
McCready)
[#&#8203;61451](https://redirect.github.com/nodejs/node/pull/61451)
-
\[[`625f674487`](https://redirect.github.com/nodejs/node/commit/625f674487)]
- **doc**: move Security-Team from TSC to SECURITY (Rafael Gonzaga)
[#&#8203;61495](https://redirect.github.com/nodejs/node/pull/61495)
-
\[[`029e32f8ba`](https://redirect.github.com/nodejs/node/commit/029e32f8ba)]
- **doc**: added `requestOCSP` option to `tls.connect` (ikeyan)
[#&#8203;61064](https://redirect.github.com/nodejs/node/pull/61064)
-
\[[`68e33dfa89`](https://redirect.github.com/nodejs/node/commit/68e33dfa89)]
- **doc**: restore
[@&#8203;ChALkeR](https://redirect.github.com/ChALkeR) to collaborators
(Сковорода Никита Андреевич)
[#&#8203;61553](https://redirect.github.com/nodejs/node/pull/61553)
-
\[[`e016770d62`](https://redirect.github.com/nodejs/node/commit/e016770d62)]
- **doc**: update IBM/Red Hat volunteers with dedicated project time
(Beth Griggs)
[#&#8203;61588](https://redirect.github.com/nodejs/node/pull/61588)
-
\[[`ec63954657`](https://redirect.github.com/nodejs/node/commit/ec63954657)]
- **doc**: mention constructor comparison in assert.deepStrictEqual
(Hamza Kargin)
[#&#8203;60253](https://redirect.github.com/nodejs/node/pull/60253)
-
\[[`c8e1563a98`](https://redirect.github.com/nodejs/node/commit/c8e1563a98)]
- **doc**: add CVE delay mention (Rafael Gonzaga)
[#&#8203;61465](https://redirect.github.com/nodejs/node/pull/61465)
-
\[[`4b00cf2b54`](https://redirect.github.com/nodejs/node/commit/4b00cf2b54)]
- **doc**: include OpenJSF handle for security stewards (Rafael Gonzaga)
[#&#8203;61454](https://redirect.github.com/nodejs/node/pull/61454)
-
\[[`4b73bf5bc8`](https://redirect.github.com/nodejs/node/commit/4b73bf5bc8)]
- **doc**: clarify process.argv\[1] behavior for -e/--eval (Jeevankumar
S) [#&#8203;61366](https://redirect.github.com/nodejs/node/pull/61366)
-
\[[`d3151df4b3`](https://redirect.github.com/nodejs/node/commit/d3151df4b3)]
- **doc**: remove Windows Dev Home instructions from BUILDING (Mike
McCready)
[#&#8203;61434](https://redirect.github.com/nodejs/node/pull/61434)
-
\[[`2323462e35`](https://redirect.github.com/nodejs/node/commit/2323462e35)]
- **doc**: clarify TypedArray properties on Buffer (Roman Reiss)
[#&#8203;61355](https://redirect.github.com/nodejs/node/pull/61355)
-
\[[`6c5478c8b2`](https://redirect.github.com/nodejs/node/commit/6c5478c8b2)]
- **doc**: note resume build should not be done on node-test-commit
(Stewart X Addison)
[#&#8203;61373](https://redirect.github.com/nodejs/node/pull/61373)
-
\[[`ba4a043103`](https://redirect.github.com/nodejs/node/commit/ba4a043103)]
- **doc**: refine WebAssembly error documentation (sangwook)
[#&#8203;61382](https://redirect.github.com/nodejs/node/pull/61382)
-
\[[`cd315ea589`](https://redirect.github.com/nodejs/node/commit/cd315ea589)]
- **doc**: add deprecation history for url.parse (Eng Zer Jun)
[#&#8203;61389](https://redirect.github.com/nodejs/node/pull/61389)
-
\[[`42db0c392d`](https://redirect.github.com/nodejs/node/commit/42db0c392d)]
- **doc**: add marco and rafael in last sec release (Marco Ippolito)
[#&#8203;61383](https://redirect.github.com/nodejs/node/pull/61383)
-
\[[`4c3b680fc7`](https://redirect.github.com/nodejs/node/commit/4c3b680fc7)]
- **doc**: packages: example of private import switch to internal
(coderaiser)
[#&#8203;61343](https://redirect.github.com/nodejs/node/pull/61343)
-
\[[`684d15e421`](https://redirect.github.com/nodejs/node/commit/684d15e421)]
- **doc**: add esm and cjs examples to node:v8 (Alfredo González)
[#&#8203;61328](https://redirect.github.com/nodejs/node/pull/61328)
-
\[[`c3f9c7a7d9`](https://redirect.github.com/nodejs/node/commit/c3f9c7a7d9)]
- **doc**: added 'secure' event to tls.TLSSocket (ikeyan)
[#&#8203;61066](https://redirect.github.com/nodejs/node/pull/61066)
-
\[[`aa9acad5ca`](https://redirect.github.com/nodejs/node/commit/aa9acad5ca)]
- **doc**: restore
[@&#8203;watilde](https://redirect.github.com/watilde) to collaborators
(Daijiro Wachi)
[#&#8203;61350](https://redirect.github.com/nodejs/node/pull/61350)
-
\[[`9cafec084e`](https://redirect.github.com/nodejs/node/commit/9cafec084e)]
- **doc**: run license-builder (github-actions\[bot])
[#&#8203;61348](https://redirect.github.com/nodejs/node/pull/61348)
-
\[[`cdb12ccbc6`](https://redirect.github.com/nodejs/node/commit/cdb12ccbc6)]
- **doc**: document ALPNCallback option for TLSSocket constructor
(ikeyan)
[#&#8203;61331](https://redirect.github.com/nodejs/node/pull/61331)
-
\[[`461c5e65c5`](https://redirect.github.com/nodejs/node/commit/461c5e65c5)]
- **doc**: update MDN links (Livia Medeiros)
[#&#8203;61062](https://redirect.github.com/nodejs/node/pull/61062)
-
\[[`dde45baeab`](https://redirect.github.com/nodejs/node/commit/dde45baeab)]
- **doc**: add documentation for process.traceProcessWarnings (Alireza
Ebrahimkhani)
[#&#8203;53641](https://redirect.github.com/nodejs/node/pull/53641)
-
\[[`59a7aeec92`](https://redirect.github.com/nodejs/node/commit/59a7aeec92)]
- **doc**: fix filename typo (Hardanish Singh)
[#&#8203;61297](https://redirect.github.com/nodejs/node/pull/61297)
-
\[[`9a0a40d1ed`](https://redirect.github.com/nodejs/node/commit/9a0a40d1ed)]
- **doc**: fix typos and grammar in `BUILDING.md` & `onboarding.md`
(Hardanish Singh)
[#&#8203;61267](https://redirect.github.com/nodejs/node/pull/61267)
-
\[[`dca7005f9d`](https://redirect.github.com/nodejs/node/commit/dca7005f9d)]
- **doc**: mention --newVersion release script (Rafael Gonzaga)
[#&#8203;61255](https://redirect.github.com/nodejs/node/pull/61255)
-
\[[`c0dc8ddf85`](https://redirect.github.com/nodejs/node/commit/c0dc8ddf85)]
- **doc**: correct typo in api contributing doc (Mike McCready)
[#&#8203;61260](https://redirect.github.com/nodejs/node/pull/61260)
-
\[[`066af38fe1`](https://redirect.github.com/nodejs/node/commit/066af38fe1)]
- **doc**: add PR-URL requirement for security backports (Rafael
Gonzaga)
[#&#8203;61256](https://redirect.github.com/nodejs/node/pull/61256)
-
\[[`71dd46bd0c`](https://redirect.github.com/nodejs/node/commit/71dd46bd0c)]
- **doc**: add reusePort error behavior to net module (mag123c)
[#&#8203;61250](https://redirect.github.com/nodejs/node/pull/61250)
-
\[[`f6abe3ba33`](https://redirect.github.com/nodejs/node/commit/f6abe3ba33)]
- **doc**: note corepack package removal in distribution doc (Mike
McCready)
[#&#8203;61207](https://redirect.github.com/nodejs/node/pull/61207)
-
\[[`9059d49d8c`](https://redirect.github.com/nodejs/node/commit/9059d49d8c)]
- **doc**: fix tls.connect() timeout documentation (Azad Gupta)
[#&#8203;61079](https://redirect.github.com/nodejs/node/pull/61079)
-
\[[`e7b34b76b0`](https://redirect.github.com/nodejs/node/commit/e7b34b76b0)]
- **doc**: missing `passed`, `error` and `passed` properties on
`TestContext` (Xavier Stouder)
[#&#8203;61185](https://redirect.github.com/nodejs/node/pull/61185)
-
\[[`9ae2dcfbb6`](https://redirect.github.com/nodejs/node/commit/9ae2dcfbb6)]
- **doc**: clarify threat model for application-level API exposure
(Rafael Gonzaga)
[#&#8203;61184](https://redirect.github.com/nodejs/node/pull/61184)
-
\[[`9902331a7c`](https://redirect.github.com/nodejs/node/commit/9902331a7c)]
- **doc**: correct options for net.Socket class and socket.connect
(Xavier Stouder)
[#&#8203;61179](https://redirect.github.com/nodejs/node/pull/61179)
-
\[[`a80122d2fe`](https://redirect.github.com/nodejs/node/commit/a80122d2fe)]
- **doc**: document error event on readline InterfaceConstructor (Xavier
Stouder)
[#&#8203;61170](https://redirect.github.com/nodejs/node/pull/61170)
-
\[[`38d73c9cfa`](https://redirect.github.com/nodejs/node/commit/38d73c9cfa)]
- **doc**: add a smooth scrolling effect to the sidebar (btea)
[#&#8203;59007](https://redirect.github.com/nodejs/node/pull/59007)
-
\[[`95c51fa984`](https://redirect.github.com/nodejs/node/commit/95c51fa984)]
- **doc**: correct invalid collaborator profile (JJ)
[#&#8203;61091](https://redirect.github.com/nodejs/node/pull/61091)
-
\[[`f5a044763c`](https://redirect.github.com/nodejs/node/commit/f5a044763c)]
- **doc**: exclude compile-time flag features from security policy
(Matteo Collina)
[#&#8203;61109](https://redirect.github.com/nodejs/node/pull/61109)
-
\[[`b6ebf2cd53`](https://redirect.github.com/nodejs/node/commit/b6ebf2cd53)]
- **doc**: add
[@&#8203;avivkeller](https://redirect.github.com/avivkeller) to
collaborators (Aviv Keller)
[#&#8203;61115](https://redirect.github.com/nodejs/node/pull/61115)
-
\[[`35854f424d`](https://redirect.github.com/nodejs/node/commit/35854f424d)]
- **doc**: add gurgunday to collaborators (Gürgün Dayıoğlu)
[#&#8203;61094](https://redirect.github.com/nodejs/node/pull/61094)
-
\[[`4932322c29`](https://redirect.github.com/nodejs/node/commit/4932322c29)]
- **doc**: add File modes cross-references in fs methods (Mohit Raj
Saxena)
[#&#8203;60286](https://redirect.github.com/nodejs/node/pull/60286)
-
\[[`c84904e047`](https://redirect.github.com/nodejs/node/commit/c84904e047)]
- **doc**: add missing `zstd` to mjs example of zlib (Deokjin Kim)
[#&#8203;60915](https://redirect.github.com/nodejs/node/pull/60915)
-
\[[`e615b9e2f2`](https://redirect.github.com/nodejs/node/commit/e615b9e2f2)]
- **doc**: clarify fileURLToPath security considerations (Rafael
Gonzaga)
[#&#8203;60887](https://redirect.github.com/nodejs/node/pull/60887)
-
\[[`99e384e6d4`](https://redirect.github.com/nodejs/node/commit/99e384e6d4)]
- **doc**: replace column with columnNumber in example of
`util.getCallSites` (Deokjin Kim)
[#&#8203;60881](https://redirect.github.com/nodejs/node/pull/60881)
-
\[[`9351bb4d02`](https://redirect.github.com/nodejs/node/commit/9351bb4d02)]
- **doc**: correct spelling in BUILDING.md (Rich Trott)
[#&#8203;60875](https://redirect.github.com/nodejs/node/pull/60875)
-
\[[`e1f6e7fc4d`](https://redirect.github.com/nodejs/node/commit/e1f6e7fc4d)]
- **doc**: update debuglog examples to use 'foo-bar' instead of 'foo'
(xiaoyao)
[#&#8203;60867](https://redirect.github.com/nodejs/node/pull/60867)
-
\[[`ccbb2d7300`](https://redirect.github.com/nodejs/node/commit/ccbb2d7300)]
- **doc**: fix typos in changelogs (Rich Trott)
[#&#8203;60855](https://redirect.github.com/nodejs/node/pull/60855)
-
\[[`1cb2fe8b35`](https://redirect.github.com/nodejs/node/commit/1cb2fe8b35)]
- **doc**: mark module.register as active development (Chengzhong Wu)
[#&#8203;60849](https://redirect.github.com/nodejs/node/pull/60849)
-
\[[`ceeb4968a6`](https://redirect.github.com/nodejs/node/commit/ceeb4968a6)]
- **doc**: add fullName property to SuiteContext (PaulyBearCoding)
[#&#8203;60762](https://redirect.github.com/nodejs/node/pull/60762)
-
\[[`56155909dd`](https://redirect.github.com/nodejs/node/commit/56155909dd)]
- **doc**: keep sidebar module visible when navigating docs (Botato)
[#&#8203;60410](https://redirect.github.com/nodejs/node/pull/60410)
-
\[[`6b637763d5`](https://redirect.github.com/nodejs/node/commit/6b637763d5)]
- **doc**: correct concurrency wording in test() documentation (Azad
Gupta)
[#&#8203;60773](https://redirect.github.com/nodejs/node/pull/60773)
-
\[[`7183e8ffa1`](https://redirect.github.com/nodejs/node/commit/7183e8ffa1)]
- **doc**: clarify that CQ only picks up PRs targeting `main` (René)
[#&#8203;60731](https://redirect.github.com/nodejs/node/pull/60731)
-
\[[`d5d94303be`](https://redirect.github.com/nodejs/node/commit/d5d94303be)]
- **doc**: clarify license section and add contributor note
(KaleruMadhu)
[#&#8203;60590](https://redirect.github.com/nodejs/node/pull/60590)
-
\[[`e0210c8f53`](https://redirect.github.com/nodejs/node/commit/e0210c8f53)]
- **doc**: correct tls ALPNProtocols types (René)
[#&#8203;60143](https://redirect.github.com/nodejs/node/pull/60143)
-
\[[`eff87b498a`](https://redirect.github.com/nodejs/node/commit/eff87b498a)]
- **doc**: remove mention of SMS 2FA (Antoine du Hamel)
[#&#8203;60707](https://redirect.github.com/nodejs/node/pull/60707)
-
\[[`e77ef94a51`](https://redirect.github.com/nodejs/node/commit/e77ef94a51)]
- **doc**: `domain.add()` does not accept timer objects (René)
[#&#8203;60675](https://redirect.github.com/nodejs/node/pull/60675)
-
\[[`4fe19c95ea`](https://redirect.github.com/nodejs/node/commit/4fe19c95ea)]
- **doc**: update Collaborators list to reflect hybrist handle change
(Antoine du Hamel)
[#&#8203;60650](https://redirect.github.com/nodejs/node/pull/60650)
-
\[[`eece59b6ce`](https://redirect.github.com/nodejs/node/commit/eece59b6ce)]
- **doc**: fix linter issues (Antoine du Hamel)
[#&#8203;60636](https://redirect.github.com/nodejs/node/pull/60636)
-
\[[`6e17e596e4`](https://redirect.github.com/nodejs/node/commit/6e17e596e4)]
- **doc**: correct values/references for buffer.kMaxLength (René)
[#&#8203;60305](https://redirect.github.com/nodejs/node/pull/60305)
-
\[[`ac327ae9a7`](https://redirect.github.com/nodejs/node/commit/ac327ae9a7)]
- **doc**: recommend events.once to manage 'close' event (Dan Fabulich)
[#&#8203;60017](https://redirect.github.com/nodejs/node/pull/60017)
-
\[[`d9b149ea42`](https://redirect.github.com/nodejs/node/commit/d9b149ea42)]
- **doc**: highlight module loading difference between import and
require (Ajay A)
[#&#8203;59815](https://redirect.github.com/nodejs/node/pull/59815)
-
\[[`f6d62cb22c`](https://redirect.github.com/nodejs/node/commit/f6d62cb22c)]
- **doc**: fix typo in `process.unref` documentation (우혁)
[#&#8203;59698](https://redirect.github.com/nodejs/node/pull/59698)
-
\[[`6d5078b196`](https://redirect.github.com/nodejs/node/commit/6d5078b196)]
- **doc**: add some entries to `glossary.md` (Mohataseem Khan)
[#&#8203;59277](https://redirect.github.com/nodejs/node/pull/59277)
-
\[[`b0a5820dea`](https://redirect.github.com/nodejs/node/commit/b0a5820dea)]
- **doc**: improve agent.createConnection docs for http and https agents
(JaeHo Jang)
[#&#8203;58205](https://redirect.github.com/nodejs/node/pull/58205)
-
\[[`b5db02fe67`](https://redirect.github.com/nodejs/node/commit/b5db02fe67)]
- **doc**: fix pseudo code in modules.md (chirsz)
[#&#8203;57677](https://redirect.github.com/nodejs/node/pull/57677)
-
\[[`e9b912d481`](https://redirect.github.com/nodejs/node/commit/e9b912d481)]
- **doc**: add missing variable in code snippet (Koushil Mankali)
[#&#8203;55478](https://redirect.github.com/nodejs/node/pull/55478)
-
\[[`44c06c7812`](https://redirect.github.com/nodejs/node/commit/44c06c7812)]
- **doc**: add missing word in `single-executable-applications.md`
(Konstantin Tsabolov)
[#&#8203;53864](https://redirect.github.com/nodejs/node/pull/53864)
-
\[[`482b43f160`](https://redirect.github.com/nodejs/node/commit/482b43f160)]
- **doc**: fix typo in http.md (Michael Solomon)
[#&#8203;59354](https://redirect.github.com/nodejs/node/pull/59354)
-
\[[`cd323bc718`](https://redirect.github.com/nodejs/node/commit/cd323bc718)]
- **doc**: update devcontainer.json and add documentation (Joyee Cheung)
[#&#8203;60472](https://redirect.github.com/nodejs/node/pull/60472)
-
\[[`c7c70f3a16`](https://redirect.github.com/nodejs/node/commit/c7c70f3a16)]
- **doc**: add haramj as triager (Haram Jeong)
[#&#8203;60348](https://redirect.github.com/nodejs/node/pull/60348)
-
\[[`04b8c4d14e`](https://redirect.github.com/nodejs/node/commit/04b8c4d14e)]
- **doc**: clarify require(esm) description (dynst)
[#&#8203;60520](https://redirect.github.com/nodejs/node/pull/60520)
-
\[[`de382dc832`](https://redirect.github.com/nodejs/node/commit/de382dc832)]
- **doc**: instantiate resolver object (Donghoon Nam)
[#&#8203;60476](https://redirect.github.com/nodejs/node/pull/60476)
-
\[[`b6845ce460`](https://redirect.github.com/nodejs/node/commit/b6845ce460)]
- **doc**: clarify --use-system-ca support status (Joyee Cheung)
[#&#8203;60340](https://redirect.github.com/nodejs/node/pull/60340)
-
\[[`0894dae9bc`](https://redirect.github.com/nodejs/node/commit/0894dae9bc)]
- **doc**: add missing CAA type to dns.resolveAny() &
dnsPromises.resolveAny() (Jimmy Leung)
[#&#8203;58899](https://redirect.github.com/nodejs/node/pull/58899)
-
\[[`c86a69f692`](https://redirect.github.com/nodejs/node/commit/c86a69f692)]
- **doc**: use `any` for `worker_threads.Worker` 'error' event argument
`err` (Jonas Geiler)
[#&#8203;60300](https://redirect.github.com/nodejs/node/pull/60300)
-
\[[`0c5031e233`](https://redirect.github.com/nodejs/node/commit/0c5031e233)]
- **doc**: update decorator documentation to reflect actual policy
(Muhammad Salman Aziz)
[#&#8203;60288](https://redirect.github.com/nodejs/node/pull/60288)
-
\[[`b01f710175`](https://redirect.github.com/nodejs/node/commit/b01f710175)]
- **doc**: document wildcard supported by tools/test.py (Joyee Cheung)
[#&#8203;60265](https://redirect.github.com/nodejs/node/pull/60265)
-
\[[`b4524dabcc`](https://redirect.github.com/nodejs/node/commit/b4524dabcc)]
- **doc**: fix `blob.bytes()` heading level (XTY)
[#&#8203;60252](https://redirect.github.com/nodejs/node/pull/60252)
-
\[[`5df02776e3`](https://redirect.github.com/nodejs/node/commit/5df02776e3)]
- **doc**: fix not working code example in vm docs (Artur Gawlik)
[#&#8203;60224](https://redirect.github.com/nodejs/node/pull/60224)
-
\[[`6a4359a0b5`](https://redirect.github.com/nodejs/node/commit/6a4359a0b5)]
- **doc**: improve code snippet alternative of url.parse() using WHATWG
URL (Steven)
[#&#8203;60209](https://redirect.github.com/nodejs/node/pull/60209)
-
\[[`ad06bee70d`](https://redirect.github.com/nodejs/node/commit/ad06bee70d)]
- **doc**: use markdown when branch-diff major release (Rafael Gonzaga)
[#&#8203;60179](https://redirect.github.com/nodejs/node/pull/60179)
-
\[[`c0d4b11ed4`](https://redirect.github.com/nodejs/node/commit/c0d4b11ed4)]
- **doc**: update teams in collaborator-guide.md and add links (Bart
Louwers)
[#&#8203;60065](https://redirect.github.com/nodejs/node/pull/60065)
-
\[[`20b5ffcac3`](https://redirect.github.com/nodejs/node/commit/20b5ffcac3)]
- **doc**: update previous version links in BUILDING (Mike McCready)
[#&#8203;61457](https://redirect.github.com/nodejs/node/pull/61457)
-
\[[`de345ea3a3`](https://redirect.github.com/nodejs/node/commit/de345ea3a3)]
- **doc**: correct description of `error.stack` accessor behavior (René)
[#&#8203;61090](https://redirect.github.com/nodejs/node/pull/61090)
-
\[[`d8418d9de7`](https://redirect.github.com/nodejs/node/commit/d8418d9de7)]
- **doc**: fix link in `--env-file=file` section (N. Bighetti)
[#&#8203;60563](https://redirect.github.com/nodejs/node/pull/60563)
-
\[[`1107bda21e`](https://redirect.github.com/nodejs/node/commit/1107bda21e)]
- **doc**: fix v22 changelog after security release (Marco Ippolito)
[#&#8203;61371](https://redirect.github.com/nodejs/node/pull/61371)
-
\[[`42aab9469a`](https://redirect.github.com/nodejs/node/commit/42aab9469a)]
- **doc**: add missing history entry for `sqlite.md` (Antoine du Hamel)
[#&#8203;60607](https://redirect.github.com/nodejs/node/pull/60607)
-
\[[`deb6d5deff`](https://redirect.github.com/nodejs/node/commit/deb6d5deff)]
- **doc, module**: change async customization hooks to experimental
(Gerhard Stöbich)
[#&#8203;60302](https://redirect.github.com/nodejs/node/pull/60302)
-
\[[`c659add7d1`](https://redirect.github.com/nodejs/node/commit/c659add7d1)]
- **doc,src,lib**: clarify experimental status of Web Storage support
(Antoine du Hamel)
[#&#8203;60708](https://redirect.github.com/nodejs/node/pull/60708)
-
\[[`dda95e91b9`](https://redirect.github.com/nodejs/node/commit/dda95e91b9)]
- **esm**: avoid throw when module specifier is not url (Craig Macomber
(Microsoft))
[#&#8203;61000](https://redirect.github.com/nodejs/node/pull/61000)
-
\[[`912945be89`](https://redirect.github.com/nodejs/node/commit/912945be89)]
- **events**: remove redundant todo (Gürgün Dayıoğlu)
[#&#8203;60595](https://redirect.github.com/nodejs/node/pull/60595)
-
\[[`22e156eb10`](https://redirect.github.com/nodejs/node/commit/22e156eb10)]
- **events**: remove eventtarget custom inspect branding (Efe)
[#&#8203;61128](https://redirect.github.com/nodejs/node/pull/61128)
-
\[[`df6fd9b03f`](https://redirect.github.com/nodejs/node/commit/df6fd9b03f)]
- **fs**: remove duplicate getValidatedPath calls (Mert Can Altin)
[#&#8203;61359](https://redirect.github.com/nodejs/node/pull/61359)
-
\[[`6ea3e4d850`](https://redirect.github.com/nodejs/node/commit/6ea3e4d850)]
- **fs**: fix errorOnExist behavior for directory copy in fs.cp
(Nicholas Paun)
[#&#8203;60946](https://redirect.github.com/nodejs/node/pull/60946)
-
\[[`dd918b9980`](https://redirect.github.com/nodejs/node/commit/dd918b9980)]
- **fs**: fix ENOTDIR in globSync when file is treated as dir (sangwook)
[#&#8203;61259](https://redirect.github.com/nodejs/node/pull/61259)
-
\[[`4908e67ba0`](https://redirect.github.com/nodejs/node/commit/4908e67ba0)]
- **fs**: remove duplicate fd validation in sync functions (Mert Can
Altin)
[#&#8203;61361](https://redirect.github.com/nodejs/node/pull/61361)
-
\[[`4a27bce3d9`](https://redirect.github.com/nodejs/node/commit/4a27bce3d9)]
- **fs**: detect dot files when using globstar (Robin van Wijngaarden)
[#&#8203;61012](https://redirect.github.com/nodejs/node/pull/61012)
-
\[[`b0186ff65c`](https://redirect.github.com/nodejs/node/commit/b0186ff65c)]
- **fs**: validate statfs path (Efe)
[#&#8203;61230](https://redirect.github.com/nodejs/node/pull/61230)
-
\[[`6689775023`](https://redirect.github.com/nodejs/node/commit/6689775023)]
- **gyp**: aix: change gcc version detection so CXX="ccache g++" works
(Stewart X Addison)
[#&#8203;61464](https://redirect.github.com/nodejs/node/pull/61464)
-
\[[`5c4f4db663`](https://redirect.github.com/nodejs/node/commit/5c4f4db663)]
- **http**: fix rawHeaders exceeding maxHeadersCount limit (Max Harari)
[#&#8203;61285](https://redirect.github.com/nodejs/node/pull/61285)
-
\[[`7599e2eccd`](https://redirect.github.com/nodejs/node/commit/7599e2eccd)]
- **http**: replace startsWith with strict equality (btea)
[#&#8203;59394](https://redirect.github.com/nodejs/node/pull/59394)
-
\[[`99a85213bf`](https://redirect.github.com/nodejs/node/commit/99a85213bf)]
- **http**: lazy allocate cookies array (Robert Nagy)
[#&#8203;59734](https://redirect.github.com/nodejs/node/pull/59734)
-
\[[`7669e6a5ad`](https://redirect.github.com/nodejs/node/commit/7669e6a5ad)]
- **http**: fix http client leaky with double response (theanarkh)
[#&#8203;60062](https://redirect.github.com/nodejs/node/pull/60062)
-
\[[`f074c126a8`](https://redirect.github.com/nodejs/node/commit/f074c126a8)]
- **http,https**: fix double ERR\_PROXY\_TUNNEL emission (Shima Ryuhei)
[#&#8203;60699](https://redirect.github.com/nodejs/node/pull/60699)
-
\[[`d8ac368363`](https://redirect.github.com/nodejs/node/commit/d8ac368363)]
- **http2**: add diagnostics channels for client stream request body
(Darshan Sen)
[#&#8203;60480](https://redirect.github.com/nodejs/node/pull/60480)
-
\[[`e26a7e464d`](https://redirect.github.com/nodejs/node/commit/e26a7e464d)]
- **http2**: rename variable to additionalPseudoHeaders (Tobias Nießen)
[#&#8203;60208](https://redirect.github.com/nodejs/node/pull/60208)
-
\[[`5df634f46e`](https://redirect.github.com/nodejs/node/commit/5df634f46e)]
- **http2**: validate initialWindowSize per HTTP/2 spec (Matteo Collina)
[#&#8203;61402](https://redirect.github.com/nodejs/node/pull/61402)
-
\[[`2ccc9a6205`](https://redirect.github.com/nodejs/node/commit/2ccc9a6205)]
- **http2**: do not crash on mismatched ping buffer length (René)
[#&#8203;60135](https://redirect.github.com/nodejs/node/pull/60135)
-
\[[`3e68a5f78a`](https://redirect.github.com/nodejs/node/commit/3e68a5f78a)]
- **inspector**: inspect HTTP response body (Chengzhong Wu)
[#&#8203;60572](https://redirect.github.com/nodejs/node/pull/60572)
-
\[[`a86ffa9a5d`](https://redirect.github.com/nodejs/node/commit/a86ffa9a5d)]
- **inspector**: add network payload buffer size limits (Chengzhong Wu)
[#&#8203;60236](https://redirect.github.com/nodejs/node/pull/60236)
-
\[[`ea60ef5d74`](https://redirect.github.com/nodejs/node/commit/ea60ef5d74)]
- **lib**: fix typo in `util.js` comment (Taejin Kim)
[#&#8203;61365](https://redirect.github.com/nodejs/node/pull/61365)
-
\[[`9d8d9322a4`](https://redirect.github.com/nodejs/node/commit/9d8d9322a4)]
- **lib**: fix TypeScript support check in jitless mode (sangwook)
[#&#8203;61382](https://redirect.github.com/nodejs/node/pull/61382)
-
\[[`fc26f5c78f`](https://redirect.github.com/nodejs/node/commit/fc26f5c78f)]
- **lib**: gbk decoder is gb18030 decoder per spec (Сковорода Никита
Андреевич)
[#&#8203;61099](https://redirect.github.com/nodejs/node/pull/61099)
-
\[[`3b87030012`](https://redirect.github.com/nodejs/node/commit/3b87030012)]
- **lib**: enforce use of `URLParse` (Antoine du Hamel)
[#&#8203;61016](https://redirect.github.com/nodejs/node/pull/61016)
-
\[[`2a7479d4fc`](https://redirect.github.com/nodejs/node/commit/2a7479d4fc)]
- **lib**: use `FastBuffer` for empty buffer allocation (Gürgün
Dayıoğlu)
[#&#8203;60558](https://redirect.github.com/nodejs/node/pull/60558)
-
\[[`7cf4c43582`](https://redirect.github.com/nodejs/node/commit/7cf4c43582)]
- **lib**: fix constructor in \_errnoException stack tree (SeokHun)
[#&#8203;60156](https://redirect.github.com/nodejs/node/pull/60156)
-
\[[`f9d87fbfaa`](https://redirect.github.com/nodejs/node/commit/f9d87fbfaa)]
- **lib**: fix typo in QuicSessionStats (SeokHun)
[#&#8203;60155](https://redirect.github.com/nodejs/node/pull/60155)
-
\[[`8d26ccc652`](https://redirect.github.com/nodejs/node/commit/8d26ccc652)]
- **lib**: remove redundant destroyHook checks (Gürgün Dayıoğlu)
[#&#8203;60120](https://redirect.github.com/nodejs/node/pull/60120)
-
\[[`705832a1be`](https://redirect.github.com/nodejs/node/commit/705832a1be)]
- **lib,src**: isInsideNodeModules should test on the first non-internal
frame (Chengzhong Wu)
[#&#8203;60991](https://redirect.github.com/nodejs/node/pull/60991)
-
\[[`6f39ad190b`](https://redirect.github.com/nodejs/node/commit/6f39ad190b)]
- **meta**: do not fast-track npm updates (Antoine du Hamel)
[#&#8203;61475](https://redirect.github.com/nodejs/node/pull/61475)
-
\[[`a6a0ff9486`](https://redirect.github.com/nodejs/node/commit/a6a0ff9486)]
- **meta**: fix typos in issue template config (Daijiro Wachi)
[#&#8203;61399](https://redirect.github.com/nodejs/node/pull/61399)
-
\[[`ec88c9b378`](https://redirect.github.com/nodejs/node/commit/ec88c9b378)]
- **meta**: label v8 module PRs (René)
[#&#8203;61325](https://redirect.github.com/nodejs/node/pull/61325)
-
\[[`83143835de`](https://redirect.github.com/nodejs/node/commit/83143835de)]
- **meta**: bump step-security/harden-runner from 2.13.2 to 2.14.0
(dependabot\[bot])
[#&#8203;61245](https://redirect.github.com/nodejs/node/pull/61245)
-
\[[`0802dc663a`](https://redirect.github.com/nodejs/node/commit/0802dc663a)]
- **meta**: bump actions/setup-node from 6.0.0 to 6.1.0
(dependabot\[bot])
[#&#8203;61244](https://redirect.github.com/nodejs/node/pull/61244)
-
\[[`587db55796`](https://redirect.github.com/nodejs/node/commit/587db55796)]
- **meta**: bump actions/cache from 4.3.0 to 5.0.1 (dependabot\[bot])
[#&#8203;61243](https://redirect.github.com/nodejs/node/pull/61243)
-
\[[`262c9d37a6`](https://redirect.github.com/nodejs/node/commit/262c9d37a6)]
- **meta**: bump github/codeql-action from 4.31.6 to 4.31.9
(dependabot\[bot])
[#&#8203;61241](https://redirect.github.com/nodejs/node/pull/61241)
-
\[[`d9763b5afd`](https://redirect.github.com/nodejs/node/commit/d9763b5afd)]
- **meta**: bump codecov/codecov-action from 5.5.1 to 5.5.2
(dependabot\[bot])
[#&#8203;61240](https://redirect.github.com/nodejs/node/pull/61240)
-
\[[`0af73d1811`](https://redirect.github.com/nodejs/node/commit/0af73d1811)]
- **meta**: bump peter-evans/create-pull-request from 7.0.9 to 8.0.0
(dependabot\[bot])
[#&#8203;61237](https://redirect.github.com/nodejs/node/pull/61237)
-
\[[`8be6afd239`](https://redirect.github.com/nodejs/node/commit/8be6afd239)]
- **meta**: move lukekarrys to emeritus (Node.js GitHub Bot)
[#&#8203;60985](https://redirect.github.com/nodejs/node/pull/60985)
-
\[[`c497de5c74`](https://redirect.github.com/nodejs/node/commit/c497de5c74)]
- **meta**: bump actions/setup-python from 6.0.0 to 6.1.0
(dependabot\[bot])
[#&#8203;60927](https://redirect.github.com/nodejs/node/pull/60927)
-
\[[`774920f169`](https://redirect.github.com/nodejs/node/commit/774920f169)]
- **meta**: bump github/codeql-action from 4.31.3 to 4.31.6
(dependabot\[bot])
[#&#8203;60926](https://redirect.github.com/nodejs/node/pull/60926)
-
\[[`ef3b1e5991`](https://redirect.github.com/nodejs/node/commit/ef3b1e5991)]
- **meta**: bump peter-evans/create-pull-request from 7.0.8 to 7.0.9
(dependabot\[bot])
[#&#8203;60924](https://redirect.github.com/nodejs/node/pull/60924)
-
\[[`3ed667379f`](https://redirect.github.com/nodejs/node/commit/3ed667379f)]
- **meta**: bump github/codeql-action from 4.31.2 to 4.31.3
(dependabot\[bot])
[#&#8203;60770](https://redirect.github.com/nodejs/node/pull/60770)
-
\[[`7c0cefb126`](https://redirect.github.com/nodejs/node/commit/7c0cefb126)]
- **meta**: bump step-security/harden-runner from 2.13.1 to 2.13.2
(dependabot\[bot])
[#&#8203;60769](https://redirect.github.com/nodejs/node/pull/60769)
-
\[[`5c6a076e5d`](https://redirect.github.com/nodejs/node/commit/5c6a076e5d)]
- **meta**: add Renegade334 to collaborators (Renegade334)
[#&#8203;60714](https://redirect.github.com/nodejs/node/pull/60714)
-
\[[`4f4dda2a18`](https://redirect.github.com/nodejs/node/commit/4f4dda2a18)]
- **meta**: bump actions/download-artifact from 5.0.0 to 6.0.0
(dependabot\[bot])
[#&#8203;60532](https://redirect.github.com/nodejs/node/pull/60532)
-
\[[`c436f8d57c`](https://redirect.github.com/nodejs/node/commit/c436f8d57c)]
- **meta**: bump actions/upload-artifact from 4.6.2 to 5.0.0
(dependabot\[bot])
[#&#8203;60531](https://redirect.github.com/nodejs/node/pull/60531)
-
\[[`402d9f87a6`](https://redirect.github.com/nodejs/node/commit/402d9f87a6)]
- **meta**: bump github/codeql-action from 3.30.5 to 4.31.2
(dependabot\[bot])
[#&#8203;60533](https://redirect.github.com/nodejs/node/pull/60533)
-
\[[`61be78e326`](https://redirect.github.com/nodejs/node/commit/61be78e326)]
- **meta**: bump actions/setup-node from 5.0.0 to 6.0.0
(dependabot\[bot])
[#&#8203;60529](https://redirect.github.com/nodejs/node/pull/60529)
-
\[[`7e4164a623`](https://redirect.github.com/nodejs/node/commit/7e4164a623)]
- **meta**: bump actions/stale from 10.0.0 to 10.1.0 (dependabot\[bot])
[#&#8203;60528](https://redirect.github.com/nodejs/node/pull/60528)
-
\[[`1bf6e1d010`](https://redirect.github.com/nodejs/node/commit/1bf6e1d010)]
- **meta**: move one or more collaborators to emeritus (Node.js GitHub
Bot) [#&#8203;60325](https://redirect.github.com/nodejs/node/pull/60325)
-
\[[`c66fc0e9cf`](https://redirect.github.com/nodejs/node/commit/c66fc0e9cf)]
- **meta**: loop userland-migrations in deprecations (Chengzhong Wu)
[#&#8203;60299](https://redirect.github.com/nodejs/node/pull/60299)
-
\[[`e4be0791e7`](https://redirect.github.com/nodejs/node/commit/e4be0791e7)]
- **meta**: call `create-release-post.yml` post release (Aviv Keller)
[#&#8203;60366](https://redirect.github.com/nodejs/node/pull/60366)
-
\[[`8674f6527f`](https://redirect.github.com/nodejs/node/commit/8674f6527f)]
- **module**: preserve URL in the parent created by createRequire()
(Joyee Cheung)
[#&#8203;60974](https://redirect.github.com/nodejs/node/pull/60974)
-
\[[`41db87a975`](https://redirect.github.com/nodejs/node/commit/41db87a975)]
- **msi**: fix WiX warnings (Stefan Stojanovic)
[#&#8203;60251](https://redirect.github.com/nodejs/node/pull/60251)
-
\[[`884f313f40`](https://redirect.github.com/nodejs/node/commit/884f313f40)]
- **node-api**: use Node-API in comments (Vladimir Morozov)
[#&#8203;61320](https://redirect.github.com/nodejs/node/pull/61320)
-
\[[`375164190b`](https://redirect.github.com/nodejs/node/commit/375164190b)]
- **node-api**: use local files for instanceof test (Vladimir Morozov)
[#&#8203;60190](https://redirect.github.com/nodejs/node/pull/60190)
-
\[[`972a1107c0`](https://redirect.github.com/nodejs/node/commit/972a1107c0)]
- **os**: freeze signals constant (Xavier Stouder)
[#&#8203;61038](https://redirect.github.com/nodejs/node/pull/61038)
-
\[[`e992057ab7`](https://redirect.github.com/nodejs/node/commit/e992057ab7)]
- **perf\_hooks**: fix stack overflow error (Antoine du Hamel)
[#&#8203;60084](https://redirect.github.com/nodejs/node/pull/60084)
-
\[[`0bb1814fdf`](https://redirect.github.com/nodejs/node/commit/0bb1814fdf)]
- **repl**: fix pasting after moving the cursor to the left (Ruben
Bridgewater)
[#&#8203;60470](https://redirect.github.com/nodejs/node/pull/60470)
-
\[[`35a12fb996`](https://redirect.github.com/nodejs/node/commit/35a12fb996)]
- **src**: replace `ranges::sort` for libc++13 compatibility on armhf
(Rebroad)
[#&#8203;61789](https://redirect.github.com/nodejs/node/pull/61789)
-
\[[`dbf00d4664`](https://redirect.github.com/nodejs/node/commit/dbf00d4664)]
- **src**: add missing override specifier to Clean() (Tobias Nießen)
[#&#8203;61429](https://redirect.github.com/nodejs/node/pull/61429)
-
\[[`140eba35d3`](https://redirect.github.com/nodejs/node/commit/140eba35d3)]
- **src**: cache context lookup in vectored io loops (Mert Can Altin)
[#&#8203;61387](https://redirect.github.com/nodejs/node/pull/61387)
-
\[[`93e7e1708b`](https://redirect.github.com/nodejs/node/commit/93e7e1708b)]
- **src**: use C++ nullptr in webstorage (Tobias Nießen)
[#&#8203;61407](https://redirect.github.com/nodejs/node/pull/61407)
-
\[[`ef868447bc`](https://redirect.github.com/nodejs/node/commit/ef868447bc)]
- **src**: fix pointer alignment (jhofstee)
[#&#8203;61336](https://redirect.github.com/nodejs/node/pull/61336)
-
\[[`a96256524c`](https://redirect.github.com/nodejs/node/commit/a96256524c)]
- **src**: dump snapshot source with
node:generate\_default\_snapshot\_source (Joyee Cheung)
[#&#8203;61101](https://redirect.github.com/nodejs/node/pull/61101)
-
\[[`ec051b9efd`](https://redirect.github.com/nodejs/node/commit/ec051b9efd)]
- **src**: add HandleScope to edge loop in heap\_utils (Mert Can Altin)
[#&#8203;60885](https://redirect.github.com/nodejs/node/pull/60885)
-
\[[`41749eb5d6`](https://redirect.github.com/nodejs/node/commit/41749eb5d6)]
- **src**: remove redundant CHECK (Tobias Nießen)
[#&#8203;61130](https://redirect.github.com/nodejs/node/pull/61130)
-
\[[`57c81e5af3`](https://redirect.github.com/nodejs/node/commit/57c81e5af3)]
- **src**: fix off-thread cert loading in bundled cert mode (Joyee
Cheung)
[#&#8203;60764](https://redirect.github.com/nodejs/node/pull/60764)
-
\[[`4b0616e024`](https://redirect.github.com/nodejs/node/commit/4b0616e024)]
- **src**: handle DER decoding errors from system certificates (Joyee
Cheung)
[#&#8203;60787](https://redirect.github.com/nodejs/node/pull/60787)
-
\[[`93393371f9`](https://redirect.github.com/nodejs/node/commit/93393371f9)]
- **src**: use static\_cast instead of C-style cast (Michaël Zasso)
[#&#8203;60868](https://redirect.github.com/nodejs/node/pull/60868)
-
\[[`900445b655`](https://redirect.github.com/nodejs/node/commit/900445b655)]
- **src**: move Node-API version detection to where it is used (Anna
Henningsen)
[#&#8203;60512](https://redirect.github.com/nodejs/node/pull/60512)
-
\[[`8353a6da2a`](https://redirect.github.com/nodejs/node/commit/8353a6da2a)]
- **src**: avoid C strings in more C++ exception throws (Anna
Henningsen)
[#&#8203;60592](https://redirect.github.com/nodejs/node/pull/60592)
-
\[[`27c860c51f`](https://redirect.github.com/nodejs/node/commit/27c860c51f)]
- **src**: move `napi_addon_register_func` to `node_api_types.h` (Anna
Henningsen)
[#&#8203;60512](https://redirect.github.com/nodejs/node/pull/60512)
-
\[[`e0517752e7`](https://redirect.github.com/nodejs/node/commit/e0517752e7)]
- **src**: remove unconditional NAPI\_EXPERIMENTAL in node.h (Chengzhong
Wu) [#&#8203;60345](https://redirect.github.com/nodejs/node/pull/60345)
-
\[[`21e2a52f8e`](https://redirect.github.com/nodejs/node/commit/21e2a52f8e)]
- **src**: clean up generic counter implementation (Anna Henningsen)
[#&#8203;60447](https://redirect.github.com/nodejs/node/pull/60447)
-
\[[`aed23cb8ca`](https://redirect.github.com/nodejs/node/commit/aed23cb8ca)]
- **src**: add enum handle for ToStringHelper + formatting (Burkov Egor)
[#&#8203;56829](https://redirect.github.com/nodejs/node/pull/56829)
-
\[[`2e93650ebc`](https://redirect.github.com/nodejs/node/commit/2e93650ebc)]
- **src**: fix timing of snapshot serialize callback (Joyee Cheung)
[#&#8203;60434](https://redirect.github.com/nodejs/node/pull/60434)
-
\[[`ece4acc18f`](https://redirect.github.com/nodejs/node/commit/ece4acc18f)]
- **src**: add COUNT\_GENERIC\_USAGE utility for tests (Joyee Cheung)
[#&#8203;60434](https://redirect.github.com/nodejs/node/pull/60434)
-
\[[`31c8e9d9ff`](https://redirect.github.com/nodejs/node/commit/31c8e9d9ff)]
- **src**: use cached primordials\_string (Sohyeon Kim)
[#&#8203;60255](https://redirect.github.com/nodejs/node/pull/60255)
-
\[[`7f0ffddc14`](https://redirect.github.com/nodejs/node/commit/7f0ffddc14)]
- **src**: implement Windows-1252 encoding support and update related
tests (Mert Can Altin)
[#&#8203;60893](https://redirect.github.com/nodejs/node/pull/60893)
-
\[[`c2ba56d6b2`](https://redirect.github.com/nodejs/node/commit/c2ba56d6b2)]
- **src,permission**: fix permission.has on empty param (Rafael Gonzaga)
[#&#8203;60674](https://redirect.github.com/nodejs/node/pull/60674)
-
\[[`e55a2b895a`](https://redirect.github.com/nodejs/node/commit/e55a2b895a)]
- **src,permission**: add debug log on is\_tree\_granted (Rafael
Gonzaga)
[#&#8203;60668](https://redirect.github.com/nodejs/node/pull/60668)
-
\[[`902a78b43c`](https://redirect.github.com/nodejs/node/commit/902a78b43c)]
- **stream**: fix isErrored/isWritable for WritableStreams (René)
[#&#8203;60905](https://redirect.github.com/nodejs/node/pull/60905)
-
\[[`221b77cf41`](https://redirect.github.com/nodejs/node/commit/221b77cf41)]
- **stream**: don't try to read more if reading (Robert Nagy)
[#&#8203;60454](https://redirect.github.com/nodejs/node/pull/60454)
-
\[[`46d12d826f`](https://redirect.github.com/nodejs/node/commit/46d12d826f)]
- **test**: skip strace test with shared openssl (Richard Lau)
[#&#8203;61987](https://redirect.github.com/nodejs/node/pull/61987)
-
\[[`52e6b01a44`](https://redirect.github.com/nodejs/node/commit/52e6b01a44)]
- **test**: mark `test-strace-openat-openssl` as flaky (Antoine du
Hamel)
[#&#8203;61921](https://redirect.github.com/nodejs/node/pull/61921)
-
\[[`4d7468d0e0`](https://redirect.github.com/nodejs/node/commit/4d7468d0e0)]
- **test**: skip --build-sea tests on platforms where SEA is flaky
(Joyee Cheung)
[#&#8203;61504](https://redirect.github.com/nodejs/node/pull/61504)
-
\[[`f604b7ae67`](https://redirect.github.com/nodejs/node/commit/f604b7ae67)]
- **test**: fix flaky debugger test (Ryuhei Shima)
[#&#8203;58324](https://redirect.github.com/nodejs/node/pull/58324)
-
\[[`fc2dc4024b`](https://redirect.github.com/nodejs/node/commit/fc2dc4024b)]
- **test**: ensure removeListener event fires for once() listeners
(sangwook)
[#&#8203;60137](https://redirect.github.com/nodejs/node/pull/60137)
-
\[[`5fba382816`](https://redirect.github.com/nodejs/node/commit/5fba382816)]
- **test**: delay writing the files only on macOS (Luigi Pinca)
[#&#8203;61532](https://redirect.github.com/nodejs/node/pull/61532)
- \[[`85cc9e20e4`](https://red

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/toeverything/AFFiNE).


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-03-09 18:16:47 +08:00
Kenneth Wußmann
0b47f92134 fix(oidc): allow string boolean in email_verified userinfo schema (#14609)
## Why

When using AWS Cognito as the OIDC provider, AFFiNE returns a zod parsing
error because AWS returns `email_verified` as a string in the userinfo
response.

```json
{
    "sub": "[UUID]",
    "email_verified": "true",
    "custom:mycustom1": "CustomValue",
    "phone_number_verified": "true",
    "phone_number": "+12065551212",
    "email": "bob@example.com",
    "username": "bob"
}
```

Reference:
https://docs.aws.amazon.com/cognito/latest/developerguide/userinfo-endpoint.html#get-userinfo-response-sample

Error returned in AFFiNE frontend:
```
Validation error, errors: [ { "code": "invalid_type", "expected": "boolean", "received": "string", "path": [ "email_verified" ], "message": "Expected boolean, received string" } ]
```

## What

I'm adjusting the existing `OIDCUserInfoSchema` to allow `z.boolean()`
and `z.enum(['true', 'false', '0', '1', 'yes', 'no'])`.
This matches [our `extractBoolean` function in the
`OIDCProvider`](82e6239957/packages/backend/server/src/plugins/oauth/providers/oidc.ts (L269-L285)),
which already parses strings as booleans in `email_verified`. But because
the userinfo response is validated with zod first, it fails before
reaching our `extractBoolean`.

> [!NOTE]
> We are using zod v3. In zod v4 they [added support for
`z.stringbool()`](https://zod.dev/api?id=stringbool) which would make
this easier.
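
The string-boolean coercion involved can be sketched in plain TypeScript. This is a hypothetical helper for illustration only, not the actual `extractBoolean` source; the accepted string values mirror the enum listed above:

```typescript
// Hypothetical sketch of string-boolean coercion for OIDC userinfo
// fields such as `email_verified`. Not the actual AFFiNE implementation.
type BoolLike = boolean | string | undefined;

function coerceBoolean(value: BoolLike): boolean | undefined {
  if (typeof value === 'boolean') return value;
  if (typeof value === 'string') {
    const v = value.toLowerCase();
    // Cognito (and some other IdPs) return "true"/"false" as strings.
    if (['true', '1', 'yes'].includes(v)) return true;
    if (['false', '0', 'no'].includes(v)) return false;
  }
  // Unknown representation: report as undefined rather than guessing.
  return undefined;
}

// Example: a Cognito-style userinfo payload with a string value.
const userinfo = { email: 'bob@example.com', email_verified: 'true' };
console.log(coerceBoolean(userinfo.email_verified)); // true
```

Widening the zod schema to accept these strings lets the payload pass validation so that the existing coercion step can run afterwards.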


<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->

## Summary by CodeRabbit

## Release Notes

* **Bug Fixes**
* Enhanced OpenID Connect provider authentication to accept flexible
formats for email verification status, including various string
representations alongside boolean values.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-09 02:53:43 +00:00
DarkSky
9c55edeb62 feat(server): adapt gemini3.1 preview (#14583)
#### PR Dependency Tree


* **PR #14583** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* Added Gemini 3.1 Pro Preview support (text, image, audio) and new
GPT‑5 variants as defaults; centralized persistent telemetry state for
more reliable client identity.

* **UX**
  * Improved model submenu placement in chat preferences.
* More robust mindmap parsing, preview, regeneration and replace
behavior.

* **Chores**
  * Bumped AI SDK and related dependencies.

* **Tests**
  * Expanded/updated tests and increased timeouts for flaky flows.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-08 00:53:16 +08:00
DarkSky
9742e9735e feat(editor): improve edgeless perf & memory usage (#14591)
#### PR Dependency Tree


* **PR #14591** 👈

This tree was auto-generated by
[Charcoal](https://github.com/danerwilliams/charcoal)

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

* **New Features**
* New canvas renderer debug metrics and controls for runtime inspection.
* Mindmap/group reordering now normalizes group targets, improving
reorder consistency.

* **Bug Fixes**
  * Fixed connector behavior for empty/degenerate paths.
* More aggressive viewport invalidation so structural changes display
correctly.
* Improved z-index synchronization during transforms and layer updates.

* **Performance**
* Retained DOM caching for brushes, shapes, and connectors to reduce DOM
churn.
* Targeted canvas refreshes, pooling, and reuse to lower redraw and
memory overhead.

* **Tests**
* Added canvas renderer performance benchmarks and curve edge-case unit
tests.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
2026-03-07 09:12:14 +08:00
103 changed files with 7097 additions and 3501 deletions


@@ -19,3 +19,8 @@ rustflags = [
# pthread_key_create() destructors and segfault after a DSO unloading
[target.'cfg(all(target_env = "gnu", not(target_os = "windows")))']
rustflags = ["-C", "link-args=-Wl,-z,nodelete"]
# Temporary local llm_adapter override.
# Uncomment when verifying AFFiNE against the sibling llm_adapter workspace.
# [patch.crates-io]
# llm_adapter = { path = "../llm_adapter" }


@@ -971,7 +971,7 @@
},
"scenarios": {
"type": "object",
"description": "Use custom models in scenarios and override default settings.\n@default {\"override_enabled\":false,\"scenarios\":{\"audio_transcribing\":\"gemini-2.5-flash\",\"chat\":\"gemini-2.5-flash\",\"embedding\":\"gemini-embedding-001\",\"image\":\"gpt-image-1\",\"rerank\":\"gpt-4.1\",\"coding\":\"claude-sonnet-4-5@20250929\",\"complex_text_generation\":\"gpt-4o-2024-08-06\",\"quick_decision_making\":\"gpt-5-mini\",\"quick_text_generation\":\"gemini-2.5-flash\",\"polish_and_summarize\":\"gemini-2.5-flash\"}}",
"description": "Use custom models in scenarios and override default settings.\n@default {\"override_enabled\":false,\"scenarios\":{\"audio_transcribing\":\"gemini-2.5-flash\",\"chat\":\"gemini-2.5-flash\",\"embedding\":\"gemini-embedding-001\",\"image\":\"gpt-image-1\",\"coding\":\"claude-sonnet-4-5@20250929\",\"complex_text_generation\":\"gpt-5-mini\",\"quick_decision_making\":\"gpt-5-mini\",\"quick_text_generation\":\"gemini-2.5-flash\",\"polish_and_summarize\":\"gemini-2.5-flash\"}}",
"default": {
"override_enabled": false,
"scenarios": {
@@ -979,9 +979,8 @@
"chat": "gemini-2.5-flash",
"embedding": "gemini-embedding-001",
"image": "gpt-image-1",
"rerank": "gpt-4.1",
"coding": "claude-sonnet-4-5@20250929",
"complex_text_generation": "gpt-4o-2024-08-06",
"complex_text_generation": "gpt-5-mini",
"quick_decision_making": "gpt-5-mini",
"quick_text_generation": "gemini-2.5-flash",
"polish_and_summarize": "gemini-2.5-flash"


@@ -31,10 +31,10 @@ podSecurityContext:
resources:
limits:
cpu: '1'
memory: 4Gi
memory: 6Gi
requests:
cpu: '1'
memory: 2Gi
memory: 4Gi
probe:
initialDelaySeconds: 20

.nvmrc

@@ -1 +1 @@
22.22.0
22.22.1

Cargo.lock (generated)

@@ -186,6 +186,7 @@ dependencies = [
"libwebp-sys",
"little_exif",
"llm_adapter",
"matroska",
"mimalloc",
"mp4parse",
"napi",
@@ -480,12 +481,6 @@ dependencies = [
"num-traits",
]
[[package]]
name = "atomic-waker"
version = "1.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1505bd5d3d116872e7271a6d4e16d81d0c8570876c8de68093a09ac269d8aac0"
[[package]]
name = "auto_enums"
version = "0.8.7"
@@ -504,28 +499,6 @@ version = "1.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c08606f8c3cbf4ce6ec8e28fb0014a2c086708fe954eaa885384a6165172e7e8"
[[package]]
name = "aws-lc-rs"
version = "1.16.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "94bffc006df10ac2a68c83692d734a465f8ee6c5b384d8545a636f81d858f4bf"
dependencies = [
"aws-lc-sys",
"zeroize",
]
[[package]]
name = "aws-lc-sys"
version = "0.38.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4321e568ed89bb5a7d291a7f37997c2c0df89809d7b6d12062c81ddb54aa782e"
dependencies = [
"cc",
"cmake",
"dunce",
"fs_extra",
]
[[package]]
name = "base64"
version = "0.22.1"
@@ -649,6 +622,15 @@ dependencies = [
"cfg-if",
]
[[package]]
name = "bitstream-io"
version = "3.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "680575de65ce8b916b82a447458b94a48776707d9c2681a9d8da351c06886a1f"
dependencies = [
"core2",
]
[[package]]
name = "block-buffer"
version = "0.10.4"
@@ -981,15 +963,6 @@ version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1d728cc89cf3aee9ff92b05e62b19ee65a02b5702cff7d5a377e32c6ae29d8d"
[[package]]
name = "cmake"
version = "0.1.57"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75443c44cd6b379beb8c5b45d85d0773baf31cce901fe7bb252f4eff3008ef7d"
dependencies = [
"cc",
]
[[package]]
name = "color_quant"
version = "1.1.0"
@@ -1534,12 +1507,6 @@ version = "0.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f678cf4a922c215c63e0de95eb1ff08a958a81d47e485cf9da1e27bf6305cfa5"
[[package]]
name = "dunce"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "92773504d58c093f6de2459af4af33faa518c13451eb8f2b5698ed3d36e7c813"
[[package]]
name = "ecb"
version = "0.1.2"
@@ -1771,12 +1738,6 @@ dependencies = [
"autocfg",
]
[[package]]
name = "fs_extra"
version = "1.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "42703706b716c37f96a77aea830392ad231f44c9e9a67872fa5548707e11b11c"
[[package]]
name = "futf"
version = "0.1.5"
@@ -1941,11 +1902,9 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "899def5c37c4fd7b2664648c28120ecec138e4d395b459e5ca34f9cce2dd77fd"
dependencies = [
"cfg-if",
"js-sys",
"libc",
"r-efi",
"wasip2",
"wasm-bindgen",
]
[[package]]
@@ -2138,95 +2097,12 @@ dependencies = [
"itoa",
]
[[package]]
name = "http-body"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1efedce1fb8e6913f23e0c92de8e62cd5b772a67e7b3946df930a62566c93184"
dependencies = [
"bytes",
"http",
]
[[package]]
name = "http-body-util"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b021d93e26becf5dc7e1b75b1bed1fd93124b374ceb73f43d4d4eafec896a64a"
dependencies = [
"bytes",
"futures-core",
"http",
"http-body",
"pin-project-lite",
]
[[package]]
name = "httparse"
version = "1.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6dbf3de79e51f3d586ab4cb9d5c3e2c14aa28ed23d180cf89b4df0454a69cc87"
[[package]]
name = "hyper"
version = "1.8.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2ab2d4f250c3d7b1c9fcdff1cece94ea4e2dfbec68614f7b87cb205f24ca9d11"
dependencies = [
"atomic-waker",
"bytes",
"futures-channel",
"futures-core",
"http",
"http-body",
"httparse",
"itoa",
"pin-project-lite",
"pin-utils",
"smallvec",
"tokio",
"want",
]
[[package]]
name = "hyper-rustls"
version = "0.27.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e3c93eb611681b207e1fe55d5a71ecf91572ec8a6705cdb6857f7d8d5242cf58"
dependencies = [
"http",
"hyper",
"hyper-util",
"rustls",
"rustls-pki-types",
"tokio",
"tokio-rustls",
"tower-service",
]
[[package]]
name = "hyper-util"
version = "0.1.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "96547c2556ec9d12fb1578c4eaf448b04993e7fb79cbaad930a656880a6bdfa0"
dependencies = [
"base64",
"bytes",
"futures-channel",
"futures-util",
"http",
"http-body",
"hyper",
"ipnet",
"libc",
"percent-encoding",
"pin-project-lite",
"socket2",
"tokio",
"tower-service",
"tracing",
]
[[package]]
name = "iana-time-zone"
version = "0.1.64"
@@ -2505,22 +2381,6 @@ dependencies = [
"leaky-cow",
]
[[package]]
name = "ipnet"
version = "2.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "469fb0b9cefa57e3ef31275ee7cacb78f2fdca44e4765491884a2b119d4eb130"
[[package]]
name = "iri-string"
version = "0.7.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c91338f0783edbd6195decb37bae672fd3b165faffb89bf7b9e6942f8b1a731a"
dependencies = [
"memchr",
"serde",
]
[[package]]
name = "is-terminal"
version = "0.4.17"
@@ -2813,15 +2673,15 @@ dependencies = [
[[package]]
name = "llm_adapter"
version = "0.1.1"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8dd9a548766bccf8b636695e8d514edee672d180e96a16ab932c971783b4e353"
checksum = "e98485dda5180cc89b993a001688bed93307be6bd8fedcde445b69bbca4f554d"
dependencies = [
"base64",
"reqwest",
"serde",
"serde_json",
"thiserror 2.0.17",
"ureq",
]
[[package]]
@@ -2889,12 +2749,6 @@ dependencies = [
"hashbrown 0.16.1",
]
[[package]]
name = "lru-slab"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "112b39cec0b298b6c1999fee3e31427f74f676e4cb9879ed1a121b43661a4154"
[[package]]
name = "mac"
version = "0.1.1"
@@ -2954,6 +2808,16 @@ dependencies = [
"regex-automata",
]
[[package]]
name = "matroska"
version = "0.30.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fde85cd7fb5cf875c4a46fac0cbd6567d413bea2538cef6788e3a0e52a902b45"
dependencies = [
"bitstream-io",
"phf 0.11.3",
]
[[package]]
name = "md-5"
version = "0.10.6"
@@ -3396,12 +3260,6 @@ version = "11.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d6790f58c7ff633d8771f42965289203411a5e5c68388703c06e14f24770b41e"
[[package]]
name = "openssl-probe"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c87def4c32ab89d880effc9e097653c8da5d6ef28e6b539d313baaacfbafcbe"
[[package]]
name = "ordered-float"
version = "5.1.0"
@@ -3884,62 +3742,6 @@ dependencies = [
"memchr",
]
[[package]]
name = "quinn"
version = "0.11.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b9e20a958963c291dc322d98411f541009df2ced7b5a4f2bd52337638cfccf20"
dependencies = [
"bytes",
"cfg_aliases",
"pin-project-lite",
"quinn-proto",
"quinn-udp",
"rustc-hash 2.1.1",
"rustls",
"socket2",
"thiserror 2.0.17",
"tokio",
"tracing",
"web-time",
]
[[package]]
name = "quinn-proto"
version = "0.11.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f1906b49b0c3bc04b5fe5d86a77925ae6524a19b816ae38ce1e426255f1d8a31"
dependencies = [
"aws-lc-rs",
"bytes",
"getrandom 0.3.4",
"lru-slab",
"rand 0.9.2",
"ring",
"rustc-hash 2.1.1",
"rustls",
"rustls-pki-types",
"slab",
"thiserror 2.0.17",
"tinyvec",
"tracing",
"web-time",
]
[[package]]
name = "quinn-udp"
version = "0.5.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "addec6a0dcad8a8d96a771f815f0eaf55f9d1805756410b39f5fa81332574cbd"
dependencies = [
"cfg_aliases",
"libc",
"once_cell",
"socket2",
"tracing",
"windows-sys 0.60.2",
]
[[package]]
name = "quote"
version = "1.0.43"
@@ -4128,45 +3930,6 @@ version = "0.8.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7a2d987857b319362043e95f5353c0535c1f58eec5336fdfcf626430af7def58"
[[package]]
name = "reqwest"
version = "0.13.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab3f43e3283ab1488b624b44b0e988d0acea0b3214e694730a055cb6b2efa801"
dependencies = [
"base64",
"bytes",
"futures-channel",
"futures-core",
"futures-util",
"http",
"http-body",
"http-body-util",
"hyper",
"hyper-rustls",
"hyper-util",
"js-sys",
"log",
"percent-encoding",
"pin-project-lite",
"quinn",
"rustls",
"rustls-pki-types",
"rustls-platform-verifier",
"serde",
"serde_json",
"sync_wrapper",
"tokio",
"tokio-rustls",
"tower",
"tower-http",
"tower-service",
"url",
"wasm-bindgen",
"wasm-bindgen-futures",
"web-sys",
]
[[package]]
name = "ring"
version = "0.17.14"
@@ -4307,7 +4070,7 @@ version = "0.23.36"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c665f33d38cea657d9614f766881e4d510e0eda4239891eea56b4cadcf01801b"
dependencies = [
"aws-lc-rs",
"log",
"once_cell",
"ring",
"rustls-pki-types",
@@ -4316,62 +4079,21 @@ dependencies = [
"zeroize",
]
[[package]]
name = "rustls-native-certs"
version = "0.8.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "612460d5f7bea540c490b2b6395d8e34a953e52b491accd6c86c8164c5932a63"
dependencies = [
"openssl-probe",
"rustls-pki-types",
"schannel",
"security-framework",
]
[[package]]
name = "rustls-pki-types"
version = "1.13.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "21e6f2ab2928ca4291b86736a8bd920a277a399bba1589409d72154ff87c1282"
dependencies = [
"web-time",
"zeroize",
]
[[package]]
name = "rustls-platform-verifier"
version = "0.6.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1d99feebc72bae7ab76ba994bb5e121b8d83d910ca40b36e0921f53becc41784"
dependencies = [
"core-foundation",
"core-foundation-sys",
"jni",
"log",
"once_cell",
"rustls",
"rustls-native-certs",
"rustls-platform-verifier-android",
"rustls-webpki",
"security-framework",
"security-framework-sys",
"webpki-root-certs",
"windows-sys 0.61.2",
]
[[package]]
name = "rustls-platform-verifier-android"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f87165f0995f63a9fbeea62b64d10b4d9d8e78ec6d7d51fb2125fda7bb36788f"
[[package]]
name = "rustls-webpki"
version = "0.103.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2ffdfa2f5286e2247234e03f680868ac2815974dc39e00ea15adc445d0aafe52"
dependencies = [
"aws-lc-rs",
"ring",
"rustls-pki-types",
"untrusted",
@@ -4410,15 +4132,6 @@ dependencies = [
"winapi-util",
]
[[package]]
name = "schannel"
version = "0.1.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "891d81b926048e76efe18581bf793546b4c0eaf8448d72be8de2bbee5fd166e1"
dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "scoped-tls"
version = "1.0.1"
@@ -4467,29 +4180,6 @@ dependencies = [
"syn 2.0.114",
]
[[package]]
name = "security-framework"
version = "3.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b7f4bc775c73d9a02cde8bf7b2ec4c9d12743edf609006c7facc23998404cd1d"
dependencies = [
"bitflags 2.11.0",
"core-foundation",
"core-foundation-sys",
"libc",
"security-framework-sys",
]
[[package]]
name = "security-framework-sys"
version = "2.17.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6ce2691df843ecc5d231c0b14ece2acc3efb62c0a398c7e1d875f3983ce020e3"
dependencies = [
"core-foundation-sys",
"libc",
]
[[package]]
name = "semver"
version = "1.0.27"
@@ -5215,15 +4905,6 @@ dependencies = [
"unicode-ident",
]
[[package]]
name = "sync_wrapper"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0bf256ce5efdfa370213c1dabab5935a12e49f2c58d15e9eac2870d3b4f27263"
dependencies = [
"futures-core",
]
[[package]]
name = "synstructure"
version = "0.13.2"
@@ -5415,16 +5096,6 @@ dependencies = [
"syn 2.0.114",
]
[[package]]
name = "tokio-rustls"
version = "0.26.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1729aa945f29d91ba541258c8df89027d5792d85a8841fb65e8bf0f4ede4ef61"
dependencies = [
"rustls",
"tokio",
]
[[package]]
name = "tokio-stream"
version = "0.1.18"
@@ -5475,51 +5146,6 @@ dependencies = [
"winnow",
]
[[package]]
name = "tower"
version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ebe5ef63511595f1344e2d5cfa636d973292adc0eec1f0ad45fae9f0851ab1d4"
dependencies = [
"futures-core",
"futures-util",
"pin-project-lite",
"sync_wrapper",
"tokio",
"tower-layer",
"tower-service",
]
[[package]]
name = "tower-http"
version = "0.6.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d4e6559d53cc268e5031cd8429d05415bc4cb4aefc4aa5d6cc35fbf5b924a1f8"
dependencies = [
"bitflags 2.11.0",
"bytes",
"futures-util",
"http",
"http-body",
"iri-string",
"pin-project-lite",
"tower",
"tower-layer",
"tower-service",
]
[[package]]
name = "tower-layer"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "121c2a6cda46980bb0fcd1647ffaf6cd3fc79a013de288782836f6df9c48780e"
[[package]]
name = "tower-service"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8df9b6e13f2d32c91b9bd719c00d1958837bc7dec474d94952798cc8e69eeec3"
[[package]]
name = "tracing"
version = "0.1.44"
@@ -5722,12 +5348,6 @@ dependencies = [
"tree-sitter-language",
]
[[package]]
name = "try-lock"
version = "0.2.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e421abadd41a4225275504ea4d6566923418b7f05506fbc9c0fe86ba7396114b"
[[package]]
name = "type1-encoding-parser"
version = "0.1.0"
@@ -5952,6 +5572,35 @@ version = "0.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6d49784317cd0d1ee7ec5c716dd598ec5b4483ea832a2dced265471cc0f690ae"
[[package]]
name = "ureq"
version = "3.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fdc97a28575b85cfedf2a7e7d3cc64b3e11bd8ac766666318003abbacc7a21fc"
dependencies = [
"base64",
"flate2",
"log",
"percent-encoding",
"rustls",
"rustls-pki-types",
"ureq-proto",
"utf-8",
"webpki-roots 1.0.5",
]
[[package]]
name = "ureq-proto"
version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d81f9efa9df032be5934a46a068815a10a042b494b6a58cb0a1a97bb5467ed6f"
dependencies = [
"base64",
"http",
"httparse",
"log",
]
[[package]]
name = "url"
version = "2.5.8"
@@ -6048,15 +5697,6 @@ dependencies = [
"winapi-util",
]
[[package]]
name = "want"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bfa7760aed19e106de2c7c0b581b509f2f25d3dacaf737cb82ac61bc6d760b0e"
dependencies = [
"try-lock",
]
[[package]]
name = "wasi"
version = "0.11.1+wasi-snapshot-preview1"
@@ -6146,25 +5786,6 @@ dependencies = [
"wasm-bindgen",
]
[[package]]
name = "web-time"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a6580f308b1fad9207618087a65c04e7a10bc77e02c8e84e9b00dd4b12fa0bb"
dependencies = [
"js-sys",
"wasm-bindgen",
]
[[package]]
name = "webpki-root-certs"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "804f18a4ac2676ffb4e8b5b5fa9ae38af06df08162314f96a68d2a363e21a8ca"
dependencies = [
"rustls-pki-types",
]
[[package]]
name = "webpki-roots"
version = "0.26.11"

View File

@@ -53,10 +53,11 @@ resolver = "3"
libc = "0.2"
libwebp-sys = "0.14.2"
little_exif = "0.6.23"
llm_adapter = "0.1.1"
llm_adapter = { version = "0.1.3", default-features = false }
log = "0.4"
loom = { version = "0.7", features = ["checkpoint"] }
lru = "0.16"
matroska = "0.30"
memory-indexer = "0.3.0"
mimalloc = "0.1"
mp4parse = "0.17"

View File

@@ -22,6 +22,7 @@ import {
FrameBlockModel,
ImageBlockModel,
isExternalEmbedModel,
MindmapElementModel,
NoteBlockModel,
ParagraphBlockModel,
} from '@blocksuite/affine-model';
@@ -401,7 +402,17 @@ function reorderElements(
) {
if (!models.length) return;
for (const model of models) {
const normalizedModels = Array.from(
new Map(
models.map(model => {
const reorderTarget =
model.group instanceof MindmapElementModel ? model.group : model;
return [reorderTarget.id, reorderTarget];
})
).values()
);
for (const model of normalizedModels) {
const index = ctx.gfx.layer.getReorderedIndex(model, type);
// block should be updated in transaction

View File

@@ -2,16 +2,24 @@ import { type Color, ColorScheme } from '@blocksuite/affine-model';
import { FeatureFlagService } from '@blocksuite/affine-shared/services';
import { requestConnectedFrame } from '@blocksuite/affine-shared/utils';
import { DisposableGroup } from '@blocksuite/global/disposable';
import type { IBound } from '@blocksuite/global/gfx';
import { getBoundWithRotation, intersects } from '@blocksuite/global/gfx';
import {
Bound,
getBoundWithRotation,
type IBound,
intersects,
} from '@blocksuite/global/gfx';
import type { BlockStdScope } from '@blocksuite/std';
import type {
GfxCompatibleInterface,
GfxController,
GfxLocalElementModel,
GridManager,
LayerManager,
SurfaceBlockModel,
Viewport,
} from '@blocksuite/std/gfx';
import { GfxControllerIdentifier } from '@blocksuite/std/gfx';
import { effect } from '@preact/signals-core';
import last from 'lodash-es/last';
import { Subject } from 'rxjs';
@@ -40,11 +48,82 @@ type RendererOptions = {
surfaceModel: SurfaceBlockModel;
};
export type CanvasRenderPassMetrics = {
overlayCount: number;
placeholderElementCount: number;
renderByBoundCallCount: number;
renderedElementCount: number;
visibleElementCount: number;
};
export type CanvasMemorySnapshot = {
bytes: number;
datasetLayerId: string | null;
height: number;
kind: 'main' | 'stacking';
width: number;
zIndex: string;
};
export type CanvasRendererDebugMetrics = {
canvasLayerCount: number;
canvasMemoryBytes: number;
canvasMemorySnapshots: CanvasMemorySnapshot[];
canvasMemoryMegabytes: number;
canvasPixelCount: number;
coalescedRefreshCount: number;
dirtyLayerRenderCount: number;
fallbackElementCount: number;
lastRenderDurationMs: number;
lastRenderMetrics: CanvasRenderPassMetrics;
maxRenderDurationMs: number;
pooledStackingCanvasCount: number;
refreshCount: number;
renderCount: number;
stackingCanvasCount: number;
totalLayerCount: number;
totalRenderDurationMs: number;
visibleStackingCanvasCount: number;
};
type MutableCanvasRendererDebugMetrics = Omit<
CanvasRendererDebugMetrics,
| 'canvasLayerCount'
| 'canvasMemoryBytes'
| 'canvasMemoryMegabytes'
| 'canvasPixelCount'
| 'canvasMemorySnapshots'
| 'pooledStackingCanvasCount'
| 'stackingCanvasCount'
| 'totalLayerCount'
| 'visibleStackingCanvasCount'
>;
type RenderPassStats = CanvasRenderPassMetrics;
type StackingCanvasState = {
bound: Bound | null;
layerId: string | null;
};
type RefreshTarget =
| { type: 'all' }
| { type: 'main' }
| { type: 'element'; element: SurfaceElementModel | GfxLocalElementModel }
| {
type: 'elements';
elements: Array<SurfaceElementModel | GfxLocalElementModel>;
};
const STACKING_CANVAS_PADDING = 32;
export class CanvasRenderer {
private _container!: HTMLElement;
private readonly _disposables = new DisposableGroup();
private readonly _gfx: GfxController;
private readonly _turboEnabled: () => boolean;
private readonly _overlays = new Set<Overlay>();
@@ -53,6 +132,37 @@ export class CanvasRenderer {
private _stackingCanvas: HTMLCanvasElement[] = [];
private readonly _stackingCanvasPool: HTMLCanvasElement[] = [];
private readonly _stackingCanvasState = new WeakMap<
HTMLCanvasElement,
StackingCanvasState
>();
private readonly _dirtyStackingCanvasIndexes = new Set<number>();
private _mainCanvasDirty = true;
private _needsFullRender = true;
private _debugMetrics: MutableCanvasRendererDebugMetrics = {
refreshCount: 0,
coalescedRefreshCount: 0,
renderCount: 0,
totalRenderDurationMs: 0,
lastRenderDurationMs: 0,
maxRenderDurationMs: 0,
lastRenderMetrics: {
renderByBoundCallCount: 0,
visibleElementCount: 0,
renderedElementCount: 0,
placeholderElementCount: 0,
overlayCount: 0,
},
dirtyLayerRenderCount: 0,
fallbackElementCount: 0,
};
canvas: HTMLCanvasElement;
ctx: CanvasRenderingContext2D;
@@ -89,6 +199,7 @@ export class CanvasRenderer {
this.layerManager = options.layerManager;
this.grid = options.gridManager;
this.provider = options.provider ?? {};
this._gfx = this.std.get(GfxControllerIdentifier);
this._turboEnabled = () => {
const featureFlagService = options.std.get(FeatureFlagService);
@@ -132,15 +243,199 @@ export class CanvasRenderer {
};
}
private _applyStackingCanvasLayout(
canvas: HTMLCanvasElement,
bound: Bound | null,
dpr = window.devicePixelRatio
) {
const state =
this._stackingCanvasState.get(canvas) ??
({
bound: null,
layerId: canvas.dataset.layerId ?? null,
} satisfies StackingCanvasState);
if (!bound || bound.w <= 0 || bound.h <= 0) {
canvas.style.display = 'none';
canvas.style.left = '0px';
canvas.style.top = '0px';
canvas.style.width = '0px';
canvas.style.height = '0px';
canvas.style.transform = '';
canvas.width = 0;
canvas.height = 0;
state.bound = null;
state.layerId = canvas.dataset.layerId ?? null;
this._stackingCanvasState.set(canvas, state);
return;
}
const { viewportBounds, zoom, viewScale } = this.viewport;
const width = bound.w * zoom;
const height = bound.h * zoom;
const left = (bound.x - viewportBounds.x) * zoom;
const top = (bound.y - viewportBounds.y) * zoom;
const actualWidth = Math.max(1, Math.ceil(width * dpr));
const actualHeight = Math.max(1, Math.ceil(height * dpr));
const transform = `translate(${left}px, ${top}px) scale(${1 / viewScale})`;
if (canvas.style.display !== 'block') {
canvas.style.display = 'block';
}
if (canvas.style.left !== '0px') {
canvas.style.left = '0px';
}
if (canvas.style.top !== '0px') {
canvas.style.top = '0px';
}
if (canvas.style.width !== `${width}px`) {
canvas.style.width = `${width}px`;
}
if (canvas.style.height !== `${height}px`) {
canvas.style.height = `${height}px`;
}
if (canvas.style.transform !== transform) {
canvas.style.transform = transform;
}
if (canvas.style.transformOrigin !== 'top left') {
canvas.style.transformOrigin = 'top left';
}
if (canvas.width !== actualWidth) {
canvas.width = actualWidth;
}
if (canvas.height !== actualHeight) {
canvas.height = actualHeight;
}
state.bound = bound;
state.layerId = canvas.dataset.layerId ?? null;
this._stackingCanvasState.set(canvas, state);
}
private _clampBoundToViewport(bound: Bound, viewportBounds: Bound) {
const minX = Math.max(bound.x, viewportBounds.x);
const minY = Math.max(bound.y, viewportBounds.y);
const maxX = Math.min(bound.maxX, viewportBounds.maxX);
const maxY = Math.min(bound.maxY, viewportBounds.maxY);
if (maxX <= minX || maxY <= minY) {
return null;
}
return new Bound(minX, minY, maxX - minX, maxY - minY);
}
private _createCanvasForLayer(
onCreated?: (canvas: HTMLCanvasElement) => void
) {
const reused = this._stackingCanvasPool.pop();
if (reused) {
return reused;
}
const created = document.createElement('canvas');
onCreated?.(created);
return created;
}
private _findLayerIndexByElement(
element: SurfaceElementModel | GfxLocalElementModel
) {
const canvasLayers = this.layerManager.getCanvasLayers();
const index = canvasLayers.findIndex(layer =>
layer.elements.some(layerElement => layerElement.id === element.id)
);
return index === -1 ? null : index;
}
private _getLayerRenderBound(
elements: SurfaceElementModel[],
viewportBounds: Bound
) {
let layerBound: Bound | null = null;
for (const element of elements) {
const display = (element.display ?? true) && !element.hidden;
if (!display) {
continue;
}
const elementBound = Bound.from(getBoundWithRotation(element));
if (!intersects(elementBound, viewportBounds)) {
continue;
}
layerBound = layerBound ? layerBound.unite(elementBound) : elementBound;
}
if (!layerBound) {
return null;
}
return this._clampBoundToViewport(
layerBound.expand(STACKING_CANVAS_PADDING),
viewportBounds
);
}
private _getResolvedStackingCanvasBound(
canvas: HTMLCanvasElement,
bound: Bound | null
) {
if (!bound || !this._gfx.tool.dragging$.peek()) {
return bound;
}
const previousBound = this._stackingCanvasState.get(canvas)?.bound;
return previousBound ? previousBound.unite(bound) : bound;
}
private _invalidate(target: RefreshTarget = { type: 'all' }) {
if (target.type === 'all') {
this._needsFullRender = true;
this._mainCanvasDirty = true;
this._dirtyStackingCanvasIndexes.clear();
return;
}
if (this._needsFullRender) {
return;
}
if (target.type === 'main') {
this._mainCanvasDirty = true;
return;
}
const elements =
target.type === 'element' ? [target.element] : target.elements;
for (const element of elements) {
const layerIndex = this._findLayerIndexByElement(element);
if (layerIndex === null || layerIndex >= this._stackingCanvas.length) {
this._mainCanvasDirty = true;
continue;
}
this._dirtyStackingCanvasIndexes.add(layerIndex);
}
}
private _resetPooledCanvas(canvas: HTMLCanvasElement) {
canvas.dataset.layerId = '';
this._applyStackingCanvasLayout(canvas, null);
}
private _initStackingCanvas(onCreated?: (canvas: HTMLCanvasElement) => void) {
const layer = this.layerManager;
const updateStackingCanvasSize = (canvases: HTMLCanvasElement[]) => {
this._stackingCanvas = canvases;
const sizeUpdater = this._canvasSizeUpdater();
canvases.filter(sizeUpdater.filter).forEach(sizeUpdater.update);
};
const updateStackingCanvas = () => {
/**
* we already have a main canvas, so the last layer should be skipped
@@ -159,11 +454,7 @@ export class CanvasRenderer {
const created = i < currentCanvases.length;
const canvas = created
? currentCanvases[i]
: document.createElement('canvas');
if (!created) {
onCreated?.(canvas);
}
: this._createCanvasForLayer(onCreated);
canvas.dataset.layerId = `[${layer.indexes[0]}--${layer.indexes[1]}]`;
canvas.style.zIndex = layer.zIndex.toString();
@@ -171,7 +462,6 @@ export class CanvasRenderer {
}
this._stackingCanvas = canvases;
updateStackingCanvasSize(canvases);
if (currentCanvases.length !== canvases.length) {
const diff = canvases.length - currentCanvases.length;
@@ -189,12 +479,16 @@ export class CanvasRenderer {
payload.added = canvases.slice(-diff);
} else {
payload.removed = currentCanvases.slice(diff);
payload.removed.forEach(canvas => {
this._resetPooledCanvas(canvas);
this._stackingCanvasPool.push(canvas);
});
}
this.stackingCanvasUpdated.next(payload);
}
this.refresh();
this.refresh({ type: 'all' });
};
this._disposables.add(
@@ -211,7 +505,7 @@ export class CanvasRenderer {
this._disposables.add(
this.viewport.viewportUpdated.subscribe(() => {
this.refresh();
this.refresh({ type: 'all' });
})
);
@@ -222,7 +516,6 @@ export class CanvasRenderer {
sizeUpdatedRafId = null;
this._resetSize();
this._render();
this.refresh();
}, this._container);
})
);
@@ -233,69 +526,212 @@ export class CanvasRenderer {
if (this.usePlaceholder !== shouldRenderPlaceholders) {
this.usePlaceholder = shouldRenderPlaceholders;
this.refresh();
this.refresh({ type: 'all' });
}
})
);
let wasDragging = false;
this._disposables.add(
effect(() => {
const isDragging = this._gfx.tool.dragging$.value;
if (wasDragging && !isDragging) {
this.refresh({ type: 'all' });
}
wasDragging = isDragging;
})
);
this.usePlaceholder = false;
}
private _createRenderPassStats(): RenderPassStats {
return {
renderByBoundCallCount: 0,
visibleElementCount: 0,
renderedElementCount: 0,
placeholderElementCount: 0,
overlayCount: 0,
};
}
private _getCanvasMemorySnapshots(): CanvasMemorySnapshot[] {
return [this.canvas, ...this._stackingCanvas].map((canvas, index) => {
return {
kind: index === 0 ? 'main' : 'stacking',
width: canvas.width,
height: canvas.height,
bytes: canvas.width * canvas.height * 4,
zIndex: canvas.style.zIndex,
datasetLayerId: canvas.dataset.layerId ?? null,
};
});
}
private _render() {
const renderStart = performance.now();
const { viewportBounds, zoom } = this.viewport;
const { ctx } = this;
const dpr = window.devicePixelRatio;
const scale = zoom * dpr;
const matrix = new DOMMatrix().scaleSelf(scale);
const renderStats = this._createRenderPassStats();
const fullRender = this._needsFullRender;
const stackingIndexesToRender = fullRender
? this._stackingCanvas.map((_, idx) => idx)
: [...this._dirtyStackingCanvasIndexes];
/**
* if a layer does not have a corresponding canvas,
* its elements will be added to this array and drawn on the
* main canvas
*/
let fallbackElement: SurfaceElementModel[] = [];
const allCanvasLayers = this.layerManager.getCanvasLayers();
const viewportBound = Bound.from(viewportBounds);
this.layerManager.getCanvasLayers().forEach((layer, idx) => {
if (!this._stackingCanvas[idx]) {
fallbackElement = fallbackElement.concat(layer.elements);
return;
for (const idx of stackingIndexesToRender) {
const layer = allCanvasLayers[idx];
const canvas = this._stackingCanvas[idx];
if (!layer || !canvas) {
continue;
}
const canvas = this._stackingCanvas[idx];
const ctx = canvas.getContext('2d') as CanvasRenderingContext2D;
const rc = new RoughCanvas(ctx.canvas);
const layerRenderBound = this._getLayerRenderBound(
layer.elements,
viewportBound
);
const resolvedLayerRenderBound = this._getResolvedStackingCanvasBound(
canvas,
layerRenderBound
);
ctx.clearRect(0, 0, canvas.width, canvas.height);
this._applyStackingCanvasLayout(canvas, resolvedLayerRenderBound);
if (
!resolvedLayerRenderBound ||
canvas.width === 0 ||
canvas.height === 0
) {
continue;
}
const layerCtx = canvas.getContext('2d') as CanvasRenderingContext2D;
const layerRc = new RoughCanvas(layerCtx.canvas);
layerCtx.clearRect(0, 0, canvas.width, canvas.height);
layerCtx.save();
layerCtx.setTransform(matrix);
this._renderByBound(
layerCtx,
matrix,
layerRc,
resolvedLayerRenderBound,
layer.elements,
false,
renderStats
);
}
if (fullRender || this._mainCanvasDirty) {
allCanvasLayers.forEach((layer, idx) => {
if (!this._stackingCanvas[idx]) {
fallbackElement = fallbackElement.concat(layer.elements);
}
});
ctx.clearRect(0, 0, this.canvas.width, this.canvas.height);
ctx.save();
ctx.setTransform(matrix);
this._renderByBound(ctx, matrix, rc, viewportBounds, layer.elements);
});
this._renderByBound(
ctx,
matrix,
new RoughCanvas(ctx.canvas),
viewportBounds,
fallbackElement,
true,
renderStats
);
}
ctx.clearRect(0, 0, this.canvas.width, this.canvas.height);
ctx.save();
ctx.setTransform(matrix);
this._renderByBound(
ctx,
matrix,
new RoughCanvas(ctx.canvas),
viewportBounds,
fallbackElement,
true
const canvasMemorySnapshots = this._getCanvasMemorySnapshots();
const canvasMemoryBytes = canvasMemorySnapshots.reduce(
(sum, snapshot) => sum + snapshot.bytes,
0
);
const layerTypes = this.layerManager.layers.map(layer => layer.type);
const renderDurationMs = performance.now() - renderStart;
this._debugMetrics.renderCount += 1;
this._debugMetrics.totalRenderDurationMs += renderDurationMs;
this._debugMetrics.lastRenderDurationMs = renderDurationMs;
this._debugMetrics.maxRenderDurationMs = Math.max(
this._debugMetrics.maxRenderDurationMs,
renderDurationMs
);
this._debugMetrics.lastRenderMetrics = renderStats;
this._debugMetrics.fallbackElementCount = fallbackElement.length;
this._debugMetrics.dirtyLayerRenderCount = stackingIndexesToRender.length;
this._lastDebugSnapshot = {
canvasMemorySnapshots,
canvasMemoryBytes,
canvasPixelCount: canvasMemorySnapshots.reduce(
(sum, snapshot) => sum + snapshot.width * snapshot.height,
0
),
stackingCanvasCount: this._stackingCanvas.length,
canvasLayerCount: layerTypes.filter(type => type === 'canvas').length,
totalLayerCount: layerTypes.length,
pooledStackingCanvasCount: this._stackingCanvasPool.length,
visibleStackingCanvasCount: this._stackingCanvas.filter(
canvas => canvas.width > 0 && canvas.height > 0
).length,
};
this._needsFullRender = false;
this._mainCanvasDirty = false;
this._dirtyStackingCanvasIndexes.clear();
}
private _lastDebugSnapshot: Pick<
CanvasRendererDebugMetrics,
| 'canvasMemoryBytes'
| 'canvasMemorySnapshots'
| 'canvasPixelCount'
| 'canvasLayerCount'
| 'pooledStackingCanvasCount'
| 'stackingCanvasCount'
| 'totalLayerCount'
| 'visibleStackingCanvasCount'
> = {
canvasMemoryBytes: 0,
canvasMemorySnapshots: [],
canvasPixelCount: 0,
canvasLayerCount: 0,
pooledStackingCanvasCount: 0,
stackingCanvasCount: 0,
totalLayerCount: 0,
visibleStackingCanvasCount: 0,
};
private _renderByBound(
ctx: CanvasRenderingContext2D | null,
matrix: DOMMatrix,
rc: RoughCanvas,
bound: IBound,
surfaceElements?: SurfaceElementModel[],
overLay: boolean = false
overLay: boolean = false,
renderStats?: RenderPassStats
) {
if (!ctx) return;
renderStats && (renderStats.renderByBoundCallCount += 1);
const elements =
surfaceElements ??
(this.grid.search(bound, {
@@ -305,10 +741,12 @@ export class CanvasRenderer {
for (const element of elements) {
const display = (element.display ?? true) && !element.hidden;
if (display && intersects(getBoundWithRotation(element), bound)) {
renderStats && (renderStats.visibleElementCount += 1);
if (
this.usePlaceholder &&
!(element as GfxCompatibleInterface).forceFullRender
) {
renderStats && (renderStats.placeholderElementCount += 1);
ctx.save();
ctx.fillStyle = 'rgba(200, 200, 200, 0.5)';
const drawX = element.x - bound.x;
@@ -316,6 +754,7 @@ export class CanvasRenderer {
ctx.fillRect(drawX, drawY, element.w, element.h);
ctx.restore();
} else {
renderStats && (renderStats.renderedElementCount += 1);
ctx.save();
const renderFn = this.std.getOptional<ElementRenderer>(
ElementRendererIdentifier(element.type)
@@ -333,6 +772,7 @@ export class CanvasRenderer {
}
if (overLay) {
renderStats && (renderStats.overlayCount += this._overlays.size);
for (const overlay of this._overlays) {
ctx.save();
ctx.translate(-bound.x, -bound.y);
@@ -348,33 +788,38 @@ export class CanvasRenderer {
const sizeUpdater = this._canvasSizeUpdater();
sizeUpdater.update(this.canvas);
this._stackingCanvas.forEach(sizeUpdater.update);
this.refresh();
this._invalidate({ type: 'all' });
}
private _watchSurface(surfaceModel: SurfaceBlockModel) {
this._disposables.add(
surfaceModel.elementAdded.subscribe(() => this.refresh())
surfaceModel.elementAdded.subscribe(() => this.refresh({ type: 'all' }))
);
this._disposables.add(
surfaceModel.elementRemoved.subscribe(() => this.refresh())
surfaceModel.elementRemoved.subscribe(() => this.refresh({ type: 'all' }))
);
this._disposables.add(
surfaceModel.localElementAdded.subscribe(() => this.refresh())
surfaceModel.localElementAdded.subscribe(() =>
this.refresh({ type: 'all' })
)
);
this._disposables.add(
surfaceModel.localElementDeleted.subscribe(() => this.refresh())
surfaceModel.localElementDeleted.subscribe(() =>
this.refresh({ type: 'all' })
)
);
this._disposables.add(
surfaceModel.localElementUpdated.subscribe(() => this.refresh())
surfaceModel.localElementUpdated.subscribe(({ model }) => {
this.refresh({ type: 'element', element: model });
})
);
this._disposables.add(
surfaceModel.elementUpdated.subscribe(payload => {
// ignore externalXYWH updates because they are applied by the renderer
if (payload.props['externalXYWH']) return;
this.refresh();
const element = surfaceModel.getElementById(payload.id);
this.refresh(element ? { type: 'element', element } : { type: 'all' });
})
);
}
@@ -382,7 +827,7 @@ export class CanvasRenderer {
addOverlay(overlay: Overlay) {
overlay.setRenderer(this);
this._overlays.add(overlay);
this.refresh();
this.refresh({ type: 'main' });
}
/**
@@ -394,7 +839,7 @@ export class CanvasRenderer {
container.append(this.canvas);
this._resetSize();
this.refresh();
this.refresh({ type: 'all' });
}
dispose(): void {
@@ -453,8 +898,46 @@ export class CanvasRenderer {
return this.provider.getPropertyValue?.(property) ?? '';
}
refresh() {
if (this._refreshRafId !== null) return;
getDebugMetrics(): CanvasRendererDebugMetrics {
return {
...this._debugMetrics,
...this._lastDebugSnapshot,
canvasMemoryMegabytes:
this._lastDebugSnapshot.canvasMemoryBytes / 1024 / 1024,
};
}
resetDebugMetrics() {
this._debugMetrics = {
refreshCount: 0,
coalescedRefreshCount: 0,
renderCount: 0,
totalRenderDurationMs: 0,
lastRenderDurationMs: 0,
maxRenderDurationMs: 0,
lastRenderMetrics: this._createRenderPassStats(),
dirtyLayerRenderCount: 0,
fallbackElementCount: 0,
};
this._lastDebugSnapshot = {
canvasMemoryBytes: 0,
canvasMemorySnapshots: [],
canvasPixelCount: 0,
canvasLayerCount: 0,
pooledStackingCanvasCount: 0,
stackingCanvasCount: 0,
totalLayerCount: 0,
visibleStackingCanvasCount: 0,
};
}
refresh(target: RefreshTarget = { type: 'all' }) {
this._debugMetrics.refreshCount += 1;
this._invalidate(target);
if (this._refreshRafId !== null) {
this._debugMetrics.coalescedRefreshCount += 1;
return;
}
this._refreshRafId = requestConnectedFrame(() => {
this._refreshRafId = null;
@@ -469,6 +952,6 @@ export class CanvasRenderer {
overlay.setRenderer(null);
this._overlays.delete(overlay);
this.refresh();
this.refresh({ type: 'main' });
}
}

View File

@@ -354,30 +354,37 @@ export class DomRenderer {
this._disposables.add(
surfaceModel.elementAdded.subscribe(payload => {
this._markElementDirty(payload.id, UpdateType.ELEMENT_ADDED);
this._markViewportDirty();
this.refresh();
})
);
this._disposables.add(
surfaceModel.elementRemoved.subscribe(payload => {
this._markElementDirty(payload.id, UpdateType.ELEMENT_REMOVED);
this._markViewportDirty();
this.refresh();
})
);
this._disposables.add(
surfaceModel.localElementAdded.subscribe(payload => {
this._markElementDirty(payload.id, UpdateType.ELEMENT_ADDED);
this._markViewportDirty();
this.refresh();
})
);
this._disposables.add(
surfaceModel.localElementDeleted.subscribe(payload => {
this._markElementDirty(payload.id, UpdateType.ELEMENT_REMOVED);
this._markViewportDirty();
this.refresh();
})
);
this._disposables.add(
surfaceModel.localElementUpdated.subscribe(payload => {
this._markElementDirty(payload.model.id, UpdateType.ELEMENT_UPDATED);
if (payload.props['index'] || payload.props['groupId']) {
this._markViewportDirty();
}
this.refresh();
})
);
@@ -387,6 +394,9 @@ export class DomRenderer {
// ignore externalXYWH updates because they are applied by the renderer
if (payload.props['externalXYWH']) return;
this._markElementDirty(payload.id, UpdateType.ELEMENT_UPDATED);
if (payload.props['index'] || payload.props['childIds']) {
this._markViewportDirty();
}
this.refresh();
})
);

View File

@@ -5,6 +5,8 @@ import {
import type { BrushElementModel } from '@blocksuite/affine-model';
import { DefaultTheme } from '@blocksuite/affine-model';
import { renderBrushLikeDom } from './shared';
export const BrushDomRendererExtension = DomElementRendererExtension(
'brush',
(
@@ -12,58 +14,11 @@ export const BrushDomRendererExtension = DomElementRendererExtension(
domElement: HTMLElement,
renderer: DomRenderer
) => {
const { zoom } = renderer.viewport;
const [, , w, h] = model.deserializedXYWH;
// Early return if invalid dimensions
if (w <= 0 || h <= 0) {
return;
}
// Early return if no commands
if (!model.commands) {
return;
}
// Clear previous content
domElement.innerHTML = '';
// Get color value
const color = renderer.getColorValue(model.color, DefaultTheme.black, true);
// Create SVG element
const svg = document.createElementNS('http://www.w3.org/2000/svg', 'svg');
svg.style.position = 'absolute';
svg.style.left = '0';
svg.style.top = '0';
svg.style.width = `${w * zoom}px`;
svg.style.height = `${h * zoom}px`;
svg.style.overflow = 'visible';
svg.style.pointerEvents = 'none';
svg.setAttribute('viewBox', `0 0 ${w} ${h}`);
// Apply rotation transform
if (model.rotate !== 0) {
svg.style.transform = `rotate(${model.rotate}deg)`;
svg.style.transformOrigin = 'center';
}
// Create path element for the brush stroke
const pathElement = document.createElementNS(
'http://www.w3.org/2000/svg',
'path'
);
pathElement.setAttribute('d', model.commands);
pathElement.setAttribute('fill', color);
pathElement.setAttribute('stroke', 'none');
svg.append(pathElement);
domElement.replaceChildren(svg);
// Set element size and position
domElement.style.width = `${w * zoom}px`;
domElement.style.height = `${h * zoom}px`;
domElement.style.overflow = 'visible';
domElement.style.pointerEvents = 'none';
renderBrushLikeDom({
model,
domElement,
renderer,
color: renderer.getColorValue(model.color, DefaultTheme.black, true),
});
}
);


@@ -5,6 +5,8 @@ import {
import type { HighlighterElementModel } from '@blocksuite/affine-model';
import { DefaultTheme } from '@blocksuite/affine-model';
import { renderBrushLikeDom } from './shared';
export const HighlighterDomRendererExtension = DomElementRendererExtension(
'highlighter',
(
@@ -12,62 +14,15 @@ export const HighlighterDomRendererExtension = DomElementRendererExtension(
domElement: HTMLElement,
renderer: DomRenderer
) => {
const { zoom } = renderer.viewport;
const [, , w, h] = model.deserializedXYWH;
// Early return if invalid dimensions
if (w <= 0 || h <= 0) {
return;
}
// Early return if no commands
if (!model.commands) {
return;
}
// Clear previous content
domElement.innerHTML = '';
// Get color value
const color = renderer.getColorValue(
model.color,
DefaultTheme.hightlighterColor,
true
);
// Create SVG element
const svg = document.createElementNS('http://www.w3.org/2000/svg', 'svg');
svg.style.position = 'absolute';
svg.style.left = '0';
svg.style.top = '0';
svg.style.width = `${w * zoom}px`;
svg.style.height = `${h * zoom}px`;
svg.style.overflow = 'visible';
svg.style.pointerEvents = 'none';
svg.setAttribute('viewBox', `0 0 ${w} ${h}`);
// Apply rotation transform
if (model.rotate !== 0) {
svg.style.transform = `rotate(${model.rotate}deg)`;
svg.style.transformOrigin = 'center';
}
// Create path element for the highlighter stroke
const pathElement = document.createElementNS(
'http://www.w3.org/2000/svg',
'path'
);
pathElement.setAttribute('d', model.commands);
pathElement.setAttribute('fill', color);
pathElement.setAttribute('stroke', 'none');
svg.append(pathElement);
domElement.replaceChildren(svg);
// Set element size and position
domElement.style.width = `${w * zoom}px`;
domElement.style.height = `${h * zoom}px`;
domElement.style.overflow = 'visible';
domElement.style.pointerEvents = 'none';
renderBrushLikeDom({
model,
domElement,
renderer,
color: renderer.getColorValue(
model.color,
DefaultTheme.hightlighterColor,
true
),
});
}
);


@@ -0,0 +1,82 @@
import type { DomRenderer } from '@blocksuite/affine-block-surface';
import type {
BrushElementModel,
HighlighterElementModel,
} from '@blocksuite/affine-model';
const SVG_NS = 'http://www.w3.org/2000/svg';
type BrushLikeModel = BrushElementModel | HighlighterElementModel;
type RetainedBrushDom = {
path: SVGPathElement;
svg: SVGSVGElement;
};
const retainedBrushDom = new WeakMap<HTMLElement, RetainedBrushDom>();
function clearBrushLikeDom(domElement: HTMLElement) {
retainedBrushDom.delete(domElement);
domElement.replaceChildren();
}
function getRetainedBrushDom(domElement: HTMLElement) {
const existing = retainedBrushDom.get(domElement);
if (existing) {
return existing;
}
const svg = document.createElementNS(SVG_NS, 'svg');
svg.style.position = 'absolute';
svg.style.left = '0';
svg.style.top = '0';
svg.style.overflow = 'visible';
svg.style.pointerEvents = 'none';
const path = document.createElementNS(SVG_NS, 'path');
path.setAttribute('stroke', 'none');
svg.append(path);
const retained = { svg, path };
retainedBrushDom.set(domElement, retained);
domElement.replaceChildren(svg);
return retained;
}
export function renderBrushLikeDom({
color,
domElement,
model,
renderer,
}: {
color: string;
domElement: HTMLElement;
model: BrushLikeModel;
renderer: DomRenderer;
}) {
const { zoom } = renderer.viewport;
const [, , w, h] = model.deserializedXYWH;
if (w <= 0 || h <= 0 || !model.commands) {
clearBrushLikeDom(domElement);
return;
}
const { path, svg } = getRetainedBrushDom(domElement);
svg.style.width = `${w * zoom}px`;
svg.style.height = `${h * zoom}px`;
svg.style.transform = model.rotate === 0 ? '' : `rotate(${model.rotate}deg)`;
svg.style.transformOrigin = model.rotate === 0 ? '' : 'center';
svg.setAttribute('viewBox', `0 0 ${w} ${h}`);
path.setAttribute('d', model.commands);
path.setAttribute('fill', color);
domElement.style.width = `${w * zoom}px`;
domElement.style.height = `${h * zoom}px`;
domElement.style.overflow = 'visible';
domElement.style.pointerEvents = 'none';
}

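The retained-DOM approach in `renderBrushLikeDom` (and the connector/shape renderers below) keys per-host child structures in a `WeakMap`, so re-renders mutate attributes in place instead of rebuilding the subtree. A minimal sketch with the DOM stubbed out; `Retained` here is an illustrative stand-in, not the real type:

```typescript
// Retained-node sketch: build the subtree once per host, reuse it on later
// renders, and let WeakMap entries vanish when the host is collected.
type Retained = { attrs: Record<string, string> };
const retainedNodes = new WeakMap<object, Retained>();

function getRetained(host: object): Retained {
  let entry = retainedNodes.get(host);
  if (!entry) {
    entry = { attrs: {} }; // build once
    retainedNodes.set(host, entry);
  }
  return entry; // subsequent renders reuse the same structure
}

function clearRetained(host: object) {
  retainedNodes.delete(host); // next render rebuilds from scratch
}

const host = {};
const first = getRetained(host);
first.attrs['d'] = 'M 0 0 L 10 10';
const second = getRetained(host); // same entry: update attributes, no rebuild
```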

@@ -14,6 +14,8 @@ import { PointLocation, SVGPathBuilder } from '@blocksuite/global/gfx';
import { isConnectorWithLabel } from '../connector-manager';
import { DEFAULT_ARROW_SIZE } from './utils';
const SVG_NS = 'http://www.w3.org/2000/svg';
interface PathBounds {
minX: number;
minY: number;
@@ -21,6 +23,15 @@ interface PathBounds {
maxY: number;
}
type RetainedConnectorDom = {
defs: SVGDefsElement;
label: HTMLDivElement | null;
path: SVGPathElement;
svg: SVGSVGElement;
};
const retainedConnectorDom = new WeakMap<HTMLElement, RetainedConnectorDom>();
function calculatePathBounds(path: PointLocation[]): PathBounds {
if (path.length === 0) {
return { minX: 0, minY: 0, maxX: 0, maxY: 0 };
@@ -81,10 +92,7 @@ function createArrowMarker(
strokeWidth: number,
isStart: boolean = false
): SVGMarkerElement {
const marker = document.createElementNS(
'http://www.w3.org/2000/svg',
'marker'
);
const marker = document.createElementNS(SVG_NS, 'marker');
const size = DEFAULT_ARROW_SIZE * (strokeWidth / 2);
marker.id = id;
@@ -98,10 +106,7 @@ function createArrowMarker(
switch (style) {
case 'Arrow': {
const path = document.createElementNS(
'http://www.w3.org/2000/svg',
'path'
);
const path = document.createElementNS(SVG_NS, 'path');
path.setAttribute(
'd',
isStart ? 'M 20 5 L 10 10 L 20 15 Z' : 'M 0 5 L 10 10 L 0 15 Z'
@@ -112,10 +117,7 @@ function createArrowMarker(
break;
}
case 'Triangle': {
const path = document.createElementNS(
'http://www.w3.org/2000/svg',
'path'
);
const path = document.createElementNS(SVG_NS, 'path');
path.setAttribute(
'd',
isStart ? 'M 20 7 L 12 10 L 20 13 Z' : 'M 0 7 L 8 10 L 0 13 Z'
@@ -126,10 +128,7 @@ function createArrowMarker(
break;
}
case 'Circle': {
const circle = document.createElementNS(
'http://www.w3.org/2000/svg',
'circle'
);
const circle = document.createElementNS(SVG_NS, 'circle');
circle.setAttribute('cx', '10');
circle.setAttribute('cy', '10');
circle.setAttribute('r', '4');
@@ -139,10 +138,7 @@ function createArrowMarker(
break;
}
case 'Diamond': {
const path = document.createElementNS(
'http://www.w3.org/2000/svg',
'path'
);
const path = document.createElementNS(SVG_NS, 'path');
path.setAttribute('d', 'M 10 6 L 14 10 L 10 14 L 6 10 Z');
path.setAttribute('fill', color);
path.setAttribute('stroke', color);
@@ -154,13 +150,64 @@ function createArrowMarker(
return marker;
}
function clearRetainedConnectorDom(element: HTMLElement) {
retainedConnectorDom.delete(element);
element.replaceChildren();
}
function getRetainedConnectorDom(element: HTMLElement): RetainedConnectorDom {
const existing = retainedConnectorDom.get(element);
if (existing) {
return existing;
}
const svg = document.createElementNS(SVG_NS, 'svg');
svg.style.position = 'absolute';
svg.style.overflow = 'visible';
svg.style.pointerEvents = 'none';
const defs = document.createElementNS(SVG_NS, 'defs');
const path = document.createElementNS(SVG_NS, 'path');
path.setAttribute('fill', 'none');
path.setAttribute('stroke-linecap', 'round');
path.setAttribute('stroke-linejoin', 'round');
svg.append(defs, path);
element.replaceChildren(svg);
const retained = {
svg,
defs,
path,
label: null,
};
retainedConnectorDom.set(element, retained);
return retained;
}
function getOrCreateLabelElement(retained: RetainedConnectorDom) {
if (retained.label) {
return retained.label;
}
const label = document.createElement('div');
retained.svg.insertAdjacentElement('afterend', label);
retained.label = label;
return label;
}
function renderConnectorLabel(
model: ConnectorElementModel,
container: HTMLElement,
retained: RetainedConnectorDom,
renderer: DomRenderer,
zoom: number
) {
if (!isConnectorWithLabel(model) || !model.labelXYWH) {
retained.label?.remove();
retained.label = null;
return;
}
@@ -176,8 +223,7 @@ function renderConnectorLabel(
},
} = model;
// Create label element
const labelElement = document.createElement('div');
const labelElement = getOrCreateLabelElement(retained);
labelElement.style.position = 'absolute';
labelElement.style.left = `${lx * zoom}px`;
labelElement.style.top = `${ly * zoom}px`;
@@ -210,11 +256,7 @@ function renderConnectorLabel(
labelElement.style.wordWrap = 'break-word';
// Add text content
if (model.text) {
labelElement.textContent = model.text.toString();
}
container.append(labelElement);
labelElement.textContent = model.text ? model.text.toString() : '';
}
/**
@@ -241,14 +283,13 @@ export const connectorBaseDomRenderer = (
stroke,
} = model;
// Clear previous content
element.innerHTML = '';
// Early return if no path points
if (!points || points.length < 2) {
clearRetainedConnectorDom(element);
return;
}
const retained = getRetainedConnectorDom(element);
// Calculate bounds for the SVG viewBox
const pathBounds = calculatePathBounds(points);
const padding = Math.max(strokeWidth * 2, 20); // Add padding for arrows
@@ -257,8 +298,7 @@ export const connectorBaseDomRenderer = (
const offsetX = pathBounds.minX - padding;
const offsetY = pathBounds.minY - padding;
// Create SVG element
const svg = document.createElementNS('http://www.w3.org/2000/svg', 'svg');
const { defs, path, svg } = retained;
svg.style.position = 'absolute';
svg.style.left = `${offsetX * zoom}px`;
svg.style.top = `${offsetY * zoom}px`;
@@ -268,49 +308,43 @@ export const connectorBaseDomRenderer = (
svg.style.pointerEvents = 'none';
svg.setAttribute('viewBox', `0 0 ${svgWidth / zoom} ${svgHeight / zoom}`);
// Create defs for markers
const defs = document.createElementNS('http://www.w3.org/2000/svg', 'defs');
svg.append(defs);
const strokeColor = renderer.getColorValue(
stroke,
DefaultTheme.connectorColor,
true
);
// Create markers for endpoints
const markers: SVGMarkerElement[] = [];
let startMarkerId = '';
let endMarkerId = '';
if (frontEndpointStyle !== 'None') {
startMarkerId = `start-marker-${model.id}`;
const startMarker = createArrowMarker(
startMarkerId,
frontEndpointStyle,
strokeColor,
strokeWidth,
true
markers.push(
createArrowMarker(
startMarkerId,
frontEndpointStyle,
strokeColor,
strokeWidth,
true
)
);
defs.append(startMarker);
}
if (rearEndpointStyle !== 'None') {
endMarkerId = `end-marker-${model.id}`;
const endMarker = createArrowMarker(
endMarkerId,
rearEndpointStyle,
strokeColor,
strokeWidth,
false
markers.push(
createArrowMarker(
endMarkerId,
rearEndpointStyle,
strokeColor,
strokeWidth,
false
)
);
defs.append(endMarker);
}
// Create path element
const pathElement = document.createElementNS(
'http://www.w3.org/2000/svg',
'path'
);
defs.replaceChildren(...markers);
// Adjust points relative to the SVG coordinate system
const adjustedPoints = points.map(point => {
@@ -334,29 +368,25 @@ export const connectorBaseDomRenderer = (
});
const pathData = createConnectorPath(adjustedPoints, mode);
pathElement.setAttribute('d', pathData);
pathElement.setAttribute('stroke', strokeColor);
pathElement.setAttribute('stroke-width', String(strokeWidth));
pathElement.setAttribute('fill', 'none');
pathElement.setAttribute('stroke-linecap', 'round');
pathElement.setAttribute('stroke-linejoin', 'round');
// Apply stroke style
path.setAttribute('d', pathData);
path.setAttribute('stroke', strokeColor);
path.setAttribute('stroke-width', String(strokeWidth));
if (strokeStyle === 'dash') {
pathElement.setAttribute('stroke-dasharray', '12,12');
path.setAttribute('stroke-dasharray', '12,12');
} else {
path.removeAttribute('stroke-dasharray');
}
// Apply markers
if (startMarkerId) {
pathElement.setAttribute('marker-start', `url(#${startMarkerId})`);
path.setAttribute('marker-start', `url(#${startMarkerId})`);
} else {
path.removeAttribute('marker-start');
}
if (endMarkerId) {
pathElement.setAttribute('marker-end', `url(#${endMarkerId})`);
path.setAttribute('marker-end', `url(#${endMarkerId})`);
} else {
path.removeAttribute('marker-end');
}
svg.append(pathElement);
element.append(svg);
// Set element size and position
element.style.width = `${model.w * zoom}px`;
element.style.height = `${model.h * zoom}px`;
@@ -370,7 +400,11 @@ export const connectorDomRenderer = (
renderer: DomRenderer
): void => {
connectorBaseDomRenderer(model, element, renderer);
renderConnectorLabel(model, element, renderer, renderer.viewport.zoom);
const retained = retainedConnectorDom.get(element);
if (!retained) return;
renderConnectorLabel(model, retained, renderer, renderer.viewport.zoom);
};
/**


@@ -6,6 +6,37 @@ import { SVGShapeBuilder } from '@blocksuite/global/gfx';
import { manageClassNames, setStyles } from './utils';
const SVG_NS = 'http://www.w3.org/2000/svg';
type RetainedShapeDom = {
polygon: SVGPolygonElement | null;
svg: SVGSVGElement | null;
text: HTMLDivElement | null;
};
type RetainedShapeSvg = {
polygon: SVGPolygonElement;
svg: SVGSVGElement;
};
const retainedShapeDom = new WeakMap<HTMLElement, RetainedShapeDom>();
function getRetainedShapeDom(element: HTMLElement): RetainedShapeDom {
const existing = retainedShapeDom.get(element);
if (existing) {
return existing;
}
const retained = {
svg: null,
polygon: null,
text: null,
};
retainedShapeDom.set(element, retained);
return retained;
}
function applyShapeSpecificStyles(
model: ShapeElementModel,
element: HTMLElement,
@@ -14,10 +45,6 @@ function applyShapeSpecificStyles(
// Reset properties that might be set by different shape types
element.style.removeProperty('clip-path');
element.style.removeProperty('border-radius');
// Clear DOM for shapes that don't use SVG, or if type changes from SVG-based to non-SVG-based
if (model.shapeType !== 'diamond' && model.shapeType !== 'triangle') {
while (element.firstChild) element.firstChild.remove();
}
switch (model.shapeType) {
case 'rect': {
@@ -42,6 +69,54 @@ function applyShapeSpecificStyles(
// No 'else' needed to clear styles, as they are reset at the beginning of the function.
}
function getOrCreateSvg(
retained: RetainedShapeDom,
element: HTMLElement
): RetainedShapeSvg {
if (retained.svg && retained.polygon) {
return {
svg: retained.svg,
polygon: retained.polygon,
};
}
const svg = document.createElementNS(SVG_NS, 'svg');
svg.setAttribute('width', '100%');
svg.setAttribute('height', '100%');
svg.setAttribute('preserveAspectRatio', 'none');
const polygon = document.createElementNS(SVG_NS, 'polygon');
svg.append(polygon);
retained.svg = svg;
retained.polygon = polygon;
element.prepend(svg);
return { svg, polygon };
}
function removeSvg(retained: RetainedShapeDom) {
retained.svg?.remove();
retained.svg = null;
retained.polygon = null;
}
function getOrCreateText(retained: RetainedShapeDom, element: HTMLElement) {
if (retained.text) {
return retained.text;
}
const text = document.createElement('div');
retained.text = text;
element.append(text);
return text;
}
function removeText(retained: RetainedShapeDom) {
retained.text?.remove();
retained.text = null;
}
function applyBorderStyles(
model: ShapeElementModel,
element: HTMLElement,
@@ -99,8 +174,7 @@ export const shapeDomRenderer = (
const { zoom } = renderer.viewport;
const unscaledWidth = model.w;
const unscaledHeight = model.h;
const newChildren: Element[] = [];
const retained = getRetainedShapeDom(element);
const fillColor = renderer.getColorValue(
model.fillColor,
@@ -124,6 +198,7 @@ export const shapeDomRenderer = (
// For diamond and triangle, fill and border are handled by inline SVG
element.style.border = 'none'; // Ensure no standard CSS border interferes
element.style.backgroundColor = 'transparent'; // Host element is transparent
const { polygon, svg } = getOrCreateSvg(retained, element);
const strokeW = model.strokeWidth;
@@ -155,37 +230,30 @@ export const shapeDomRenderer = (
// Determine fill color
const finalFillColor = model.filled ? fillColor : 'transparent';
// Build SVG safely with DOM-API
const SVG_NS = 'http://www.w3.org/2000/svg';
const svg = document.createElementNS(SVG_NS, 'svg');
svg.setAttribute('width', '100%');
svg.setAttribute('height', '100%');
svg.setAttribute('viewBox', `0 0 ${unscaledWidth} ${unscaledHeight}`);
svg.setAttribute('preserveAspectRatio', 'none');
const polygon = document.createElementNS(SVG_NS, 'polygon');
polygon.setAttribute('points', svgPoints);
polygon.setAttribute('fill', finalFillColor);
polygon.setAttribute('stroke', finalStrokeColor);
polygon.setAttribute('stroke-width', String(strokeW));
if (finalStrokeDasharray !== 'none') {
polygon.setAttribute('stroke-dasharray', finalStrokeDasharray);
} else {
polygon.removeAttribute('stroke-dasharray');
}
svg.append(polygon);
newChildren.push(svg);
} else {
// Standard rendering for other shapes (e.g., rect, ellipse)
// innerHTML was already cleared by applyShapeSpecificStyles if necessary
removeSvg(retained);
element.style.backgroundColor = model.filled ? fillColor : 'transparent';
applyBorderStyles(model, element, strokeColor, zoom); // Uses standard CSS border
}
if (model.textDisplay && model.text) {
const str = model.text.toString();
const textElement = document.createElement('div');
const textElement = getOrCreateText(retained, element);
if (isRTL(str)) {
textElement.dir = 'rtl';
} else {
textElement.removeAttribute('dir');
}
textElement.style.position = 'absolute';
textElement.style.inset = '0';
@@ -210,12 +278,10 @@ export const shapeDomRenderer = (
true
);
textElement.textContent = str;
newChildren.push(textElement);
} else {
removeText(retained);
}
// Replace existing children to avoid memory leaks
element.replaceChildren(...newChildren);
applyTransformStyles(model, element);
manageClassNames(model, element);


@@ -177,6 +177,11 @@ export class ConnectorElementModel extends GfxPrimitiveElementModel<ConnectorEle
override getNearestPoint(point: IVec): IVec {
const { mode, absolutePath: path } = this;
if (path.length === 0) {
const { x, y } = this;
return [x, y];
}
if (mode === ConnectorMode.Straight) {
const first = path[0];
const last = path[path.length - 1];
@@ -213,6 +218,10 @@ export class ConnectorElementModel extends GfxPrimitiveElementModel<ConnectorEle
h = bounds.h;
}
if (path.length === 0) {
return 0.5;
}
point[0] = Vec.clamp(point[0], x, x + w);
point[1] = Vec.clamp(point[1], y, y + h);
@@ -258,6 +267,10 @@ export class ConnectorElementModel extends GfxPrimitiveElementModel<ConnectorEle
h = bounds.h;
}
if (path.length === 0) {
return [x + w / 2, y + h / 2];
}
if (mode === ConnectorMode.Orthogonal) {
const points = path.map<IVec>(p => [p[0], p[1]]);
const point = Polyline.pointAt(points, offsetDistance);
@@ -300,6 +313,10 @@ export class ConnectorElementModel extends GfxPrimitiveElementModel<ConnectorEle
const { mode, strokeWidth, absolutePath: path } = this;
if (path.length === 0) {
return false;
}
const point =
mode === ConnectorMode.Curve
? getBezierNearestPoint(getBezierParameters(path), currentPoint)


@@ -0,0 +1,22 @@
import { describe, expect, test } from 'vitest';
import { getBezierParameters } from '../gfx/curve.js';
import { PointLocation } from '../gfx/model/index.js';
describe('getBezierParameters', () => {
test('should handle empty path', () => {
expect(() => getBezierParameters([])).not.toThrow();
expect(getBezierParameters([])).toEqual([
new PointLocation(),
new PointLocation(),
new PointLocation(),
new PointLocation(),
]);
});
test('should handle single-point path', () => {
const point = new PointLocation([10, 20]);
expect(getBezierParameters([point])).toEqual([point, point, point, point]);
});
});


@@ -142,6 +142,11 @@ export function getBezierNearestPoint(
export function getBezierParameters(
points: PointLocation[]
): BezierCurveParameters {
if (points.length === 0) {
const point = new PointLocation();
return [point, point, point, point];
}
// Fallback for degenerate Bezier curve (all points are at the same position)
if (points.length === 1) {
const point = points[0];

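The guard added to `getBezierParameters` makes the empty-path case return a degenerate cubic whose four control points coincide, so downstream nearest-point math stays total. A simplified sketch, with `PointLocation` reduced to a plain `[x, y]` tuple (an assumption; the real control points are derived from in/out tangents):

```typescript
// Degenerate-Bezier fallback: empty input yields four coincident points at
// the origin; a single point yields four copies of that point.
type Point = [number, number];
type BezierParams = [Point, Point, Point, Point];

function bezierParameters(points: Point[]): BezierParams {
  if (points.length === 0) {
    const p: Point = [0, 0];
    return [p, p, p, p];
  }
  if (points.length === 1) {
    const p = points[0];
    return [p, p, p, p];
  }
  // Placeholder for the general case: endpoints doubled as control points.
  const first = points[0];
  const last = points[points.length - 1];
  return [first, first, last, last];
}
```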

@@ -596,7 +596,7 @@ export class LayerManager extends GfxExtension {
private _updateLayer(
element: GfxModel | GfxLocalElementModel,
props?: Record<string, unknown>,
oldValues?: Record<string, unknown>
_oldValues?: Record<string, unknown>
) {
const modelType = this._getModelType(element);
const isLocalElem = element instanceof GfxLocalElementModel;
@@ -613,16 +613,7 @@ export class LayerManager extends GfxExtension {
};
if (shouldUpdateGroupChildren) {
const group = element as GfxModel & GfxGroupCompatibleInterface;
const oldChildIds = childIdsChanged
? Array.isArray(oldValues?.['childIds'])
? (oldValues['childIds'] as string[])
: this._groupChildSnapshot.get(group.id)
: undefined;
const relatedElements = this._getRelatedGroupElements(group, oldChildIds);
this._refreshElementsInLayer(relatedElements);
this._syncGroupChildSnapshot(group);
this._reset();
return true;
}


@@ -31,6 +31,13 @@ function updateTransform(element: GfxBlockComponent) {
element.style.transform = element.getCSSTransform();
}
function updateZIndex(element: GfxBlockComponent) {
const zIndex = element.toZIndex();
if (element.style.zIndex !== zIndex) {
element.style.zIndex = zIndex;
}
}
function updateBlockVisibility(view: GfxBlockComponent) {
if (view.transformState$.value === 'active') {
view.style.visibility = 'visible';
@@ -58,14 +65,22 @@ function handleGfxConnection(instance: GfxBlockComponent) {
instance.store.slots.blockUpdated.subscribe(({ type, id }) => {
if (id === instance.model.id && type === 'update') {
updateTransform(instance);
updateZIndex(instance);
}
})
);
instance.disposables.add(
instance.gfx.layer.slots.layerUpdated.subscribe(() => {
updateZIndex(instance);
})
);
instance.disposables.add(
effect(() => {
updateBlockVisibility(instance);
updateTransform(instance);
updateZIndex(instance);
})
);
}


@@ -6,6 +6,7 @@ import type {
import { ungroupCommand } from '@blocksuite/affine/gfx/group';
import type {
GroupElementModel,
MindmapElementModel,
NoteBlockModel,
} from '@blocksuite/affine/model';
import { generateKeyBetween } from '@blocksuite/affine/std/gfx';
@@ -253,6 +254,40 @@ test('blocks should rerender when their z-index changed', async () => {
assertBlocksContent();
});
test('block host z-index should update after reordering', async () => {
const backId = addNote(doc);
const frontId = addNote(doc);
await wait();
const getBlockHost = (id: string) =>
document.querySelector<HTMLElement>(
`affine-edgeless-root gfx-viewport > [data-block-id="${id}"]`
);
const backHost = getBlockHost(backId);
const frontHost = getBlockHost(frontId);
expect(backHost).not.toBeNull();
expect(frontHost).not.toBeNull();
expect(Number(backHost!.style.zIndex)).toBeLessThan(
Number(frontHost!.style.zIndex)
);
service.crud.updateElement(backId, {
index: service.layer.getReorderedIndex(
service.crud.getElementById(backId)!,
'front'
),
});
await wait();
expect(Number(backHost!.style.zIndex)).toBeGreaterThan(
Number(frontHost!.style.zIndex)
);
});
describe('layer reorder functionality', () => {
let ids: string[] = [];
@@ -428,14 +463,17 @@ describe('group related functionality', () => {
const elements = [
service.crud.addElement('shape', {
shapeType: 'rect',
xywh: '[0,0,100,100]',
})!,
addNote(doc),
service.crud.addElement('shape', {
shapeType: 'rect',
xywh: '[120,0,100,100]',
})!,
addNote(doc),
service.crud.addElement('shape', {
shapeType: 'rect',
xywh: '[240,0,100,100]',
})!,
];
@@ -528,6 +566,35 @@ describe('group related functionality', () => {
expect(service.layer.layers[1].elements[0]).toBe(group);
});
test("change mindmap index should update its nodes' layer", async () => {
const noteId = addNote(doc);
const mindmapId = service.crud.addElement('mindmap', {
children: {
text: 'root',
children: [{ text: 'child' }],
},
})!;
await wait();
const note = service.crud.getElementById(noteId)!;
const mindmap = service.crud.getElementById(
mindmapId
)! as MindmapElementModel;
const root = mindmap.tree.element;
expect(service.layer.getZIndex(root)).toBeGreaterThan(
service.layer.getZIndex(note)
);
mindmap.index = service.layer.getReorderedIndex(mindmap, 'back');
await wait();
expect(service.layer.getZIndex(root)).toBeLessThan(
service.layer.getZIndex(note)
);
});
test('should keep relative index order of elements after group, ungroup, undo, redo', () => {
const edgeless = getDocRootBlock(doc, editor, 'edgeless');
const elementIds = [
@@ -769,6 +836,7 @@ test('indexed canvas should be inserted into edgeless portal when switch to edge
service.crud.addElement('shape', {
shapeType: 'rect',
xywh: '[0,0,100,100]',
})!;
addNote(doc);
@@ -777,6 +845,7 @@ test('indexed canvas should be inserted into edgeless portal when switch to edge
service.crud.addElement('shape', {
shapeType: 'rect',
xywh: '[120,0,100,100]',
})!;
editor.mode = 'page';
@@ -792,10 +861,10 @@ test('indexed canvas should be inserted into edgeless portal when switch to edge
'.indexable-canvas'
)[0] as HTMLCanvasElement;
expect(indexedCanvas.width).toBe(
expect(indexedCanvas.width).toBeLessThanOrEqual(
(surface.renderer as CanvasRenderer).canvas.width
);
expect(indexedCanvas.height).toBe(
expect(indexedCanvas.height).toBeLessThanOrEqual(
(surface.renderer as CanvasRenderer).canvas.height
);
expect(indexedCanvas.width).not.toBe(0);


@@ -21,7 +21,10 @@ image = { workspace = true }
infer = { workspace = true }
libwebp-sys = { workspace = true }
little_exif = { workspace = true }
llm_adapter = { workspace = true }
llm_adapter = { workspace = true, default-features = false, features = [
"ureq-client",
] }
matroska = { workspace = true }
mp4parse = { workspace = true }
napi = { workspace = true, features = ["async"] }
napi-derive = { workspace = true }

Binary file not shown.

Binary file not shown.

Binary file not shown.


@@ -54,6 +54,12 @@ export declare function llmDispatch(protocol: string, backendConfigJson: string,
export declare function llmDispatchStream(protocol: string, backendConfigJson: string, requestJson: string, callback: ((err: Error | null, arg: string) => void)): LlmStreamHandle
export declare function llmEmbeddingDispatch(protocol: string, backendConfigJson: string, requestJson: string): string
export declare function llmRerankDispatch(protocol: string, backendConfigJson: string, requestJson: string): string
export declare function llmStructuredDispatch(protocol: string, backendConfigJson: string, requestJson: string): string
/**
* Merge updates in form like `Y.applyUpdate(doc, update)` way and return the
* result binary.


@@ -1,3 +1,4 @@
use matroska::Matroska;
use mp4parse::{TrackType, read_mp4};
use napi_derive::napi;
@@ -8,7 +9,13 @@ pub fn get_mime(input: &[u8]) -> String {
} else {
file_format::FileFormat::from_bytes(input).media_type().to_string()
};
if mimetype == "video/mp4" {
if let Some(container) = matroska_container_kind(input).or(match mimetype.as_str() {
"video/webm" | "application/webm" => Some(ContainerKind::WebM),
"video/x-matroska" | "application/x-matroska" => Some(ContainerKind::Matroska),
_ => None,
}) {
detect_matroska_flavor(input, container, &mimetype)
} else if mimetype == "video/mp4" {
detect_mp4_flavor(input)
} else {
mimetype
@@ -37,3 +44,68 @@ fn detect_mp4_flavor(input: &[u8]) -> String {
Err(_) => "video/mp4".to_string(),
}
}
#[derive(Clone, Copy)]
enum ContainerKind {
WebM,
Matroska,
}
impl ContainerKind {
fn audio_mime(&self) -> &'static str {
match self {
ContainerKind::WebM => "audio/webm",
ContainerKind::Matroska => "audio/x-matroska",
}
}
}
fn detect_matroska_flavor(input: &[u8], container: ContainerKind, fallback: &str) -> String {
match Matroska::open(std::io::Cursor::new(input)) {
Ok(file) => {
let has_video = file.video_tracks().next().is_some();
let has_audio = file.audio_tracks().next().is_some();
if !has_video && has_audio {
container.audio_mime().to_string()
} else {
fallback.to_string()
}
}
Err(_) => fallback.to_string(),
}
}
fn matroska_container_kind(input: &[u8]) -> Option<ContainerKind> {
let header = &input[..1024.min(input.len())];
if header.windows(4).any(|window| window.eq_ignore_ascii_case(b"webm")) {
Some(ContainerKind::WebM)
} else if header.windows(8).any(|window| window.eq_ignore_ascii_case(b"matroska")) {
Some(ContainerKind::Matroska)
} else {
None
}
}
#[cfg(test)]
mod tests {
use super::*;
const AUDIO_ONLY_WEBM: &[u8] = include_bytes!("../fixtures/audio-only.webm");
const AUDIO_VIDEO_WEBM: &[u8] = include_bytes!("../fixtures/audio-video.webm");
const AUDIO_ONLY_MATROSKA: &[u8] = include_bytes!("../fixtures/audio-only.mka");
#[test]
fn detects_audio_only_webm_as_audio() {
assert_eq!(get_mime(AUDIO_ONLY_WEBM), "audio/webm");
}
#[test]
fn preserves_video_webm() {
assert_eq!(get_mime(AUDIO_VIDEO_WEBM), "video/webm");
}
#[test]
fn detects_audio_only_matroska_as_audio() {
assert_eq!(get_mime(AUDIO_ONLY_MATROSKA), "audio/x-matroska");
}
}

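For cross-reference, the container sniff in `matroska_container_kind` scans the first 1 KiB for the EBML doc-type strings, checking "webm" before "matroska". A hedged TypeScript rendering of the same idea (the Rust version above is authoritative):

```typescript
// Sniff sketch: decode the header bytes as Latin-1 and look for the
// doc-type markers case-insensitively, WebM first.
function matroskaContainerKind(input: Uint8Array): 'webm' | 'matroska' | null {
  const header = input.subarray(0, Math.min(1024, input.length));
  const text = Array.from(header, b => String.fromCharCode(b))
    .join('')
    .toLowerCase();
  if (text.includes('webm')) return 'webm';
  if (text.includes('matroska')) return 'matroska';
  return null;
}
```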

@@ -5,9 +5,10 @@ use std::sync::{
use llm_adapter::{
backend::{
BackendConfig, BackendError, BackendProtocol, ReqwestHttpClient, dispatch_request, dispatch_stream_events_with,
BackendConfig, BackendError, BackendProtocol, DefaultHttpClient, dispatch_embedding_request, dispatch_request,
dispatch_rerank_request, dispatch_stream_events_with, dispatch_structured_request,
},
core::{CoreRequest, StreamEvent},
core::{CoreRequest, EmbeddingRequest, RerankRequest, StreamEvent, StructuredRequest},
middleware::{
MiddlewareConfig, PipelineContext, RequestMiddleware, StreamMiddleware, citation_indexing, clamp_max_tokens,
normalize_messages, run_request_middleware_chain, run_stream_middleware_chain, stream_event_normalize,
@@ -40,6 +41,20 @@ struct LlmDispatchPayload {
middleware: LlmMiddlewarePayload,
}
#[derive(Debug, Clone, Deserialize)]
struct LlmStructuredDispatchPayload {
#[serde(flatten)]
request: StructuredRequest,
#[serde(default)]
middleware: LlmMiddlewarePayload,
}
#[derive(Debug, Clone, Deserialize)]
struct LlmRerankDispatchPayload {
#[serde(flatten)]
request: RerankRequest,
}
#[napi]
pub struct LlmStreamHandle {
aborted: Arc<AtomicBool>,
@@ -61,7 +76,44 @@ pub fn llm_dispatch(protocol: String, backend_config_json: String, request_json:
let request = apply_request_middlewares(payload.request, &payload.middleware)?;
let response =
dispatch_request(&ReqwestHttpClient::default(), &config, protocol, &request).map_err(map_backend_error)?;
dispatch_request(&DefaultHttpClient::default(), &config, protocol, &request).map_err(map_backend_error)?;
serde_json::to_string(&response).map_err(map_json_error)
}
#[napi(catch_unwind)]
pub fn llm_structured_dispatch(protocol: String, backend_config_json: String, request_json: String) -> Result<String> {
let protocol = parse_protocol(&protocol)?;
let config: BackendConfig = serde_json::from_str(&backend_config_json).map_err(map_json_error)?;
let payload: LlmStructuredDispatchPayload = serde_json::from_str(&request_json).map_err(map_json_error)?;
let request = apply_structured_request_middlewares(payload.request, &payload.middleware)?;
let response = dispatch_structured_request(&DefaultHttpClient::default(), &config, protocol, &request)
.map_err(map_backend_error)?;
serde_json::to_string(&response).map_err(map_json_error)
}
#[napi(catch_unwind)]
pub fn llm_embedding_dispatch(protocol: String, backend_config_json: String, request_json: String) -> Result<String> {
let protocol = parse_protocol(&protocol)?;
let config: BackendConfig = serde_json::from_str(&backend_config_json).map_err(map_json_error)?;
let request: EmbeddingRequest = serde_json::from_str(&request_json).map_err(map_json_error)?;
let response = dispatch_embedding_request(&DefaultHttpClient::default(), &config, protocol, &request)
.map_err(map_backend_error)?;
serde_json::to_string(&response).map_err(map_json_error)
}
#[napi(catch_unwind)]
pub fn llm_rerank_dispatch(protocol: String, backend_config_json: String, request_json: String) -> Result<String> {
let protocol = parse_protocol(&protocol)?;
let config: BackendConfig = serde_json::from_str(&backend_config_json).map_err(map_json_error)?;
let payload: LlmRerankDispatchPayload = serde_json::from_str(&request_json).map_err(map_json_error)?;
let response = dispatch_rerank_request(&DefaultHttpClient::default(), &config, protocol, &payload.request)
.map_err(map_backend_error)?;
serde_json::to_string(&response).map_err(map_json_error)
}
@@ -98,7 +150,7 @@ pub fn llm_dispatch_stream(
let mut aborted_by_user = false;
let mut callback_dispatch_failed = false;
let result = dispatch_stream_events_with(&ReqwestHttpClient::default(), &config, protocol, &request, |event| {
let result = dispatch_stream_events_with(&DefaultHttpClient::default(), &config, protocol, &request, |event| {
if aborted_in_worker.load(Ordering::Relaxed) {
aborted_by_user = true;
return Err(BackendError::Http(STREAM_ABORTED_REASON.to_string()));
@@ -155,6 +207,27 @@ fn apply_request_middlewares(request: CoreRequest, middleware: &LlmMiddlewarePay
Ok(run_request_middleware_chain(request, &middleware.config, &chain))
}
fn apply_structured_request_middlewares(
request: StructuredRequest,
middleware: &LlmMiddlewarePayload,
) -> Result<StructuredRequest> {
let mut core = request.as_core_request();
core = apply_request_middlewares(core, middleware)?;
Ok(StructuredRequest {
model: core.model,
messages: core.messages,
schema: core
.response_schema
.ok_or_else(|| Error::new(Status::InvalidArg, "Structured request schema is required"))?,
max_tokens: core.max_tokens,
temperature: core.temperature,
reasoning: core.reasoning,
strict: request.strict,
response_mime_type: request.response_mime_type,
})
}
#[derive(Clone)]
struct StreamPipeline {
chain: Vec<StreamMiddleware>,
@@ -268,6 +341,7 @@ fn parse_protocol(protocol: &str) -> Result<BackendProtocol> {
}
"openai_responses" | "openai-responses" | "responses" => Ok(BackendProtocol::OpenaiResponses),
"anthropic" | "anthropic_messages" | "anthropic-messages" => Ok(BackendProtocol::AnthropicMessages),
"gemini" | "gemini_generate_content" | "gemini-generate-content" => Ok(BackendProtocol::GeminiGenerateContent),
other => Err(Error::new(
Status::InvalidArg,
format!("Unsupported llm backend protocol: {other}"),
@@ -293,6 +367,7 @@ mod tests {
assert!(parse_protocol("chat-completions").is_ok());
assert!(parse_protocol("responses").is_ok());
assert!(parse_protocol("anthropic").is_ok());
assert!(parse_protocol("gemini").is_ok());
}
#[test]
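On the Node side, the new `llm_rerank_dispatch` entry point (exposed via napi as `llmRerankDispatch`, as the provider tests later in this diff show) takes the backend config and request as JSON strings. A minimal sketch of building the request payload, with field names mirroring the serde-flattened `RerankRequest` captured in those tests:

```typescript
// Hypothetical helper: builds the JSON string passed as the third argument
// to llmRerankDispatch(protocol, backendConfigJson, requestJson).
// Field names (model, query, candidates[{id, text}]) follow the request
// shape asserted in the OpenAI provider tests; the helper name is illustrative.
interface RerankCandidate {
  id: string;
  text: string;
}

function buildRerankPayload(
  model: string,
  query: string,
  texts: string[]
): string {
  const candidates: RerankCandidate[] = texts.map((text, index) => ({
    id: String(index),
    text,
  }));
  return JSON.stringify({ model, query, candidates });
}

const payload = buildRerankPayload('gpt-5.2', 'programming', [
  'React is a UI library.',
  'The weather is sunny today.',
]);
```

The Rust side deserializes this into `LlmRerankDispatchPayload`, whose `#[serde(flatten)]` attribute means the `RerankRequest` fields sit at the top level of the JSON object rather than under a nested key.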

View File

@@ -25,21 +25,19 @@
"dependencies": {
"@affine/s3-compat": "workspace:*",
"@affine/server-native": "workspace:*",
"@ai-sdk/google": "^2.0.45",
"@ai-sdk/google-vertex": "^3.0.88",
"@apollo/server": "^4.13.0",
"@fal-ai/serverless-client": "^0.15.0",
"@google-cloud/opentelemetry-cloud-trace-exporter": "^3.0.0",
"@google-cloud/opentelemetry-resource-util": "^3.0.0",
"@nestjs-cls/transactional": "^2.7.0",
"@nestjs-cls/transactional-adapter-prisma": "^1.2.24",
"@nestjs/apollo": "^13.0.4",
"@nestjs-cls/transactional": "^3.2.0",
"@nestjs-cls/transactional-adapter-prisma": "^1.3.4",
"@nestjs/apollo": "^13.2.4",
"@nestjs/bullmq": "^11.0.4",
"@nestjs/common": "^11.0.21",
"@nestjs/core": "^11.1.14",
"@nestjs/graphql": "^13.0.4",
"@nestjs/platform-express": "^11.1.14",
"@nestjs/platform-socket.io": "^11.1.14",
"@nestjs/common": "^11.1.16",
"@nestjs/core": "^11.1.16",
"@nestjs/graphql": "^13.2.4",
"@nestjs/platform-express": "^11.1.16",
"@nestjs/platform-socket.io": "^11.1.16",
"@nestjs/schedule": "^6.1.1",
"@nestjs/throttler": "^6.5.0",
"@nestjs/websockets": "^11.1.14",
@@ -66,7 +64,6 @@
"@queuedash/api": "^3.16.0",
"@react-email/components": "^0.5.7",
"@socket.io/redis-adapter": "^8.3.0",
"ai": "^5.0.118",
"bullmq": "^5.40.2",
"cookie-parser": "^1.4.7",
"cross-env": "^10.1.0",

View File

@@ -118,7 +118,6 @@ test.serial.before(async t => {
enabled: true,
scenarios: {
image: 'flux-1/schnell',
rerank: 'gpt-5-mini',
complex_text_generation: 'gpt-5-mini',
coding: 'gpt-5-mini',
quick_decision_making: 'gpt-5-mini',
@@ -226,6 +225,20 @@ const checkStreamObjects = (result: string) => {
}
};
const parseStreamObjects = (result: string): StreamObject[] => {
const streamObjects = JSON.parse(result);
return z.array(StreamObjectSchema).parse(streamObjects);
};
const getStreamObjectText = (result: string) =>
parseStreamObjects(result)
.filter(
(chunk): chunk is Extract<StreamObject, { type: 'text-delta' }> =>
chunk.type === 'text-delta'
)
.map(chunk => chunk.textDelta)
.join('');
const retry = async (
action: string,
t: ExecutionContext<Tester>,
@@ -445,6 +458,49 @@ The term **“CRDT”** was first introduced by Marc Shapiro, Nuno Preguiça, Ca
},
type: 'object' as const,
},
{
name: 'Gemini native text',
promptName: ['Chat With AFFiNE AI'],
messages: [
{
role: 'user' as const,
content:
'In one short sentence, explain what AFFiNE AI is and mention AFFiNE by name.',
},
],
config: { model: 'gemini-2.5-flash' },
verifier: (t: ExecutionContext<Tester>, result: string) => {
assertNotWrappedInCodeBlock(t, result);
t.assert(
result.toLowerCase().includes('affine'),
'should mention AFFiNE'
);
},
prefer: CopilotProviderType.Gemini,
type: 'text' as const,
},
{
name: 'Gemini native stream objects',
promptName: ['Chat With AFFiNE AI'],
messages: [
{
role: 'user' as const,
content:
'Respond with one short sentence about AFFiNE AI and mention AFFiNE by name.',
},
],
config: { model: 'gemini-2.5-flash' },
verifier: (t: ExecutionContext<Tester>, result: string) => {
t.truthy(checkStreamObjects(result), 'should be valid stream objects');
const assembledText = getStreamObjectText(result);
t.assert(
assembledText.toLowerCase().includes('affine'),
'should mention AFFiNE'
);
},
prefer: CopilotProviderType.Gemini,
type: 'object' as const,
},
{
name: 'Should transcribe short audio',
promptName: ['Transcript audio'],
@@ -717,14 +773,13 @@ for (const {
const { factory, prompt: promptService } = t.context;
const prompt = (await promptService.get(promptName))!;
t.truthy(prompt, 'should have prompt');
const provider = (await factory.getProviderByModel(prompt.model, {
const finalConfig = Object.assign({}, prompt.config, config);
const modelId = finalConfig.model || prompt.model;
const provider = (await factory.getProviderByModel(modelId, {
prefer,
}))!;
t.truthy(provider, 'should have provider');
await retry(`action: ${promptName}`, t, async t => {
const finalConfig = Object.assign({}, prompt.config, config);
const modelId = finalConfig.model || prompt.model;
switch (type) {
case 'text': {
const result = await provider.text(
@@ -892,7 +947,7 @@ test(
'should be able to rerank message chunks',
runIfCopilotConfigured,
async t => {
const { factory, prompt } = t.context;
const { factory } = t.context;
await retry('rerank', t, async t => {
const query = 'Is this content relevant to programming?';
@@ -909,14 +964,18 @@ test(
'The stock market is experiencing significant fluctuations.',
];
const p = (await prompt.get('Rerank results'))!;
t.assert(p, 'should have prompt for rerank');
const provider = (await factory.getProviderByModel(p.model))!;
const provider = (await factory.getProviderByModel('gpt-5.2'))!;
t.assert(provider, 'should have provider for rerank');
const scores = await provider.rerank(
{ modelId: p.model },
embeddings.map(e => p.finish({ query, doc: e }))
{ modelId: 'gpt-5.2' },
{
query,
candidates: embeddings.map((text, index) => ({
id: String(index),
text,
})),
}
);
t.is(scores.length, 10, 'should return scores for all chunks');
@@ -931,8 +990,8 @@ test(
t.log('Rerank scores:', scores);
t.is(
scores.filter(s => s > 0.5).length,
4,
'should have 4 related chunks'
5,
'should have 5 related chunks'
);
});
}

View File

@@ -33,10 +33,7 @@ import {
ModelOutputType,
OpenAIProvider,
} from '../../plugins/copilot/providers';
import {
CitationParser,
TextStreamParser,
} from '../../plugins/copilot/providers/utils';
import { TextStreamParser } from '../../plugins/copilot/providers/utils';
import { ChatSessionService } from '../../plugins/copilot/session';
import { CopilotStorage } from '../../plugins/copilot/storage';
import { CopilotTranscriptionService } from '../../plugins/copilot/transcript';
@@ -660,6 +657,55 @@ test('should be able to generate with message id', async t => {
}
});
test('should preserve file handle attachments when merging user content into prompt', async t => {
const { prompt, session } = t.context;
await prompt.set(promptName, 'model', [
{ role: 'user', content: '{{content}}' },
]);
const sessionId = await session.create({
docId: 'test',
workspaceId: 'test',
userId,
promptName,
pinned: false,
});
const s = (await session.get(sessionId))!;
const message = await session.createMessage({
sessionId,
content: 'Summarize this file',
attachments: [
{
kind: 'file_handle',
fileHandle: 'file_123',
mimeType: 'application/pdf',
},
],
});
await s.pushByMessageId(message);
const finalMessages = s.finish({});
t.deepEqual(finalMessages, [
{
role: 'user',
content: 'Summarize this file',
attachments: [
{
kind: 'file_handle',
fileHandle: 'file_123',
mimeType: 'application/pdf',
},
],
params: {
content: 'Summarize this file',
},
},
]);
});
test('should save message correctly', async t => {
const { prompt, session } = t.context;
@@ -1225,149 +1271,6 @@ test('should be able to run image executor', async t => {
Sinon.restore();
});
test('CitationParser should replace citation placeholders with URLs', t => {
const content =
'This is [a] test sentence with [citations [1]] and [[2]] and [3].';
const citations = ['https://example1.com', 'https://example2.com'];
const parser = new CitationParser();
for (const citation of citations) {
parser.push(citation);
}
const result = parser.parse(content) + parser.end();
const expected = [
'This is [a] test sentence with [citations [^1]] and [^2] and [3].',
`[^1]: {"type":"url","url":"${encodeURIComponent(citations[0])}"}`,
`[^2]: {"type":"url","url":"${encodeURIComponent(citations[1])}"}`,
].join('\n');
t.is(result, expected);
});
test('CitationParser should replace chunks of citation placeholders with URLs', t => {
const contents = [
'[[]]',
'This is [',
'a] test sentence ',
'with citations [1',
'] and [',
'[2]] and [[',
'3]] and [[4',
']] and [[5]',
'] and [[6]]',
' and [7',
];
const citations = [
'https://example1.com',
'https://example2.com',
'https://example3.com',
'https://example4.com',
'https://example5.com',
'https://example6.com',
'https://example7.com',
];
const parser = new CitationParser();
for (const citation of citations) {
parser.push(citation);
}
let result = contents.reduce((acc, current) => {
return acc + parser.parse(current);
}, '');
result += parser.end();
const expected = [
'[[]]This is [a] test sentence with citations [^1] and [^2] and [^3] and [^4] and [^5] and [^6] and [7',
`[^1]: {"type":"url","url":"${encodeURIComponent(citations[0])}"}`,
`[^2]: {"type":"url","url":"${encodeURIComponent(citations[1])}"}`,
`[^3]: {"type":"url","url":"${encodeURIComponent(citations[2])}"}`,
`[^4]: {"type":"url","url":"${encodeURIComponent(citations[3])}"}`,
`[^5]: {"type":"url","url":"${encodeURIComponent(citations[4])}"}`,
`[^6]: {"type":"url","url":"${encodeURIComponent(citations[5])}"}`,
`[^7]: {"type":"url","url":"${encodeURIComponent(citations[6])}"}`,
].join('\n');
t.is(result, expected);
});
test('CitationParser should not replace citation already with URLs', t => {
const content =
'This is [a] test sentence with citations [1](https://example1.com) and [[2]](https://example2.com) and [[3](https://example3.com)].';
const citations = [
'https://example4.com',
'https://example5.com',
'https://example6.com',
];
const parser = new CitationParser();
for (const citation of citations) {
parser.push(citation);
}
const result = parser.parse(content) + parser.end();
const expected = [
content,
`[^1]: {"type":"url","url":"${encodeURIComponent(citations[0])}"}`,
`[^2]: {"type":"url","url":"${encodeURIComponent(citations[1])}"}`,
`[^3]: {"type":"url","url":"${encodeURIComponent(citations[2])}"}`,
].join('\n');
t.is(result, expected);
});
test('CitationParser should not replace chunks of citation already with URLs', t => {
const contents = [
'This is [a] test sentence with citations [1',
'](https://example1.com) and [[2]',
'](https://example2.com) and [[3](https://example3.com)].',
];
const citations = [
'https://example4.com',
'https://example5.com',
'https://example6.com',
];
const parser = new CitationParser();
for (const citation of citations) {
parser.push(citation);
}
let result = contents.reduce((acc, current) => {
return acc + parser.parse(current);
}, '');
result += parser.end();
const expected = [
contents.join(''),
`[^1]: {"type":"url","url":"${encodeURIComponent(citations[0])}"}`,
`[^2]: {"type":"url","url":"${encodeURIComponent(citations[1])}"}`,
`[^3]: {"type":"url","url":"${encodeURIComponent(citations[2])}"}`,
].join('\n');
t.is(result, expected);
});
test('CitationParser should replace openai style reference chunks', t => {
const contents = [
'This is [a] test sentence with citations ',
'([example1.com](https://example1.com))',
];
const parser = new CitationParser();
let result = contents.reduce((acc, current) => {
return acc + parser.parse(current);
}, '');
result += parser.end();
const expected = [
contents[0] + '[^1]',
`[^1]: {"type":"url","url":"${encodeURIComponent('https://example1.com')}"}`,
].join('\n');
t.is(result, expected);
});
test('TextStreamParser should format different types of chunks correctly', t => {
// Define interfaces for fixtures
interface BaseFixture {

View File

@@ -1,210 +0,0 @@
import test from 'ava';
import { z } from 'zod';
import type { NativeLlmRequest, NativeLlmStreamEvent } from '../../native';
import {
buildNativeRequest,
NativeProviderAdapter,
} from '../../plugins/copilot/providers/native';
const mockDispatch = () =>
(async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
yield { type: 'text_delta', text: 'Use [^1] now' };
yield { type: 'citation', index: 1, url: 'https://affine.pro' };
yield { type: 'done', finish_reason: 'stop' };
})();
test('NativeProviderAdapter streamText should append citation footnotes', async t => {
const adapter = new NativeProviderAdapter(mockDispatch, {}, 3);
const chunks: string[] = [];
for await (const chunk of adapter.streamText({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
const text = chunks.join('');
t.true(text.includes('Use [^1] now'));
t.true(
text.includes('[^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}')
);
});
test('NativeProviderAdapter streamObject should append citation footnotes', async t => {
const adapter = new NativeProviderAdapter(mockDispatch, {}, 3);
const chunks = [];
for await (const chunk of adapter.streamObject({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
t.deepEqual(
chunks.map(chunk => chunk.type),
['text-delta', 'text-delta']
);
const text = chunks
.filter(chunk => chunk.type === 'text-delta')
.map(chunk => chunk.textDelta)
.join('');
t.true(text.includes('Use [^1] now'));
t.true(
text.includes('[^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}')
);
});
test('NativeProviderAdapter streamObject should append fallback attachment footnotes', async t => {
const dispatch = () =>
(async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
yield {
type: 'tool_result',
call_id: 'call_1',
name: 'blob_read',
arguments: { blob_id: 'blob_1' },
output: {
blobId: 'blob_1',
fileName: 'a.txt',
fileType: 'text/plain',
content: 'A',
},
};
yield {
type: 'tool_result',
call_id: 'call_2',
name: 'blob_read',
arguments: { blob_id: 'blob_2' },
output: {
blobId: 'blob_2',
fileName: 'b.txt',
fileType: 'text/plain',
content: 'B',
},
};
yield { type: 'text_delta', text: 'Answer from files.' };
yield { type: 'done', finish_reason: 'stop' };
})();
const adapter = new NativeProviderAdapter(dispatch, {}, 3);
const chunks = [];
for await (const chunk of adapter.streamObject({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
const text = chunks
.filter(chunk => chunk.type === 'text-delta')
.map(chunk => chunk.textDelta)
.join('');
t.true(text.includes('Answer from files.'));
t.true(text.includes('[^1][^2]'));
t.true(
text.includes(
'[^1]: {"type":"attachment","blobId":"blob_1","fileName":"a.txt","fileType":"text/plain"}'
)
);
t.true(
text.includes(
'[^2]: {"type":"attachment","blobId":"blob_2","fileName":"b.txt","fileType":"text/plain"}'
)
);
});
test('NativeProviderAdapter streamObject should map tool and text events', async t => {
let round = 0;
const dispatch = (_request: NativeLlmRequest) =>
(async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
round += 1;
if (round === 1) {
yield {
type: 'tool_call',
call_id: 'call_1',
name: 'doc_read',
arguments: { doc_id: 'a1' },
};
yield { type: 'done', finish_reason: 'tool_calls' };
return;
}
yield { type: 'text_delta', text: 'ok' };
yield { type: 'done', finish_reason: 'stop' };
})();
const adapter = new NativeProviderAdapter(
dispatch,
{
doc_read: {
inputSchema: z.object({ doc_id: z.string() }),
execute: async () => ({ markdown: '# a1' }),
},
},
4
);
const events = [];
for await (const event of adapter.streamObject({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'read' }] }],
})) {
events.push(event);
}
t.deepEqual(
events.map(event => event.type),
['tool-call', 'tool-result', 'text-delta']
);
t.deepEqual(events[0], {
type: 'tool-call',
toolCallId: 'call_1',
toolName: 'doc_read',
args: { doc_id: 'a1' },
});
});
test('buildNativeRequest should include rust middleware from profile', async t => {
const { request } = await buildNativeRequest({
model: 'gpt-4.1',
messages: [{ role: 'user', content: 'hello' }],
tools: {},
middleware: {
rust: {
request: ['normalize_messages', 'clamp_max_tokens'],
stream: ['stream_event_normalize', 'citation_indexing'],
},
node: {
text: ['callout'],
},
},
});
t.deepEqual(request.middleware, {
request: ['normalize_messages', 'clamp_max_tokens'],
stream: ['stream_event_normalize', 'citation_indexing'],
});
});
test('NativeProviderAdapter streamText should skip citation footnotes when disabled', async t => {
const adapter = new NativeProviderAdapter(mockDispatch, {}, 3, {
nodeTextMiddleware: ['callout'],
});
const chunks: string[] = [];
for await (const chunk of adapter.streamText({
model: 'gpt-4.1',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'hi' }] }],
})) {
chunks.push(chunk);
}
const text = chunks.join('');
t.true(text.includes('Use [^1] now'));
t.false(
text.includes('[^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}')
);
});
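The removed adapter tests above all exercise the same assembly pattern: concatenate `text_delta` events and append `[^n]: {"type":"url","url":"<encoded>"}` footnotes for `citation` events. A simplified sketch of that assembly, using the event and footnote shapes asserted in the tests (the real adapter also handles tool events and middleware):

```typescript
// Minimal event union covering only the cases this sketch handles;
// the full NativeLlmStreamEvent type has more variants (tool calls, etc.).
type StreamEvent =
  | { type: 'text_delta'; text: string }
  | { type: 'citation'; index: number; url: string }
  | { type: 'done'; finish_reason: string };

function assembleText(events: StreamEvent[]): string {
  let text = '';
  const footnotes: string[] = [];
  for (const event of events) {
    if (event.type === 'text_delta') {
      text += event.text;
    } else if (event.type === 'citation') {
      // URLs are percent-encoded inside the footnote JSON, matching the
      // '[^1]: {"type":"url","url":"https%3A%2F%2Faffine.pro"}' assertions.
      footnotes.push(
        `[^${event.index}]: {"type":"url","url":"${encodeURIComponent(event.url)}"}`
      );
    }
  }
  return footnotes.length > 0 ? `${text}\n${footnotes.join('\n')}` : text;
}
```

Disabling the citation step (as the "skip citation footnotes when disabled" test checks) would correspond to returning `text` without appending `footnotes`.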

File diff suppressed because it is too large

View File

@@ -1,6 +1,12 @@
import serverNativeModule from '@affine/server-native';
import test from 'ava';
import type { NativeLlmRerankRequest } from '../../native';
import { ProviderMiddlewareConfig } from '../../plugins/copilot/config';
import {
normalizeOpenAIOptionsForModel,
OpenAIProvider,
} from '../../plugins/copilot/providers/openai';
import { CopilotProvider } from '../../plugins/copilot/providers/provider';
import {
CopilotProviderType,
@@ -12,7 +18,7 @@ class TestOpenAIProvider extends CopilotProvider<{ apiKey: string }> {
readonly type = CopilotProviderType.OpenAI;
readonly models = [
{
id: 'gpt-4.1',
id: 'gpt-5-mini',
capabilities: [
{
input: [ModelInputType.Text],
@@ -36,7 +42,7 @@ class TestOpenAIProvider extends CopilotProvider<{ apiKey: string }> {
}
exposeMetricLabels() {
return this.metricLabels('gpt-4.1');
return this.metricLabels('gpt-5-mini');
}
exposeMiddleware() {
@@ -44,6 +50,33 @@ class TestOpenAIProvider extends CopilotProvider<{ apiKey: string }> {
}
}
class NativeRerankProtocolProvider extends OpenAIProvider {
override readonly models = [
{
id: 'gpt-5.2',
capabilities: [
{
input: [ModelInputType.Text],
output: [ModelOutputType.Text, ModelOutputType.Rerank],
defaultForOutputType: true,
},
],
},
];
override get config() {
return {
apiKey: 'test-key',
baseURL: 'https://api.openai.com/v1',
oldApiStyle: false,
};
}
override configured() {
return true;
}
}
function createProvider(profileMiddleware?: ProviderMiddlewareConfig) {
const provider = new TestOpenAIProvider();
(provider as any).AFFiNEConfig = {
@@ -97,3 +130,71 @@ test('getActiveProviderMiddleware should merge defaults with profile override',
'thinking_format',
]);
});
test('normalizeOpenAIOptionsForModel should drop sampling knobs for gpt-5.2', t => {
t.deepEqual(
normalizeOpenAIOptionsForModel(
{
temperature: 0.7,
topP: 0.8,
presencePenalty: 0.2,
frequencyPenalty: 0.1,
maxTokens: 128,
},
'gpt-5.4'
),
{ maxTokens: 128 }
);
});
test('normalizeOpenAIOptionsForModel should keep options for gpt-4.1', t => {
t.deepEqual(
normalizeOpenAIOptionsForModel(
{ temperature: 0.7, topP: 0.8, maxTokens: 128 },
'gpt-4.1'
),
{ temperature: 0.7, topP: 0.8, maxTokens: 128 }
);
});
test('OpenAI rerank should always use chat-completions native protocol', async t => {
const provider = new NativeRerankProtocolProvider();
let capturedProtocol: string | undefined;
let capturedRequest: NativeLlmRerankRequest | undefined;
const original = (serverNativeModule as any).llmRerankDispatch;
(serverNativeModule as any).llmRerankDispatch = (
protocol: string,
_backendConfigJson: string,
requestJson: string
) => {
capturedProtocol = protocol;
capturedRequest = JSON.parse(requestJson) as NativeLlmRerankRequest;
return JSON.stringify({ model: 'gpt-5.2', scores: [0.9, 0.1] });
};
t.teardown(() => {
(serverNativeModule as any).llmRerankDispatch = original;
});
const scores = await provider.rerank(
{ modelId: 'gpt-5.2' },
{
query: 'programming',
candidates: [
{ id: 'react', text: 'React is a UI library.' },
{ id: 'weather', text: 'The weather is sunny today.' },
],
}
);
t.deepEqual(scores, [0.9, 0.1]);
t.is(capturedProtocol, 'openai_chat');
t.deepEqual(capturedRequest, {
model: 'gpt-5.2',
query: 'programming',
candidates: [
{ id: 'react', text: 'React is a UI library.' },
{ id: 'weather', text: 'The weather is sunny today.' },
],
});
});

View File

@@ -88,11 +88,11 @@ test('resolveModel should support explicit provider prefix and keep slash models
const prefixed = resolveModel({
registry,
modelId: 'openai-main/gpt-4.1',
modelId: 'openai-main/gpt-5-mini',
});
t.deepEqual(prefixed, {
rawModelId: 'openai-main/gpt-4.1',
modelId: 'gpt-4.1',
rawModelId: 'openai-main/gpt-5-mini',
modelId: 'gpt-5-mini',
explicitProviderId: 'openai-main',
candidateProviderIds: ['openai-main'],
});
@@ -154,12 +154,15 @@ test('stripProviderPrefix should only strip matched provider prefix', t => {
});
t.is(
stripProviderPrefix(registry, 'openai-main', 'openai-main/gpt-4.1'),
'gpt-4.1'
stripProviderPrefix(registry, 'openai-main', 'openai-main/gpt-5-mini'),
'gpt-5-mini'
);
t.is(
stripProviderPrefix(registry, 'openai-main', 'another-main/gpt-4.1'),
'another-main/gpt-4.1'
stripProviderPrefix(registry, 'openai-main', 'another-main/gpt-5-mini'),
'another-main/gpt-5-mini'
);
t.is(
stripProviderPrefix(registry, 'openai-main', 'gpt-5-mini'),
'gpt-5-mini'
);
t.is(stripProviderPrefix(registry, 'openai-main', 'gpt-4.1'), 'gpt-4.1');
});

View File

@@ -34,6 +34,56 @@ test('ToolCallAccumulator should merge deltas and complete tool call', t => {
id: 'call_1',
name: 'doc_read',
args: { doc_id: 'a1' },
rawArgumentsText: '{"doc_id":"a1"}',
thought: undefined,
});
});
test('ToolCallAccumulator should preserve invalid JSON instead of swallowing it', t => {
const accumulator = new ToolCallAccumulator();
accumulator.feedDelta({
type: 'tool_call_delta',
call_id: 'call_1',
name: 'doc_read',
arguments_delta: '{"doc_id":',
});
const pending = accumulator.drainPending();
t.is(pending.length, 1);
t.deepEqual(pending[0]?.id, 'call_1');
t.deepEqual(pending[0]?.name, 'doc_read');
t.deepEqual(pending[0]?.args, {});
t.is(pending[0]?.rawArgumentsText, '{"doc_id":');
t.truthy(pending[0]?.argumentParseError);
});
test('ToolCallAccumulator should prefer native canonical tool arguments metadata', t => {
const accumulator = new ToolCallAccumulator();
accumulator.feedDelta({
type: 'tool_call_delta',
call_id: 'call_1',
name: 'doc_read',
arguments_delta: '{"stale":true}',
});
const completed = accumulator.complete({
type: 'tool_call',
call_id: 'call_1',
name: 'doc_read',
arguments: {},
arguments_text: '{"doc_id":"a1"}',
arguments_error: 'invalid json',
});
t.deepEqual(completed, {
id: 'call_1',
name: 'doc_read',
args: {},
rawArgumentsText: '{"doc_id":"a1"}',
argumentParseError: 'invalid json',
thought: undefined,
});
});
@@ -71,6 +121,8 @@ test('ToolSchemaExtractor should convert zod schema to json schema', t => {
test('ToolCallLoop should execute tool call and continue to next round', async t => {
const dispatchRequests: NativeLlmRequest[] = [];
const originalMessages = [{ role: 'user', content: 'read doc' }] as const;
const signal = new AbortController().signal;
const dispatch = (request: NativeLlmRequest) => {
dispatchRequests.push(request);
@@ -100,13 +152,17 @@ test('ToolCallLoop should execute tool call and continue to next round', async t
};
let executedArgs: Record<string, unknown> | null = null;
let executedMessages: unknown;
let executedSignal: AbortSignal | undefined;
const loop = new ToolCallLoop(
dispatch,
{
doc_read: {
inputSchema: z.object({ doc_id: z.string() }),
execute: async args => {
execute: async (args, options) => {
executedArgs = args;
executedMessages = options.messages;
executedSignal = options.signal;
return { markdown: '# doc' };
},
},
@@ -114,21 +170,119 @@ test('ToolCallLoop should execute tool call and continue to next round', async t
4
);
const events: NativeLlmStreamEvent[] = [];
for await (const event of loop.run(
{
model: 'gpt-5-mini',
stream: true,
messages: [
{ role: 'user', content: [{ type: 'text', text: 'read doc' }] },
],
},
signal,
[...originalMessages]
)) {
events.push(event);
}
t.deepEqual(executedArgs, { doc_id: 'a1' });
t.deepEqual(executedMessages, originalMessages);
t.is(executedSignal, signal);
t.true(
dispatchRequests[1]?.messages.some(message => message.role === 'tool')
);
t.deepEqual(dispatchRequests[1]?.messages[1]?.content, [
{
type: 'tool_call',
call_id: 'call_1',
name: 'doc_read',
arguments: { doc_id: 'a1' },
arguments_text: '{"doc_id":"a1"}',
arguments_error: undefined,
thought: undefined,
},
]);
t.deepEqual(dispatchRequests[1]?.messages[2]?.content, [
{
type: 'tool_result',
call_id: 'call_1',
name: 'doc_read',
arguments: { doc_id: 'a1' },
arguments_text: '{"doc_id":"a1"}',
arguments_error: undefined,
output: { markdown: '# doc' },
is_error: undefined,
},
]);
t.deepEqual(
events.map(event => event.type),
['tool_call', 'tool_result', 'text_delta', 'done']
);
});
test('ToolCallLoop should surface invalid JSON as tool error without executing', async t => {
let executed = false;
let round = 0;
const loop = new ToolCallLoop(
request => {
round += 1;
const hasToolResult = request.messages.some(
message => message.role === 'tool'
);
return (async function* (): AsyncIterableIterator<NativeLlmStreamEvent> {
if (!hasToolResult && round === 1) {
yield {
type: 'tool_call_delta',
call_id: 'call_1',
name: 'doc_read',
arguments_delta: '{"doc_id":',
};
yield { type: 'done', finish_reason: 'tool_calls' };
return;
}
yield { type: 'done', finish_reason: 'stop' };
})();
},
{
doc_read: {
inputSchema: z.object({ doc_id: z.string() }),
execute: async () => {
executed = true;
return { markdown: '# doc' };
},
},
},
2
);
const events: NativeLlmStreamEvent[] = [];
for await (const event of loop.run({
model: 'gpt-4.1',
model: 'gpt-5-mini',
stream: true,
messages: [{ role: 'user', content: [{ type: 'text', text: 'read doc' }] }],
})) {
events.push(event);
}
t.deepEqual(executedArgs, { doc_id: 'a1' });
t.true(
dispatchRequests[1]?.messages.some(message => message.role === 'tool')
);
t.deepEqual(
events.map(event => event.type),
['tool_call', 'tool_result', 'text_delta', 'done']
);
t.false(executed);
t.true(events[0]?.type === 'tool_result');
t.deepEqual(events[0], {
type: 'tool_result',
call_id: 'call_1',
name: 'doc_read',
arguments: {},
arguments_text: '{"doc_id":',
arguments_error:
events[0]?.type === 'tool_result' ? events[0].arguments_error : undefined,
output: {
message: 'Invalid tool arguments JSON',
rawArguments: '{"doc_id":',
error:
events[0]?.type === 'tool_result'
? events[0].arguments_error
: undefined,
},
is_error: true,
});
});
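The accumulate-then-parse strategy these `ToolCallAccumulator` and `ToolCallLoop` tests describe can be sketched as a small pure function: argument deltas are buffered as raw text and parsed once; on invalid JSON the raw text and parse error are preserved instead of being swallowed, so the loop can surface a tool error without executing. Names below are illustrative, not the accumulator's actual API:

```typescript
// Hypothetical parse step: given the accumulated raw argument text,
// return parsed args on success, or empty args plus the raw text and
// error message on failure (mirroring rawArgumentsText/argumentParseError
// in the assertions above).
interface ParsedToolArgs {
  args: Record<string, unknown>;
  rawArgumentsText: string;
  argumentParseError?: string;
}

function parseToolArguments(rawArgumentsText: string): ParsedToolArgs {
  try {
    return { args: JSON.parse(rawArgumentsText), rawArgumentsText };
  } catch (error) {
    return {
      args: {},
      rawArgumentsText,
      argumentParseError:
        error instanceof Error ? error.message : String(error),
    };
  }
}
```

Keeping the raw text alongside the parse error is what lets the loop emit a `tool_result` with `is_error: true` and the original `rawArguments` payload, rather than silently dropping the malformed call.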

View File

@@ -1,12 +1,6 @@
import test from 'ava';
import { z } from 'zod';
import {
chatToGPTMessage,
CitationFootnoteFormatter,
CitationParser,
StreamPatternParser,
} from '../../plugins/copilot/providers/utils';
import { CitationFootnoteFormatter } from '../../plugins/copilot/providers/utils';
test('CitationFootnoteFormatter should format sorted footnotes from citation events', t => {
const formatter = new CitationFootnoteFormatter();
@@ -50,67 +44,3 @@ test('CitationFootnoteFormatter should overwrite duplicated index with latest ur
'[^1]: {"type":"url","url":"https%3A%2F%2Fexample.com%2Fnew"}'
);
});
test('StreamPatternParser should keep state across chunks', t => {
const parser = new StreamPatternParser(pattern => {
if (pattern.kind === 'wrappedLink') {
return `[^${pattern.url}]`;
}
if (pattern.kind === 'index') {
return `[#${pattern.value}]`;
}
return `[${pattern.text}](${pattern.url})`;
});
const first = parser.write('ref ([AFFiNE](https://affine.pro');
const second = parser.write(')) and [2]');
t.is(first, 'ref ');
t.is(second, '[^https://affine.pro] and [#2]');
t.is(parser.end(), '');
});
test('CitationParser should convert wrapped links to numbered footnotes', t => {
const parser = new CitationParser();
const output = parser.parse('Use ([AFFiNE](https://affine.pro)) now');
t.is(output, 'Use [^1] now');
t.regex(
parser.end(),
/\[\^1\]: \{"type":"url","url":"https%3A%2F%2Faffine.pro"\}/
);
});
test('chatToGPTMessage should not mutate input and should keep system schema', async t => {
const schema = z.object({
query: z.string(),
});
const messages = [
{
role: 'system' as const,
content: 'You are helper',
params: { schema },
},
{
role: 'user' as const,
content: '',
attachments: ['https://example.com/a.png'],
},
];
const firstRef = messages[0];
const secondRef = messages[1];
const [system, normalized, parsedSchema] = await chatToGPTMessage(
messages,
false
);
t.is(system, 'You are helper');
t.is(parsedSchema, schema);
t.is(messages.length, 2);
t.is(messages[0], firstRef);
t.is(messages[1], secondRef);
t.deepEqual(normalized[0], {
role: 'user',
content: [{ type: 'text', text: '[no content]' }],
});
});

View File

@@ -33,39 +33,12 @@ export class MockCopilotProvider extends OpenAIProvider {
id: 'test-image',
capabilities: [
{
input: [ModelInputType.Text],
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Image],
defaultForOutputType: true,
},
],
},
{
id: 'gpt-4o',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
],
},
{
id: 'gpt-4o-2024-08-06',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
],
},
{
id: 'gpt-4.1-2025-04-14',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
],
},
{
id: 'gpt-5',
capabilities: [
@@ -97,6 +70,19 @@ export class MockCopilotProvider extends OpenAIProvider {
},
],
},
{
id: 'gpt-5-nano',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
],
},
{
id: 'gpt-image-1',
capabilities: [
@@ -133,6 +119,23 @@ export class MockCopilotProvider extends OpenAIProvider {
},
],
},
{
id: 'gemini-3.1-pro-preview',
capabilities: [
{
input: [
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
],
},
];
override async text(

View File

@@ -1,3 +1,5 @@
import { Logger } from '@nestjs/common';
import { CleanedTelemetryEvent, Scalar } from './cleaner';
const GA4_ENDPOINT = 'https://www.google-analytics.com/mp/collect';
@@ -14,6 +16,7 @@ type Ga4Payload = {
};
export class Ga4Client {
private readonly logger = new Logger(Ga4Client.name);
constructor(
private readonly measurementId: string,
private readonly apiSecret: string,
@@ -45,10 +48,13 @@ export class Ga4Client {
try {
await this.post(payload);
} catch {
if (env.DEPLOYMENT_TYPE === 'affine') {
if (
env.DEPLOYMENT_TYPE === 'affine' &&
env.NODE_ENV === 'production'
) {
// In production, we want to be resilient to GA4 failures, so we catch and ignore errors.
// In non-production environments, we rethrow to surface issues during development and testing.
console.info(
this.logger.log(
'Failed to send telemetry event to GA4:',
chunk.map(e => e.eventName).join(', ')
);
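The tightened gate in this hunk reads as a single predicate: only production AFFiNE cloud deployments swallow GA4 delivery failures, while everywhere else the error propagates so problems surface during development and testing. A standalone sketch of that condition (the function name is illustrative, not part of `Ga4Client`):

```typescript
function shouldSwallowGa4Error(deploymentType: string, nodeEnv: string): boolean {
  // Swallow only when both conditions hold; any other combination rethrows.
  return deploymentType === 'affine' && nodeEnv === 'production';
}
```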

View File

@@ -10,6 +10,7 @@ import {
CopilotSessionNotFound,
} from '../base';
import { getTokenEncoder } from '../native';
import type { PromptAttachment } from '../plugins/copilot/providers/types';
import { BaseModel } from './base';
export enum SessionType {
@@ -24,7 +25,7 @@ type ChatPrompt = {
model: string;
};
type ChatAttachment = { attachment: string; mimeType: string } | string;
type ChatAttachment = PromptAttachment;
type ChatStreamObject = {
type: 'text-delta' | 'reasoning' | 'tool-call' | 'tool-result';
@@ -173,22 +174,105 @@ export class CopilotSessionModel extends BaseModel {
}
return attachments
.map(attachment =>
typeof attachment === 'string'
? (this.sanitizeString(attachment) ?? '')
: {
attachment:
this.sanitizeString(attachment.attachment) ??
attachment.attachment,
.map(attachment => {
if (typeof attachment === 'string') {
return this.sanitizeString(attachment) ?? '';
}
if ('attachment' in attachment) {
return {
attachment:
this.sanitizeString(attachment.attachment) ??
attachment.attachment,
mimeType:
this.sanitizeString(attachment.mimeType) ?? attachment.mimeType,
};
}
switch (attachment.kind) {
case 'url':
return {
...attachment,
url: this.sanitizeString(attachment.url) ?? attachment.url,
mimeType:
this.sanitizeString(attachment.mimeType) ?? attachment.mimeType,
}
)
fileName:
this.sanitizeString(attachment.fileName) ?? attachment.fileName,
providerHint: attachment.providerHint
? {
provider:
this.sanitizeString(attachment.providerHint.provider) ??
attachment.providerHint.provider,
kind:
this.sanitizeString(attachment.providerHint.kind) ??
attachment.providerHint.kind,
}
: undefined,
};
case 'data':
case 'bytes':
return {
...attachment,
data: this.sanitizeString(attachment.data) ?? attachment.data,
mimeType:
this.sanitizeString(attachment.mimeType) ?? attachment.mimeType,
fileName:
this.sanitizeString(attachment.fileName) ?? attachment.fileName,
providerHint: attachment.providerHint
? {
provider:
this.sanitizeString(attachment.providerHint.provider) ??
attachment.providerHint.provider,
kind:
this.sanitizeString(attachment.providerHint.kind) ??
attachment.providerHint.kind,
}
: undefined,
};
case 'file_handle':
return {
...attachment,
fileHandle:
this.sanitizeString(attachment.fileHandle) ??
attachment.fileHandle,
mimeType:
this.sanitizeString(attachment.mimeType) ?? attachment.mimeType,
fileName:
this.sanitizeString(attachment.fileName) ?? attachment.fileName,
providerHint: attachment.providerHint
? {
provider:
this.sanitizeString(attachment.providerHint.provider) ??
attachment.providerHint.provider,
kind:
this.sanitizeString(attachment.providerHint.kind) ??
attachment.providerHint.kind,
}
: undefined,
};
}
return attachment;
})
.filter(attachment => {
if (typeof attachment === 'string') {
return !!attachment;
}
return !!attachment.attachment && !!attachment.mimeType;
if ('attachment' in attachment) {
return !!attachment.attachment && !!attachment.mimeType;
}
switch (attachment.kind) {
case 'url':
return !!attachment.url;
case 'data':
case 'bytes':
return !!attachment.data && !!attachment.mimeType;
case 'file_handle':
return !!attachment.fileHandle;
}
return false;
});
}
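The `providerHint` sanitization above repeats verbatim for the `url`, `data`/`bytes`, and `file_handle` branches; a small helper could factor it out. This is an illustrative sketch — the helper name and signature are assumptions, not part of the actual model code:

```typescript
type ProviderHint = { provider: string; kind: string };

function sanitizeProviderHint(
  hint: ProviderHint | undefined,
  sanitize: (value: string) => string | undefined
): ProviderHint | undefined {
  if (!hint) return undefined;
  return {
    // Fall back to the original value when sanitization yields nothing,
    // mirroring the `?? attachment.providerHint.provider` pattern above.
    provider: sanitize(hint.provider) ?? hint.provider,
    kind: sanitize(hint.kind) ?? hint.kind,
  };
}
```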

View File

@@ -65,6 +65,21 @@ type NativeLlmModule = {
backendConfigJson: string,
requestJson: string
) => string | Promise<string>;
llmStructuredDispatch?: (
protocol: string,
backendConfigJson: string,
requestJson: string
) => string | Promise<string>;
llmEmbeddingDispatch?: (
protocol: string,
backendConfigJson: string,
requestJson: string
) => string | Promise<string>;
llmRerankDispatch?: (
protocol: string,
backendConfigJson: string,
requestJson: string
) => string | Promise<string>;
llmDispatchStream?: (
protocol: string,
backendConfigJson: string,
@@ -79,12 +94,20 @@ const nativeLlmModule = serverNativeModule as typeof serverNativeModule &
export type NativeLlmProtocol =
| 'openai_chat'
| 'openai_responses'
| 'anthropic';
| 'anthropic'
| 'gemini';
export type NativeLlmBackendConfig = {
base_url: string;
auth_token: string;
request_layer?: 'anthropic' | 'chat_completions' | 'responses' | 'vertex';
request_layer?:
| 'anthropic'
| 'chat_completions'
| 'responses'
| 'vertex'
| 'vertex_anthropic'
| 'gemini_api'
| 'gemini_vertex';
headers?: Record<string, string>;
no_streaming?: boolean;
timeout_ms?: number;
@@ -100,6 +123,8 @@ export type NativeLlmCoreContent =
call_id: string;
name: string;
arguments: Record<string, unknown>;
arguments_text?: string;
arguments_error?: string;
thought?: string;
}
| {
@@ -109,8 +134,12 @@ export type NativeLlmCoreContent =
is_error?: boolean;
name?: string;
arguments?: Record<string, unknown>;
arguments_text?: string;
arguments_error?: string;
}
| { type: 'image'; source: Record<string, unknown> | string };
| { type: 'image'; source: Record<string, unknown> | string }
| { type: 'audio'; source: Record<string, unknown> | string }
| { type: 'file'; source: Record<string, unknown> | string };
export type NativeLlmCoreMessage = {
role: NativeLlmCoreRole;
@@ -133,22 +162,54 @@ export type NativeLlmRequest = {
tool_choice?: 'auto' | 'none' | 'required' | { name: string };
include?: string[];
reasoning?: Record<string, unknown>;
response_schema?: Record<string, unknown>;
middleware?: {
request?: Array<
'normalize_messages' | 'clamp_max_tokens' | 'tool_schema_rewrite'
>;
stream?: Array<'stream_event_normalize' | 'citation_indexing'>;
config?: {
no_additional_properties?: boolean;
drop_property_format?: boolean;
drop_property_min_length?: boolean;
drop_array_min_items?: boolean;
drop_array_max_items?: boolean;
additional_properties_policy?: 'preserve' | 'forbid';
property_format_policy?: 'preserve' | 'drop';
property_min_length_policy?: 'preserve' | 'drop';
array_min_items_policy?: 'preserve' | 'drop';
array_max_items_policy?: 'preserve' | 'drop';
max_tokens_cap?: number;
};
};
};
export type NativeLlmStructuredRequest = {
model: string;
messages: NativeLlmCoreMessage[];
schema: Record<string, unknown>;
max_tokens?: number;
temperature?: number;
reasoning?: Record<string, unknown>;
strict?: boolean;
response_mime_type?: string;
middleware?: NativeLlmRequest['middleware'];
};
export type NativeLlmEmbeddingRequest = {
model: string;
inputs: string[];
dimensions?: number;
task_type?: string;
};
export type NativeLlmRerankCandidate = {
id?: string;
text: string;
};
export type NativeLlmRerankRequest = {
model: string;
query: string;
candidates: NativeLlmRerankCandidate[];
top_n?: number;
};
export type NativeLlmDispatchResponse = {
id: string;
model: string;
@@ -159,10 +220,39 @@ export type NativeLlmDispatchResponse = {
total_tokens: number;
cached_tokens?: number;
};
finish_reason: string;
finish_reason:
| 'stop'
| 'length'
| 'tool_calls'
| 'content_filter'
| 'error'
| string;
reasoning_details?: unknown;
};
export type NativeLlmStructuredResponse = {
id: string;
model: string;
output_text: string;
usage: NativeLlmDispatchResponse['usage'];
finish_reason: NativeLlmDispatchResponse['finish_reason'];
reasoning_details?: unknown;
};
export type NativeLlmEmbeddingResponse = {
model: string;
embeddings: number[][];
usage?: {
prompt_tokens: number;
total_tokens: number;
};
};
export type NativeLlmRerankResponse = {
model: string;
scores: number[];
};
export type NativeLlmStreamEvent =
| { type: 'message_start'; id?: string; model?: string }
| { type: 'text_delta'; text: string }
@@ -178,6 +268,8 @@ export type NativeLlmStreamEvent =
call_id: string;
name: string;
arguments: Record<string, unknown>;
arguments_text?: string;
arguments_error?: string;
thought?: string;
}
| {
@@ -187,6 +279,8 @@ export type NativeLlmStreamEvent =
is_error?: boolean;
name?: string;
arguments?: Record<string, unknown>;
arguments_text?: string;
arguments_error?: string;
}
| { type: 'citation'; index: number; url: string }
| {
@@ -200,7 +294,7 @@ export type NativeLlmStreamEvent =
}
| {
type: 'done';
finish_reason?: string;
finish_reason?: NativeLlmDispatchResponse['finish_reason'];
usage?: {
prompt_tokens: number;
completion_tokens: number;
@@ -228,6 +322,57 @@ export async function llmDispatch(
return JSON.parse(responseText) as NativeLlmDispatchResponse;
}
export async function llmStructuredDispatch(
protocol: NativeLlmProtocol,
backendConfig: NativeLlmBackendConfig,
request: NativeLlmStructuredRequest
): Promise<NativeLlmStructuredResponse> {
if (!nativeLlmModule.llmStructuredDispatch) {
throw new Error('native llm structured dispatch is not available');
}
const response = nativeLlmModule.llmStructuredDispatch(
protocol,
JSON.stringify(backendConfig),
JSON.stringify(request)
);
const responseText = await Promise.resolve(response);
return JSON.parse(responseText) as NativeLlmStructuredResponse;
}
export async function llmEmbeddingDispatch(
protocol: NativeLlmProtocol,
backendConfig: NativeLlmBackendConfig,
request: NativeLlmEmbeddingRequest
): Promise<NativeLlmEmbeddingResponse> {
if (!nativeLlmModule.llmEmbeddingDispatch) {
throw new Error('native llm embedding dispatch is not available');
}
const response = nativeLlmModule.llmEmbeddingDispatch(
protocol,
JSON.stringify(backendConfig),
JSON.stringify(request)
);
const responseText = await Promise.resolve(response);
return JSON.parse(responseText) as NativeLlmEmbeddingResponse;
}
export async function llmRerankDispatch(
protocol: NativeLlmProtocol,
backendConfig: NativeLlmBackendConfig,
request: NativeLlmRerankRequest
): Promise<NativeLlmRerankResponse> {
if (!nativeLlmModule.llmRerankDispatch) {
throw new Error('native llm rerank dispatch is not available');
}
const response = nativeLlmModule.llmRerankDispatch(
protocol,
JSON.stringify(backendConfig),
JSON.stringify(request)
);
const responseText = await Promise.resolve(response);
return JSON.parse(responseText) as NativeLlmRerankResponse;
}
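The four dispatch wrappers above share one shape: guard the optional native binding, stringify config and request, await the (possibly synchronous) result, and parse the response JSON. A generic helper could capture that pattern; this sketch is illustrative and not the module's actual API:

```typescript
type NativeDispatchFn = (
  protocol: string,
  backendConfigJson: string,
  requestJson: string
) => string | Promise<string>;

async function dispatchJson<Req, Res>(
  fn: NativeDispatchFn | undefined,
  name: string,
  protocol: string,
  backendConfig: unknown,
  request: Req
): Promise<Res> {
  // Native bindings are optional; fail loudly when the symbol is absent.
  if (!fn) {
    throw new Error(`native llm ${name} is not available`);
  }
  // Promise.resolve normalizes sync string returns and async promises alike.
  const responseText = await Promise.resolve(
    fn(protocol, JSON.stringify(backendConfig), JSON.stringify(request))
  );
  return JSON.parse(responseText) as Res;
}
```

Each concrete wrapper would then reduce to one call, e.g. `dispatchJson(nativeLlmModule.llmRerankDispatch, 'rerank dispatch', protocol, backendConfig, request)`.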
export class NativeStreamAdapter<T> implements AsyncIterableIterator<T> {
readonly #queue: T[] = [];
readonly #waiters: ((result: IteratorResult<T>) => void)[] = [];

View File

@@ -81,7 +81,7 @@ export type CopilotProviderProfile = CopilotProviderProfileCommon &
}[CopilotProviderType];
export type CopilotProviderDefaults = Partial<
Record<ModelOutputType, string>
Record<Exclude<ModelOutputType, ModelOutputType.Rerank>, string>
> & {
fallback?: string;
};
@@ -184,6 +184,7 @@ const CopilotProviderDefaultsShape = z.object({
[ModelOutputType.Object]: z.string().optional(),
[ModelOutputType.Embedding]: z.string().optional(),
[ModelOutputType.Image]: z.string().optional(),
[ModelOutputType.Rerank]: z.string().optional(),
[ModelOutputType.Structured]: z.string().optional(),
fallback: z.string().optional(),
});
@@ -230,9 +231,8 @@ defineModuleConfig('copilot', {
chat: 'gemini-2.5-flash',
embedding: 'gemini-embedding-001',
image: 'gpt-image-1',
rerank: 'gpt-4.1',
coding: 'claude-sonnet-4-5@20250929',
complex_text_generation: 'gpt-4o-2024-08-06',
complex_text_generation: 'gpt-5-mini',
quick_decision_making: 'gpt-5-mini',
quick_text_generation: 'gemini-2.5-flash',
polish_and_summarize: 'gemini-2.5-flash',

View File

@@ -1,21 +1,17 @@
import { Logger } from '@nestjs/common';
import type { ModuleRef } from '@nestjs/core';
import {
Config,
CopilotPromptNotFound,
CopilotProviderNotSupported,
} from '../../../base';
import { Config, CopilotProviderNotSupported } from '../../../base';
import { CopilotFailedToGenerateEmbedding } from '../../../base/error/errors.gen';
import {
ChunkSimilarity,
Embedding,
EMBEDDING_DIMENSIONS,
} from '../../../models';
import { PromptService } from '../prompt/service';
import { CopilotProviderFactory } from '../providers/factory';
import type { CopilotProvider } from '../providers/provider';
import {
type CopilotRerankRequest,
type ModelFullConditions,
ModelInputType,
ModelOutputType,
@@ -23,24 +19,20 @@ import {
import { EmbeddingClient, type ReRankResult } from './types';
const EMBEDDING_MODEL = 'gemini-embedding-001';
const RERANK_PROMPT = 'Rerank results';
const RERANK_MODEL = 'gpt-5.2';
class ProductionEmbeddingClient extends EmbeddingClient {
private readonly logger = new Logger(ProductionEmbeddingClient.name);
constructor(
private readonly config: Config,
private readonly providerFactory: CopilotProviderFactory,
private readonly prompt: PromptService
private readonly providerFactory: CopilotProviderFactory
) {
super();
}
override async configured(): Promise<boolean> {
const embedding = await this.providerFactory.getProvider({
modelId: this.config.copilot?.scenarios?.override_enabled
? this.config.copilot.scenarios.scenarios?.embedding || EMBEDDING_MODEL
: EMBEDDING_MODEL,
modelId: this.getEmbeddingModelId(),
outputType: ModelOutputType.Embedding,
});
const result = Boolean(embedding);
@@ -65,9 +57,15 @@ class ProductionEmbeddingClient extends EmbeddingClient {
return provider;
}
private getEmbeddingModelId() {
return this.config.copilot?.scenarios?.override_enabled
? this.config.copilot.scenarios.scenarios?.embedding || EMBEDDING_MODEL
: EMBEDDING_MODEL;
}
async getEmbeddings(input: string[]): Promise<Embedding[]> {
const provider = await this.getProvider({
modelId: EMBEDDING_MODEL,
modelId: this.getEmbeddingModelId(),
outputType: ModelOutputType.Embedding,
});
this.logger.verbose(
@@ -110,15 +108,22 @@ class ProductionEmbeddingClient extends EmbeddingClient {
): Promise<ReRankResult> {
if (!embeddings.length) return [];
const prompt = await this.prompt.get(RERANK_PROMPT);
if (!prompt) {
throw new CopilotPromptNotFound({ name: RERANK_PROMPT });
}
const provider = await this.getProvider({ modelId: prompt.model });
const provider = await this.getProvider({
modelId: RERANK_MODEL,
outputType: ModelOutputType.Rerank,
});
const rerankRequest: CopilotRerankRequest = {
query,
candidates: embeddings.map((embedding, index) => ({
id: String(index),
text: embedding.content,
})),
};
const ranks = await provider.rerank(
{ modelId: prompt.model },
embeddings.map(e => prompt.finish({ query, doc: e.content })),
{ modelId: RERANK_MODEL },
rerankRequest,
{ signal }
);
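The candidate construction above — mapping each embedding's content to an `{ id, text }` pair keyed by its index — in isolation, as a sketch (the helper name is an assumption):

```typescript
type RerankCandidate = { id: string; text: string };

function toRerankCandidates(contents: readonly string[]): RerankCandidate[] {
  // The array index doubles as a stable string id for mapping scores back.
  return contents.map((text, index) => ({ id: String(index), text }));
}
```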
@@ -171,7 +176,7 @@ class ProductionEmbeddingClient extends EmbeddingClient {
);
try {
// 4.1 mini's context windows large enough to handle all embeddings
// The rerank prompt is expected to handle the full deduped candidate list.
const ranks = await this.getEmbeddingRelevance(
query,
sortedEmbeddings,
@@ -217,9 +222,7 @@ export async function getEmbeddingClient(
const providerFactory = moduleRef.get(CopilotProviderFactory, {
strict: false,
});
const prompt = moduleRef.get(PromptService, { strict: false });
const client = new ProductionEmbeddingClient(config, providerFactory, prompt);
const client = new ProductionEmbeddingClient(config, providerFactory);
if (await client.configured()) {
EMBEDDING_CLIENT = client;
}

View File

@@ -34,7 +34,6 @@ export const Scenario = {
'Remove background',
'Upscale image',
],
rerank: ['Rerank results'],
coding: [
'Apply Updates',
'Code Artifact',
@@ -124,7 +123,7 @@ const workflows: Prompt[] = [
{
name: 'workflow:presentation:step2',
action: 'workflow:presentation:step2',
model: 'gpt-4o-2024-08-06',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -143,7 +142,7 @@ const workflows: Prompt[] = [
{
name: 'workflow:presentation:step4',
action: 'workflow:presentation:step4',
model: 'gpt-4o-2024-08-06',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -187,7 +186,7 @@ const workflows: Prompt[] = [
{
name: 'workflow:brainstorm:step2',
action: 'workflow:brainstorm:step2',
model: 'gpt-4o-2024-08-06',
model: 'gpt-5-mini',
config: {
frequencyPenalty: 0.5,
presencePenalty: 0.5,
@@ -197,7 +196,8 @@ const workflows: Prompt[] = [
messages: [
{
role: 'system',
content: `You are the creator of the mind map. You need to analyze and expand on the input and output it according to the indentation formatting template given below without redundancy.\nBelow is an example of indentation for a mind map, the title and content needs to be removed by text replacement and not retained. Please strictly adhere to the hierarchical indentation of the template and my requirements, bold, headings and other formatting (e.g. #, **) are not allowed, a maximum of five levels of indentation is allowed, and the last node of each node should make a judgment on whether to make a detailed statement or not based on the topic:\nexmaple:\n- {topic}\n - {Level 1}\n - {Level 2}\n - {Level 3}\n - {Level 4}\n - {Level 1}\n - {Level 2}\n - {Level 3}\n - {Level 1}\n - {Level 2}\n - {Level 3}`,
content:
'Use the Markdown nested unordered list syntax without any extra styles or plain text descriptions to analyze and expand the input into a mind map. Regardless of the content, the first-level list should contain only one item, which acts as the root. Each node label must be plain text only. Do not output markdown links, footnotes, citations, URLs, headings, bold text, code fences, or any explanatory text outside the nested list. A maximum of five levels of indentation is allowed.',
},
{
role: 'assistant',
@@ -381,7 +381,11 @@ const textActions: Prompt[] = [
name: 'Transcript audio',
action: 'Transcript audio',
model: 'gemini-2.5-flash',
optionalModels: ['gemini-2.5-flash', 'gemini-2.5-pro'],
optionalModels: [
'gemini-2.5-flash',
'gemini-2.5-pro',
'gemini-3.1-pro-preview',
],
messages: [
{
role: 'system',
@@ -414,25 +418,10 @@ Convert a multi-speaker audio recording into a structured JSON format by transcr
maxRetries: 1,
},
},
{
name: 'Rerank results',
action: 'Rerank results',
model: 'gpt-4.1',
messages: [
{
role: 'system',
content: `Judge whether the Document meets the requirements based on the Query and the Instruct provided. The answer must be "yes" or "no".`,
},
{
role: 'user',
content: `<Instruct>: Given a document search result, determine whether the result is relevant to the query.\n<Query>: {{query}}\n<Document>: {{doc}}`,
},
],
},
{
name: 'Generate a caption',
action: 'Generate a caption',
model: 'gpt-5-mini',
model: 'gemini-2.5-flash',
messages: [
{
role: 'user',
@@ -448,7 +437,7 @@ Convert a multi-speaker audio recording into a structured JSON format by transcr
{
name: 'Conversation Summary',
action: 'Conversation Summary',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -473,7 +462,7 @@ Return only the summary text—no headings, labels, or commentary.`,
{
name: 'Summary',
action: 'Summary',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -504,7 +493,7 @@ You are an assistant helping summarize a document. Use this format, replacing te
{
name: 'Summary as title',
action: 'Summary as title',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -521,7 +510,7 @@ You are an assistant helping summarize a document. Use this format, replacing te
{
name: 'Summary the webpage',
action: 'Summary the webpage',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'user',
@@ -533,7 +522,7 @@ You are an assistant helping summarize a document. Use this format, replacing te
{
name: 'Explain this',
action: 'Explain this',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -576,7 +565,7 @@ A concise paragraph that captures the article's main argument and key conclusion
{
name: 'Explain this image',
action: 'Explain this image',
model: 'gpt-4.1-2025-04-14',
model: 'gemini-2.5-flash',
messages: [
{
role: 'system',
@@ -727,7 +716,7 @@ You are a highly accomplished professional translator, demonstrating profound pr
{
name: 'Summarize the meeting',
action: 'Summarize the meeting',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -752,7 +741,7 @@ You are an assistant helping summarize a document. Use this format, replacing te
{
name: 'Find action for summary',
action: 'Find action for summary',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -774,7 +763,7 @@ You are an assistant helping find actions of meeting summary. Use this format, r
{
name: 'Write an article about this',
action: 'Write an article about this',
model: 'gemini-2.5-flash',
model: 'gemini-2.5-pro',
messages: [
{
role: 'system',
@@ -829,7 +818,7 @@ You are an assistant helping find actions of meeting summary. Use this format, r
{
name: 'Write a twitter about this',
action: 'Write a twitter about this',
model: 'gpt-4.1-2025-04-14',
model: 'gemini-2.5-flash',
messages: [
{
role: 'system',
@@ -915,7 +904,7 @@ You are an assistant helping find actions of meeting summary. Use this format, r
{
name: 'Write a blog post about this',
action: 'Write a blog post about this',
model: 'gemini-2.5-flash',
model: 'gemini-2.5-pro',
messages: [
{
role: 'system',
@@ -1005,7 +994,7 @@ You are an assistant helping find actions of meeting summary. Use this format, r
{
name: 'Change tone to',
action: 'Change tone',
model: 'gpt-4.1-2025-04-14',
model: 'gemini-2.5-flash',
messages: [
{
role: 'system',
@@ -1096,12 +1085,12 @@ You are an assistant helping find actions of meeting summary. Use this format, r
{
name: 'Brainstorm mindmap',
action: 'Brainstorm mindmap',
model: 'gpt-4o-2024-08-06',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
content:
'Use the Markdown nested unordered list syntax without any extra styles or plain text descriptions to brainstorm the questions or topics provided by user for a mind map. Regardless of the content, the first-level list should contain only one item, which acts as the root. Do not wrap everything into a single code block.',
'Use the Markdown nested unordered list syntax without any extra styles or plain text descriptions to brainstorm the questions or topics provided by user for a mind map. Regardless of the content, the first-level list should contain only one item, which acts as the root. Each node label must be plain text only. Do not output markdown links, footnotes, citations, URLs, headings, bold text, code fences, or any explanatory text outside the nested list.',
},
{
role: 'user',
@@ -1113,12 +1102,12 @@ You are an assistant helping find actions of meeting summary. Use this format, r
{
name: 'Expand mind map',
action: 'Expand mind map',
model: 'gpt-4o-2024-08-06',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
content:
'You are a professional writer. Use the Markdown nested unordered list syntax without any extra styles or plain text descriptions to brainstorm the questions or topics provided by user for a mind map.',
'You are a professional writer. Use the Markdown nested unordered list syntax without any extra styles or plain text descriptions to expand the selected node in a mind map. The output must be exactly one subtree: the first bullet must repeat the selected node text as the subtree root, and it must include at least one new nested child bullet beneath it. Each node label must be plain text only. Do not output markdown links, footnotes, citations, URLs, headings, bold text, code fences, or any explanatory text outside the nested list.',
},
{
role: 'user',
@@ -1190,7 +1179,7 @@ The output must be perfect. Adherence to every detail of these instructions is n
{
name: 'Improve grammar for it',
action: 'Improve grammar for it',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -1259,7 +1248,7 @@ The output must be perfect. Adherence to every detail of these instructions is n
{
name: 'Find action items from it',
action: 'Find action items from it',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -1283,7 +1272,7 @@ If there are items in the content that can be used as to-do tasks, please refer
{
name: 'Check code error',
action: 'Check code error',
model: 'gpt-4.1-2025-04-14',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -1343,7 +1332,7 @@ If there are items in the content that can be used as to-do tasks, please refer
{
name: 'Create a presentation',
action: 'Create a presentation',
model: 'gpt-4o-2024-08-06',
model: 'gpt-5-mini',
messages: [
{
role: 'system',
@@ -1518,7 +1507,7 @@ When sent new notes, respond ONLY with the contents of the html file.`,
{
name: 'Continue writing',
action: 'Continue writing',
model: 'gemini-2.5-flash',
model: 'gemini-2.5-pro',
messages: [
{
role: 'system',
@@ -1904,6 +1893,7 @@ const CHAT_PROMPT: Omit<Prompt, 'name'> = {
optionalModels: [
'gemini-2.5-flash',
'gemini-2.5-pro',
'gemini-3.1-pro-preview',
'claude-sonnet-4-5@20250929',
],
messages: [
@@ -2074,7 +2064,11 @@ Below is the user's query. Please respond in the user's preferred language witho
'codeArtifact',
'blobRead',
],
proModels: ['gemini-2.5-pro', 'claude-sonnet-4-5@20250929'],
proModels: [
'gemini-2.5-pro',
'gemini-3.1-pro-preview',
'claude-sonnet-4-5@20250929',
],
},
};

View File

@@ -1,5 +1,3 @@
import type { ToolSet } from 'ai';
import {
CopilotProviderSideError,
metrics,
@@ -11,6 +9,7 @@ import {
type NativeLlmRequest,
} from '../../../../native';
import type { NodeTextMiddleware } from '../../config';
import type { CopilotToolSet } from '../../tools';
import { buildNativeRequest, NativeProviderAdapter } from '../native';
import { CopilotProvider } from '../provider';
import type {
@@ -20,7 +19,11 @@ import type {
StreamObject,
} from '../types';
import { CopilotProviderType, ModelOutputType } from '../types';
import { getGoogleAuth, getVertexAnthropicBaseUrl } from '../utils';
import {
getGoogleAuth,
getVertexAnthropicBaseUrl,
type VertexAnthropicProviderConfig,
} from '../utils';
export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
private handleError(e: any) {
@@ -36,22 +39,16 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
private async createNativeConfig(): Promise<NativeLlmBackendConfig> {
if (this.type === CopilotProviderType.AnthropicVertex) {
const auth = await getGoogleAuth(this.config as any, 'anthropic');
const headers = auth.headers();
const authorization =
headers.Authorization ||
(headers as Record<string, string | undefined>).authorization;
const token =
typeof authorization === 'string'
? authorization.replace(/^Bearer\s+/i, '')
: '';
const baseUrl =
getVertexAnthropicBaseUrl(this.config as any) || auth.baseUrl;
const config = this.config as VertexAnthropicProviderConfig;
const auth = await getGoogleAuth(config, 'anthropic');
const { Authorization: authHeader } = auth.headers();
const token = authHeader.replace(/^Bearer\s+/i, '');
const baseUrl = getVertexAnthropicBaseUrl(config) || auth.baseUrl;
return {
base_url: baseUrl || '',
auth_token: token,
request_layer: 'vertex',
headers,
request_layer: 'vertex_anthropic',
headers: { Authorization: authHeader },
};
}
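The token extraction in this hunk boils down to stripping a case-insensitive `Bearer ` prefix from the `Authorization` header value. As a standalone sketch (the function name is illustrative):

```typescript
function bearerToken(authorization: string): string {
  // Remove a leading "Bearer" plus any whitespace, case-insensitively;
  // strings without the prefix pass through unchanged.
  return authorization.replace(/^Bearer\s+/i, '');
}
```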
@@ -65,7 +62,7 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
private createAdapter(
backendConfig: NativeLlmBackendConfig,
tools: ToolSet,
tools: CopilotToolSet,
nodeTextMiddleware?: NodeTextMiddleware[]
) {
return new NativeProviderAdapter(
@@ -93,8 +90,12 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
options: CopilotChatOptions = {}
): Promise<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
@@ -102,11 +103,13 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const reasoning = this.getReasoning(options, model.id);
const cap = this.getAttachCapability(model, ModelOutputType.Text);
const { request } = await buildNativeRequest({
model: model.id,
messages,
options,
tools,
attachmentCapability: cap,
reasoning,
middleware,
});
@@ -115,7 +118,7 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
tools,
middleware.node?.text
);
return await adapter.text(request, options.signal);
return await adapter.text(request, options.signal, messages);
} catch (e: any) {
metrics.ai
.counter('chat_text_errors')
@@ -130,8 +133,12 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
options: CopilotChatOptions = {}
): AsyncIterable<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
@@ -140,11 +147,13 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
const backendConfig = await this.createNativeConfig();
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Text);
const { request } = await buildNativeRequest({
model: model.id,
messages,
options,
tools,
attachmentCapability: cap,
reasoning: this.getReasoning(options, model.id),
middleware,
});
@@ -153,7 +162,11 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
tools,
middleware.node?.text
);
for await (const chunk of adapter.streamText(request, options.signal)) {
for await (const chunk of adapter.streamText(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {
@@ -170,8 +183,12 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
options: CopilotChatOptions = {}
): AsyncIterable<StreamObject> {
const fullCond = { ...cond, outputType: ModelOutputType.Object };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
@@ -180,11 +197,13 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
const backendConfig = await this.createNativeConfig();
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Object);
const { request } = await buildNativeRequest({
model: model.id,
messages,
options,
tools,
attachmentCapability: cap,
reasoning: this.getReasoning(options, model.id),
middleware,
});
@@ -193,7 +212,11 @@ export abstract class AnthropicProvider<T> extends CopilotProvider<T> {
tools,
middleware.node?.text
);
for await (const chunk of adapter.streamObject(request, options.signal)) {
for await (const chunk of adapter.streamObject(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {


@@ -1,5 +1,6 @@
import z from 'zod';
import { IMAGE_ATTACHMENT_CAPABILITY } from '../attachments';
import { CopilotProviderType, ModelInputType, ModelOutputType } from '../types';
import { AnthropicProvider } from './anthropic';
@@ -23,6 +24,7 @@ export class AnthropicOfficialProvider extends AnthropicProvider<AnthropicOffici
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
attachments: IMAGE_ATTACHMENT_CAPABILITY,
},
],
},
@@ -33,6 +35,7 @@ export class AnthropicOfficialProvider extends AnthropicProvider<AnthropicOffici
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
attachments: IMAGE_ATTACHMENT_CAPABILITY,
},
],
},
@@ -43,6 +46,7 @@ export class AnthropicOfficialProvider extends AnthropicProvider<AnthropicOffici
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
attachments: IMAGE_ATTACHMENT_CAPABILITY,
},
],
},


@@ -1,18 +1,14 @@
import {
createVertexAnthropic,
type GoogleVertexAnthropicProvider,
type GoogleVertexAnthropicProviderSettings,
} from '@ai-sdk/google-vertex/anthropic';
import { IMAGE_ATTACHMENT_CAPABILITY } from '../attachments';
import { CopilotProviderType, ModelInputType, ModelOutputType } from '../types';
import {
getGoogleAuth,
getVertexAnthropicBaseUrl,
VertexModelListSchema,
type VertexProviderConfig,
} from '../utils';
import { AnthropicProvider } from './anthropic';
export type AnthropicVertexConfig = GoogleVertexAnthropicProviderSettings;
export type AnthropicVertexConfig = VertexProviderConfig;
export class AnthropicVertexProvider extends AnthropicProvider<AnthropicVertexConfig> {
override readonly type = CopilotProviderType.AnthropicVertex;
@@ -25,6 +21,7 @@ export class AnthropicVertexProvider extends AnthropicProvider<AnthropicVertexCo
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
attachments: IMAGE_ATTACHMENT_CAPABILITY,
},
],
},
@@ -35,6 +32,7 @@ export class AnthropicVertexProvider extends AnthropicProvider<AnthropicVertexCo
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
attachments: IMAGE_ATTACHMENT_CAPABILITY,
},
],
},
@@ -45,23 +43,17 @@ export class AnthropicVertexProvider extends AnthropicProvider<AnthropicVertexCo
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
attachments: IMAGE_ATTACHMENT_CAPABILITY,
},
],
},
];
protected instance!: GoogleVertexAnthropicProvider;
override configured(): boolean {
if (!this.config.location || !this.config.googleAuthOptions) return false;
return !!this.config.project || !!getVertexAnthropicBaseUrl(this.config);
}
override setup() {
super.setup();
this.instance = createVertexAnthropic(this.config);
}
override async refreshOnlineModels() {
try {
const { baseUrl, headers } = await getGoogleAuth(


@@ -0,0 +1,233 @@
import type {
ModelAttachmentCapability,
PromptAttachment,
PromptAttachmentKind,
PromptAttachmentSourceKind,
PromptMessage,
} from './types';
import { inferMimeType } from './utils';
export const IMAGE_ATTACHMENT_CAPABILITY: ModelAttachmentCapability = {
kinds: ['image'],
sourceKinds: ['url', 'data'],
allowRemoteUrls: true,
};
export const GEMINI_ATTACHMENT_CAPABILITY: ModelAttachmentCapability = {
kinds: ['image', 'audio', 'file'],
sourceKinds: ['url', 'data', 'bytes', 'file_handle'],
allowRemoteUrls: true,
};
export type CanonicalPromptAttachment = {
kind: PromptAttachmentKind;
sourceKind: PromptAttachmentSourceKind;
mediaType?: string;
source: Record<string, unknown>;
isRemote: boolean;
};
function parseDataUrl(url: string) {
if (!url.startsWith('data:')) {
return null;
}
const commaIndex = url.indexOf(',');
if (commaIndex === -1) {
return null;
}
const meta = url.slice(5, commaIndex);
const payload = url.slice(commaIndex + 1);
const parts = meta.split(';');
const mediaType = parts[0] || 'text/plain;charset=US-ASCII';
const isBase64 = parts.includes('base64');
return {
mediaType,
data: isBase64
? payload
: Buffer.from(decodeURIComponent(payload), 'utf8').toString('base64'),
};
}
function attachmentTypeFromMediaType(mediaType: string): PromptAttachmentKind {
if (mediaType.startsWith('image/')) {
return 'image';
}
if (mediaType.startsWith('audio/')) {
return 'audio';
}
return 'file';
}
function attachmentKindFromHintOrMediaType(
hint: PromptAttachmentKind | undefined,
mediaType: string | undefined
): PromptAttachmentKind {
if (hint) return hint;
return attachmentTypeFromMediaType(mediaType || '');
}
function toBase64Data(data: string, encoding: 'base64' | 'utf8' = 'base64') {
return encoding === 'base64'
? data
: Buffer.from(data, 'utf8').toString('base64');
}
function appendAttachMetadata(
source: Record<string, unknown>,
attachment: Exclude<PromptAttachment, string> & Record<string, unknown>
) {
if (attachment.fileName) {
source.file_name = attachment.fileName;
}
if (attachment.providerHint) {
source.provider_hint = attachment.providerHint;
}
return source;
}
export function promptAttachmentHasSource(
attachment: PromptAttachment
): boolean {
if (typeof attachment === 'string') {
return !!attachment.trim();
}
if ('attachment' in attachment) {
return !!attachment.attachment;
}
switch (attachment.kind) {
case 'url':
return !!attachment.url;
case 'data':
case 'bytes':
return !!attachment.data;
case 'file_handle':
return !!attachment.fileHandle;
}
}
export async function canonicalizePromptAttachment(
attachment: PromptAttachment,
message: Pick<PromptMessage, 'params'>
): Promise<CanonicalPromptAttachment> {
const fallbackMimeType =
typeof message.params?.mimetype === 'string'
? message.params.mimetype
: undefined;
if (typeof attachment === 'string') {
const dataUrl = parseDataUrl(attachment);
const mediaType =
fallbackMimeType ??
dataUrl?.mediaType ??
(await inferMimeType(attachment));
const kind = attachmentKindFromHintOrMediaType(undefined, mediaType);
if (dataUrl) {
return {
kind,
sourceKind: 'data',
mediaType,
isRemote: false,
source: {
media_type: mediaType || dataUrl.mediaType,
data: dataUrl.data,
},
};
}
return {
kind,
sourceKind: 'url',
mediaType,
isRemote: /^https?:\/\//.test(attachment),
source: { url: attachment, media_type: mediaType },
};
}
if ('attachment' in attachment) {
return await canonicalizePromptAttachment(
{
kind: 'url',
url: attachment.attachment,
mimeType: attachment.mimeType,
},
message
);
}
if (attachment.kind === 'url') {
const dataUrl = parseDataUrl(attachment.url);
const mediaType =
attachment.mimeType ??
fallbackMimeType ??
dataUrl?.mediaType ??
(await inferMimeType(attachment.url));
const kind = attachmentKindFromHintOrMediaType(
attachment.providerHint?.kind,
mediaType
);
if (dataUrl) {
return {
kind,
sourceKind: 'data',
mediaType,
isRemote: false,
source: appendAttachMetadata(
{ media_type: mediaType || dataUrl.mediaType, data: dataUrl.data },
attachment
),
};
}
return {
kind,
sourceKind: 'url',
mediaType,
isRemote: /^https?:\/\//.test(attachment.url),
source: appendAttachMetadata(
{ url: attachment.url, media_type: mediaType },
attachment
),
};
}
if (attachment.kind === 'data' || attachment.kind === 'bytes') {
return {
kind: attachmentKindFromHintOrMediaType(
attachment.providerHint?.kind,
attachment.mimeType
),
sourceKind: attachment.kind,
mediaType: attachment.mimeType,
isRemote: false,
source: appendAttachMetadata(
{
media_type: attachment.mimeType,
data: toBase64Data(
attachment.data,
attachment.kind === 'data' ? attachment.encoding : 'base64'
),
},
attachment
),
};
}
return {
kind: attachmentKindFromHintOrMediaType(
attachment.providerHint?.kind,
attachment.mimeType
),
sourceKind: 'file_handle',
mediaType: attachment.mimeType,
isRemote: false,
source: appendAttachMetadata(
{ file_handle: attachment.fileHandle, media_type: attachment.mimeType },
attachment
),
};
}
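The new attachments module above normalizes every attachment shape into one canonical record, and its data-URL branch is self-contained enough to sketch on its own. The following is a minimal standalone copy of the `parseDataUrl` logic for illustration (the real module additionally carries kind/source metadata): split `data:<meta>,<payload>`, honor the `;base64` flag, and re-encode non-base64 payloads to base64.

```typescript
// Minimal sketch of the data-URL parsing used by canonicalizePromptAttachment.
function parseDataUrl(url: string): { mediaType: string; data: string } | null {
  if (!url.startsWith('data:')) return null;
  const commaIndex = url.indexOf(',');
  if (commaIndex === -1) return null;
  const meta = url.slice(5, commaIndex); // strip the "data:" prefix
  const payload = url.slice(commaIndex + 1);
  const parts = meta.split(';');
  // RFC 2397 default media type when none is declared
  const mediaType = parts[0] || 'text/plain;charset=US-ASCII';
  const isBase64 = parts.includes('base64');
  return {
    mediaType,
    data: isBase64
      ? payload
      : Buffer.from(decodeURIComponent(payload), 'utf8').toString('base64'),
  };
}
```

Because every downstream source record stores base64, canonicalization never has to track the original payload encoding.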


@@ -19,6 +19,7 @@ import type {
PromptMessage,
} from './types';
import { CopilotProviderType, ModelInputType, ModelOutputType } from './types';
import { promptAttachmentMimeType, promptAttachmentToUrl } from './utils';
export type FalConfig = {
apiKey: string;
@@ -183,13 +184,14 @@ export class FalProvider extends CopilotProvider<FalConfig> {
return {
model_name: options.modelName || undefined,
image_url: attachments
?.map(v =>
typeof v === 'string'
? v
: v.mimeType.startsWith('image/')
? v.attachment
: undefined
)
?.map(v => {
const url = promptAttachmentToUrl(v);
const mediaType = promptAttachmentMimeType(
v,
typeof params?.mimetype === 'string' ? params.mimetype : undefined
);
return url && mediaType?.startsWith('image/') ? url : undefined;
})
.find(v => !!v),
prompt: content.trim(),
loras: lora.length ? lora : undefined,


@@ -1,58 +1,94 @@
import type {
GoogleGenerativeAIProvider,
GoogleGenerativeAIProviderOptions,
} from '@ai-sdk/google';
import type { GoogleVertexProvider } from '@ai-sdk/google-vertex';
import {
AISDKError,
embedMany,
generateObject,
generateText,
JSONParseError,
stepCountIs,
streamText,
} from 'ai';
import { setTimeout as delay } from 'node:timers/promises';
import { ZodError } from 'zod';
import {
CopilotPromptInvalid,
CopilotProviderSideError,
metrics,
OneMB,
readResponseBufferWithLimit,
safeFetch,
UserFriendlyError,
} from '../../../../base';
import { sniffMime } from '../../../../base/storage/providers/utils';
import {
llmDispatchStream,
llmEmbeddingDispatch,
llmStructuredDispatch,
type NativeLlmBackendConfig,
type NativeLlmEmbeddingRequest,
type NativeLlmRequest,
type NativeLlmStructuredRequest,
} from '../../../../native';
import type { NodeTextMiddleware } from '../../config';
import type { CopilotToolSet } from '../../tools';
import {
buildNativeEmbeddingRequest,
buildNativeRequest,
buildNativeStructuredRequest,
NativeProviderAdapter,
parseNativeStructuredOutput,
StructuredResponseParseError,
} from '../native';
import { CopilotProvider } from '../provider';
import type {
CopilotChatOptions,
CopilotEmbeddingOptions,
CopilotImageOptions,
CopilotProviderModel,
CopilotStructuredOptions,
ModelConditions,
PromptAttachment,
PromptMessage,
StreamObject,
} from '../types';
import { ModelOutputType } from '../types';
import {
chatToGPTMessage,
StreamObjectParser,
TextStreamParser,
} from '../utils';
import { promptAttachmentMimeType, promptAttachmentToUrl } from '../utils';
export const DEFAULT_DIMENSIONS = 256;
const GEMINI_REMOTE_ATTACHMENT_MAX_BYTES = 64 * OneMB;
const TRUSTED_ATTACHMENT_HOST_SUFFIXES = ['cdn.affine.pro'];
const GEMINI_RETRY_INITIAL_DELAY_MS = 2_000;
function normalizeMimeType(mediaType?: string) {
return mediaType?.split(';', 1)[0]?.trim() || 'application/octet-stream';
}
function isYoutubeUrl(url: URL) {
const hostname = url.hostname.toLowerCase();
if (hostname === 'youtu.be') {
return /^\/[\w-]+$/.test(url.pathname);
}
if (hostname !== 'youtube.com' && hostname !== 'www.youtube.com') {
return false;
}
if (url.pathname !== '/watch') {
return false;
}
return !!url.searchParams.get('v');
}
function isGeminiFileUrl(url: URL, baseUrl: string) {
try {
const base = new URL(baseUrl);
const basePath = base.pathname.replace(/\/+$/, '');
return (
url.origin === base.origin &&
url.pathname.startsWith(`${basePath}/files/`)
);
} catch {
return false;
}
}
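These URL guards decide which remote attachments get downloaded and inlined versus passed through to Gemini untouched. A standalone copy of the YouTube guard (for illustration; behavior mirrors the function above) shows the accepted shapes: `youtu.be/<id>` short links and `youtube.com/watch?v=<id>` watch pages only.

```typescript
// Standalone sketch of the YouTube URL guard above. Gemini's API can consume
// YouTube links directly, so matching URLs are not downloaded and inlined.
function isYoutubeUrl(url: URL): boolean {
  const hostname = url.hostname.toLowerCase();
  if (hostname === 'youtu.be') {
    // short links: exactly one path segment of word chars or dashes
    return /^\/[\w-]+$/.test(url.pathname);
  }
  if (hostname !== 'youtube.com' && hostname !== 'www.youtube.com') {
    return false;
  }
  if (url.pathname !== '/watch') return false;
  return !!url.searchParams.get('v');
}
```

Playlist, channel, and embed URLs deliberately fail the check and fall through to the normal inlining path.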
export abstract class GeminiProvider<T> extends CopilotProvider<T> {
protected abstract instance:
| GoogleGenerativeAIProvider
| GoogleVertexProvider;
protected abstract createNativeConfig(): Promise<NativeLlmBackendConfig>;
private handleError(e: any) {
if (e instanceof UserFriendlyError) {
return e;
} else if (e instanceof AISDKError) {
this.logger.error('Throw error from ai sdk:', e);
return new CopilotProviderSideError({
provider: this.type,
kind: e.name || 'unknown',
message: e.message,
});
} else {
return new CopilotProviderSideError({
provider: this.type,
@@ -62,37 +98,261 @@ export abstract class GeminiProvider<T> extends CopilotProvider<T> {
}
}
protected createNativeDispatch(backendConfig: NativeLlmBackendConfig) {
return (request: NativeLlmRequest, signal?: AbortSignal) =>
llmDispatchStream('gemini', backendConfig, request, signal);
}
protected createNativeStructuredDispatch(
backendConfig: NativeLlmBackendConfig
) {
return (request: NativeLlmStructuredRequest) =>
llmStructuredDispatch('gemini', backendConfig, request);
}
protected createNativeEmbeddingDispatch(
backendConfig: NativeLlmBackendConfig
) {
return (request: NativeLlmEmbeddingRequest) =>
llmEmbeddingDispatch('gemini', backendConfig, request);
}
protected createNativeAdapter(
backendConfig: NativeLlmBackendConfig,
tools: CopilotToolSet,
nodeTextMiddleware?: NodeTextMiddleware[]
) {
return new NativeProviderAdapter(
this.createNativeDispatch(backendConfig),
tools,
this.MAX_STEPS,
{ nodeTextMiddleware }
);
}
protected async fetchRemoteAttach(url: string, signal?: AbortSignal) {
const parsed = new URL(url);
const response = await safeFetch(
parsed,
{ method: 'GET', signal },
this.buildAttachFetchOptions(parsed)
);
if (!response.ok) {
throw new Error(
`Failed to fetch attachment: ${response.status} ${response.statusText}`
);
}
const buffer = await readResponseBufferWithLimit(
response,
GEMINI_REMOTE_ATTACHMENT_MAX_BYTES
);
const headerMimeType = normalizeMimeType(
response.headers.get('content-type') || ''
);
return {
data: buffer.toString('base64'),
mimeType: normalizeMimeType(sniffMime(buffer, headerMimeType)),
};
}
private buildAttachFetchOptions(url: URL) {
const baseOptions = { timeoutMs: 15_000, maxRedirects: 3 } as const;
if (!env.prod) {
return { ...baseOptions, allowPrivateOrigins: new Set([url.origin]) };
}
const trustedOrigins = new Set<string>();
const protocol = this.AFFiNEConfig.server.https ? 'https:' : 'http:';
const port = this.AFFiNEConfig.server.port;
const isDefaultPort =
(protocol === 'https:' && port === 443) ||
(protocol === 'http:' && port === 80);
const addHostOrigin = (host: string) => {
if (!host) return;
try {
const parsed = new URL(`${protocol}//${host}`);
if (!parsed.port && !isDefaultPort) {
parsed.port = String(port);
}
trustedOrigins.add(parsed.origin);
} catch {
// ignore invalid host config entries
}
};
if (this.AFFiNEConfig.server.externalUrl) {
try {
trustedOrigins.add(
new URL(this.AFFiNEConfig.server.externalUrl).origin
);
} catch {
// ignore invalid external URL
}
}
addHostOrigin(this.AFFiNEConfig.server.host);
for (const host of this.AFFiNEConfig.server.hosts) {
addHostOrigin(host);
}
const hostname = url.hostname.toLowerCase();
const trustedByHost = TRUSTED_ATTACHMENT_HOST_SUFFIXES.some(
suffix => hostname === suffix || hostname.endsWith(`.${suffix}`)
);
if (trustedOrigins.has(url.origin) || trustedByHost) {
return { ...baseOptions, allowPrivateOrigins: new Set([url.origin]) };
}
return baseOptions;
}
private shouldInlineRemoteAttach(url: URL, config: NativeLlmBackendConfig) {
switch (config.request_layer) {
case 'gemini_api':
if (url.protocol !== 'http:' && url.protocol !== 'https:') return false;
return !(isGeminiFileUrl(url, config.base_url) || isYoutubeUrl(url));
case 'gemini_vertex':
return false;
default:
return false;
}
}
private toInlineAttach(
attachment: PromptAttachment,
mimeType: string,
data: string
): PromptAttachment {
if (typeof attachment === 'string' || !('kind' in attachment)) {
return { kind: 'bytes', data, mimeType };
}
if (attachment.kind !== 'url') {
return attachment;
}
return {
kind: 'bytes',
data,
mimeType,
fileName: attachment.fileName,
providerHint: attachment.providerHint,
};
}
protected async prepareMessages(
messages: PromptMessage[],
backendConfig: NativeLlmBackendConfig,
signal?: AbortSignal
): Promise<PromptMessage[]> {
const prepared: PromptMessage[] = [];
for (const message of messages) {
signal?.throwIfAborted();
if (!Array.isArray(message.attachments) || !message.attachments.length) {
prepared.push(message);
continue;
}
const attachments: PromptAttachment[] = [];
let changed = false;
for (const attachment of message.attachments) {
signal?.throwIfAborted();
const rawUrl = promptAttachmentToUrl(attachment);
if (!rawUrl || rawUrl.startsWith('data:')) {
attachments.push(attachment);
continue;
}
let parsed: URL;
try {
parsed = new URL(rawUrl);
} catch {
attachments.push(attachment);
continue;
}
if (!this.shouldInlineRemoteAttach(parsed, backendConfig)) {
attachments.push(attachment);
continue;
}
const declaredMimeType = promptAttachmentMimeType(
attachment,
typeof message.params?.mimetype === 'string'
? message.params.mimetype
: undefined
);
const downloaded = await this.fetchRemoteAttach(rawUrl, signal);
attachments.push(
this.toInlineAttach(
attachment,
declaredMimeType
? normalizeMimeType(declaredMimeType)
: downloaded.mimeType,
downloaded.data
)
);
changed = true;
}
prepared.push(changed ? { ...message, attachments } : message);
}
return prepared;
}
protected async waitForStructuredRetry(
delayMs: number,
signal?: AbortSignal
) {
await delay(delayMs, undefined, signal ? { signal } : undefined);
}
async text(
cond: ModelConditions,
messages: PromptMessage[],
options: CopilotChatOptions = {}
): Promise<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, { model: model.id });
const [system, msgs] = await chatToGPTMessage(messages);
const modelInstance = this.instance(model.id);
const { text } = await generateText({
model: modelInstance,
system,
messages: msgs,
abortSignal: options.signal,
providerOptions: {
google: this.getGeminiOptions(options, model.id),
},
tools: await this.getTools(options, model.id),
stopWhen: stepCountIs(this.MAX_STEPS),
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
const backendConfig = await this.createNativeConfig();
const msg = await this.prepareMessages(
messages,
backendConfig,
options.signal
);
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Text);
const { request } = await buildNativeRequest({
model: model.id,
messages: msg,
options,
tools,
attachmentCapability: cap,
reasoning: this.getReasoning(options, model.id),
middleware,
});
if (!text) throw new Error('Failed to generate text');
return text.trim();
const adapter = this.createNativeAdapter(
backendConfig,
tools,
middleware.node?.text
);
return await adapter.text(request, options.signal, messages);
} catch (e: any) {
metrics.ai.counter('chat_text_errors').add(1, { model: model.id });
metrics.ai
.counter('chat_text_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
@@ -100,55 +360,65 @@ export abstract class GeminiProvider<T> extends CopilotProvider<T> {
override async structure(
cond: ModelConditions,
messages: PromptMessage[],
options: CopilotChatOptions = {}
options: CopilotStructuredOptions = {}
): Promise<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Structured };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, { model: model.id });
const [system, msgs, schema] = await chatToGPTMessage(messages);
if (!schema) {
throw new CopilotPromptInvalid('Schema is required');
}
const modelInstance = this.instance(model.id);
const { object } = await generateObject({
model: modelInstance,
system,
messages: msgs,
schema,
providerOptions: {
google: {
thinkingConfig: {
thinkingBudget: -1,
includeThoughts: false,
},
},
},
abortSignal: options.signal,
maxRetries: options.maxRetries || 3,
experimental_repairText: async ({ text, error }) => {
if (error instanceof JSONParseError) {
// strange fixed response, temporarily replace it
const ret = text.replaceAll(/^ny\n/g, ' ').trim();
if (ret.startsWith('```') || ret.endsWith('```')) {
return ret
.replace(/```[\w\s]+\n/g, '')
.replace(/\n```/g, '')
.trim();
}
return ret;
}
return null;
},
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
const backendConfig = await this.createNativeConfig();
const msg = await this.prepareMessages(
messages,
backendConfig,
options.signal
);
const structuredDispatch =
this.createNativeStructuredDispatch(backendConfig);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Structured);
const { request, schema } = await buildNativeStructuredRequest({
model: model.id,
messages: msg,
options,
attachmentCapability: cap,
reasoning: this.getReasoning(options, model.id),
responseSchema: options.schema,
middleware,
});
return JSON.stringify(object);
const maxRetries = Math.max(options.maxRetries ?? 3, 0);
for (let attempt = 0; ; attempt++) {
try {
const response = await structuredDispatch(request);
const parsed = parseNativeStructuredOutput(response);
const validated = schema.parse(parsed);
return JSON.stringify(validated);
} catch (error) {
const isParsingError =
error instanceof StructuredResponseParseError ||
error instanceof ZodError;
const retryableError =
isParsingError || !(error instanceof UserFriendlyError);
if (!retryableError || attempt >= maxRetries) {
throw error;
}
if (!isParsingError) {
await this.waitForStructuredRetry(
GEMINI_RETRY_INITIAL_DELAY_MS * 2 ** attempt,
options.signal
);
}
}
}
} catch (e: any) {
metrics.ai.counter('chat_text_errors').add(1, { model: model.id });
metrics.ai
.counter('chat_text_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
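The retry loop above distinguishes two failure classes: parse/validation errors (`StructuredResponseParseError`, `ZodError`) retry immediately, while other retryable errors back off exponentially from `GEMINI_RETRY_INITIAL_DELAY_MS`. The backoff arithmetic can be sketched as follows; `retryDelaysMs` is a hypothetical helper introduced here for illustration, not a function in the source.

```typescript
// Sketch of the backoff schedule used by the structured-output retry loop:
// attempt 0 waits 2s, attempt 1 waits 4s, attempt 2 waits 8s, and so on.
const GEMINI_RETRY_INITIAL_DELAY_MS = 2_000;

// Hypothetical helper: one delay per retried (non-parsing) failure.
function retryDelaysMs(maxRetries: number): number[] {
  return Array.from(
    { length: Math.max(maxRetries, 0) },
    (_, attempt) => GEMINI_RETRY_INITIAL_DELAY_MS * 2 ** attempt
  );
}
```

With the default `maxRetries` of 3, a persistently failing upstream call is abandoned after roughly 14 seconds of cumulative waiting.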
@@ -159,29 +429,54 @@ export abstract class GeminiProvider<T> extends CopilotProvider<T> {
options: CopilotChatOptions | CopilotImageOptions = {}
): AsyncIterable<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_stream_calls').add(1, { model: model.id });
const fullStream = await this.getFullStream(model, messages, options);
const parser = new TextStreamParser();
for await (const chunk of fullStream) {
const result = parser.parse(chunk);
yield result;
if (options.signal?.aborted) {
await fullStream.cancel();
break;
}
}
if (!options.signal?.aborted) {
const footnotes = parser.end();
if (footnotes.length) {
yield `\n\n${footnotes}`;
}
metrics.ai
.counter('chat_text_stream_calls')
.add(1, this.metricLabels(model.id));
const backendConfig = await this.createNativeConfig();
const preparedMessages = await this.prepareMessages(
messages,
backendConfig,
options.signal
);
const tools = await this.getTools(
options as CopilotChatOptions,
model.id
);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Text);
const { request } = await buildNativeRequest({
model: model.id,
messages: preparedMessages,
options: options as CopilotChatOptions,
tools,
attachmentCapability: cap,
reasoning: this.getReasoning(options, model.id),
middleware,
});
const adapter = this.createNativeAdapter(
backendConfig,
tools,
middleware.node?.text
);
for await (const chunk of adapter.streamText(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {
metrics.ai.counter('chat_text_stream_errors').add(1, { model: model.id });
metrics.ai
.counter('chat_text_stream_errors')
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
@@ -192,29 +487,51 @@ export abstract class GeminiProvider<T> extends CopilotProvider<T> {
options: CopilotChatOptions = {}
): AsyncIterable<StreamObject> {
const fullCond = { ...cond, outputType: ModelOutputType.Object };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
.counter('chat_object_stream_calls')
.add(1, { model: model.id });
const fullStream = await this.getFullStream(model, messages, options);
const parser = new StreamObjectParser();
for await (const chunk of fullStream) {
const result = parser.parse(chunk);
if (result) {
yield result;
}
if (options.signal?.aborted) {
await fullStream.cancel();
break;
}
.add(1, this.metricLabels(model.id));
const backendConfig = await this.createNativeConfig();
const msg = await this.prepareMessages(
messages,
backendConfig,
options.signal
);
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Object);
const { request } = await buildNativeRequest({
model: model.id,
messages: msg,
options,
tools,
attachmentCapability: cap,
reasoning: this.getReasoning(options, model.id),
middleware,
});
const adapter = this.createNativeAdapter(
backendConfig,
tools,
middleware.node?.text
);
for await (const chunk of adapter.streamObject(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {
metrics.ai
.counter('chat_object_stream_errors')
.add(1, { model: model.id });
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
@@ -224,77 +541,60 @@ export abstract class GeminiProvider<T> extends CopilotProvider<T> {
messages: string | string[],
options: CopilotEmbeddingOptions = { dimensions: DEFAULT_DIMENSIONS }
): Promise<number[][]> {
messages = Array.isArray(messages) ? messages : [messages];
const values = Array.isArray(messages) ? messages : [messages];
const fullCond = { ...cond, outputType: ModelOutputType.Embedding };
await this.checkParams({ embeddings: messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
embeddings: values,
cond: fullCond,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
.counter('generate_embedding_calls')
.add(1, { model: model.id });
const modelInstance = this.instance.textEmbeddingModel(model.id);
const embeddings = await Promise.allSettled(
messages.map(m =>
embedMany({
model: modelInstance,
values: [m],
maxRetries: 3,
providerOptions: {
google: {
outputDimensionality: options.dimensions || DEFAULT_DIMENSIONS,
taskType: 'RETRIEVAL_DOCUMENT',
},
},
})
)
.add(1, this.metricLabels(model.id));
const backendConfig = await this.createNativeConfig();
const response = await this.createNativeEmbeddingDispatch(backendConfig)(
buildNativeEmbeddingRequest({
model: model.id,
inputs: values,
dimensions: options.dimensions || DEFAULT_DIMENSIONS,
taskType: 'RETRIEVAL_DOCUMENT',
})
);
return embeddings
.flatMap(e => (e.status === 'fulfilled' ? e.value.embeddings : null))
.filter((v): v is number[] => !!v && Array.isArray(v));
return response.embeddings;
} catch (e: any) {
metrics.ai
.counter('generate_embedding_errors')
.add(1, { model: model.id });
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
private async getFullStream(
model: CopilotProviderModel,
messages: PromptMessage[],
options: CopilotChatOptions = {}
) {
const [system, msgs] = await chatToGPTMessage(messages);
const { fullStream } = streamText({
model: this.instance(model.id),
system,
messages: msgs,
abortSignal: options.signal,
providerOptions: {
google: this.getGeminiOptions(options, model.id),
},
tools: await this.getTools(options, model.id),
stopWhen: stepCountIs(this.MAX_STEPS),
});
return fullStream;
protected getReasoning(
options: CopilotChatOptions | CopilotImageOptions,
model: string
): Record<string, unknown> | undefined {
if (
options &&
'reasoning' in options &&
options.reasoning &&
this.isReasoningModel(model)
) {
return this.isGemini3Model(model)
? { include_thoughts: true, thinking_level: 'high' }
: { include_thoughts: true, thinking_budget: 12000 };
}
return undefined;
}
private getGeminiOptions(options: CopilotChatOptions, model: string) {
const result: GoogleGenerativeAIProviderOptions = {};
if (options?.reasoning && this.isReasoningModel(model)) {
result.thinkingConfig = {
thinkingBudget: 12000,
includeThoughts: true,
};
}
return result;
private isGemini3Model(model: string) {
return model.startsWith('gemini-3');
}
private isReasoningModel(model: string) {
return model.startsWith('gemini-2.5');
return model.startsWith('gemini-2.5') || this.isGemini3Model(model);
}
}
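The model-family branching at the end of the class can be condensed into one sketch: Gemini 3.x models take a `thinking_level`, while 2.5 models take a token `thinking_budget`, and non-reasoning models get no reasoning options at all. `reasoningOptions` below is a hypothetical standalone helper for illustration; the real `getReasoning` additionally requires `options.reasoning` to be set before returning anything.

```typescript
// Standalone sketch of the reasoning-option selection above (omits the
// options.reasoning precondition checked by the real getReasoning).
function reasoningOptions(model: string): Record<string, unknown> | undefined {
  const isGemini3 = model.startsWith('gemini-3');
  const isReasoning = model.startsWith('gemini-2.5') || isGemini3;
  if (!isReasoning) return undefined;
  return isGemini3
    ? { include_thoughts: true, thinking_level: 'high' }
    : { include_thoughts: true, thinking_budget: 12000 };
}
```

Keying on the `gemini-3` prefix means future 3.x previews pick up the level-based config without further changes.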


@@ -1,9 +1,7 @@
import {
createGoogleGenerativeAI,
type GoogleGenerativeAIProvider,
} from '@ai-sdk/google';
import z from 'zod';
import type { NativeLlmBackendConfig } from '../../../../native';
import { GEMINI_ATTACHMENT_CAPABILITY } from '../attachments';
import { CopilotProviderType, ModelInputType, ModelOutputType } from '../types';
import { GeminiProvider } from './gemini';
@@ -20,25 +18,6 @@ export class GeminiGenerativeProvider extends GeminiProvider<GeminiGenerativeCon
override readonly type = CopilotProviderType.Gemini;
readonly models = [
{
name: 'Gemini 2.0 Flash',
id: 'gemini-2.0-flash-001',
capabilities: [
{
input: [
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
defaultForOutputType: true,
},
],
},
{
name: 'Gemini 2.5 Flash',
id: 'gemini-2.5-flash',
@@ -48,12 +27,15 @@ export class GeminiGenerativeProvider extends GeminiProvider<GeminiGenerativeCon
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
ModelInputType.File,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
attachments: GEMINI_ATTACHMENT_CAPABILITY,
structuredAttachments: GEMINI_ATTACHMENT_CAPABILITY,
},
],
},
@@ -66,12 +48,36 @@ export class GeminiGenerativeProvider extends GeminiProvider<GeminiGenerativeCon
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
ModelInputType.File,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
attachments: GEMINI_ATTACHMENT_CAPABILITY,
structuredAttachments: GEMINI_ATTACHMENT_CAPABILITY,
},
],
},
{
name: 'Gemini 3.1 Pro Preview',
id: 'gemini-3.1-pro-preview',
capabilities: [
{
input: [
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
ModelInputType.File,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
attachments: GEMINI_ATTACHMENT_CAPABILITY,
structuredAttachments: GEMINI_ATTACHMENT_CAPABILITY,
},
],
},
@@ -87,21 +93,10 @@ export class GeminiGenerativeProvider extends GeminiProvider<GeminiGenerativeCon
],
},
];
protected instance!: GoogleGenerativeAIProvider;
override configured(): boolean {
return !!this.config.apiKey;
}
protected override setup() {
super.setup();
this.instance = createGoogleGenerativeAI({
apiKey: this.config.apiKey,
baseURL: this.config.baseURL,
});
}
override async refreshOnlineModels() {
try {
const baseUrl =
@@ -121,4 +116,15 @@ export class GeminiGenerativeProvider extends GeminiProvider<GeminiGenerativeCon
this.logger.error('Failed to fetch available models', e);
}
}
protected override async createNativeConfig(): Promise<NativeLlmBackendConfig> {
return {
base_url: (
this.config.baseURL ||
'https://generativelanguage.googleapis.com/v1beta'
).replace(/\/$/, ''),
auth_token: this.config.apiKey,
request_layer: 'gemini_api',
};
}
}

View File

@@ -1,14 +1,14 @@
import {
createVertex,
type GoogleVertexProvider,
type GoogleVertexProviderSettings,
} from '@ai-sdk/google-vertex';
import type { NativeLlmBackendConfig } from '../../../../native';
import { GEMINI_ATTACHMENT_CAPABILITY } from '../attachments';
import { CopilotProviderType, ModelInputType, ModelOutputType } from '../types';
import { getGoogleAuth, VertexModelListSchema } from '../utils';
import {
getGoogleAuth,
VertexModelListSchema,
type VertexProviderConfig,
} from '../utils';
import { GeminiProvider } from './gemini';
export type GeminiVertexConfig = GoogleVertexProviderSettings;
export type GeminiVertexConfig = VertexProviderConfig;
export class GeminiVertexProvider extends GeminiProvider<GeminiVertexConfig> {
override readonly type = CopilotProviderType.GeminiVertex;
@@ -23,12 +23,15 @@ export class GeminiVertexProvider extends GeminiProvider<GeminiVertexConfig> {
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
ModelInputType.File,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
attachments: GEMINI_ATTACHMENT_CAPABILITY,
structuredAttachments: GEMINI_ATTACHMENT_CAPABILITY,
},
],
},
@@ -41,12 +44,36 @@ export class GeminiVertexProvider extends GeminiProvider<GeminiVertexConfig> {
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
ModelInputType.File,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
attachments: GEMINI_ATTACHMENT_CAPABILITY,
structuredAttachments: GEMINI_ATTACHMENT_CAPABILITY,
},
],
},
{
name: 'Gemini 3.1 Pro Preview',
id: 'gemini-3.1-pro-preview',
capabilities: [
{
input: [
ModelInputType.Text,
ModelInputType.Image,
ModelInputType.Audio,
ModelInputType.File,
],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
attachments: GEMINI_ATTACHMENT_CAPABILITY,
structuredAttachments: GEMINI_ATTACHMENT_CAPABILITY,
},
],
},
@@ -62,21 +89,13 @@ export class GeminiVertexProvider extends GeminiProvider<GeminiVertexConfig> {
],
},
];
protected instance!: GoogleVertexProvider;
override configured(): boolean {
return !!this.config.location && !!this.config.googleAuthOptions;
}
protected override setup() {
super.setup();
this.instance = createVertex(this.config);
}
override async refreshOnlineModels() {
try {
const { baseUrl, headers } = await getGoogleAuth(this.config, 'google');
const { baseUrl, headers } = await this.resolveVertexAuth();
if (baseUrl && !this.onlineModelList.length) {
const { publisherModels } = await fetch(`${baseUrl}/models`, {
headers: headers(),
@@ -91,4 +110,19 @@ export class GeminiVertexProvider extends GeminiProvider<GeminiVertexConfig> {
this.logger.error('Failed to fetch available models', e);
}
}
protected async resolveVertexAuth() {
return await getGoogleAuth(this.config, 'google');
}
protected override async createNativeConfig(): Promise<NativeLlmBackendConfig> {
const auth = await this.resolveVertexAuth();
const { Authorization: authHeader } = auth.headers();
return {
base_url: auth.baseUrl || '',
auth_token: authHeader.replace(/^Bearer\s+/i, ''),
request_layer: 'gemini_vertex',
};
}
}
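The two `createNativeConfig` implementations above each normalize their inputs before handing them to the native backend: the Gemini API path trims a trailing slash from the base URL, and the Vertex path strips the `Bearer ` prefix (case-insensitively) from the resolved auth header. A minimal standalone sketch of those two normalizations, with hypothetical helper names chosen for illustration:

```typescript
// Sketch only — `normalizeBaseUrl` and `toAuthToken` are illustrative names,
// not exports of the diff above; the regexes mirror the ones in the diff.
const normalizeBaseUrl = (url: string): string => url.replace(/\/$/, '');
const toAuthToken = (authHeader: string): string =>
  authHeader.replace(/^Bearer\s+/i, '');
```

Trimming the slash keeps later `${base_url}/models`-style joins from producing `//`, and stripping the scheme prefix lets the native layer re-attach whatever auth framing it needs.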

View File

@@ -1,4 +1,3 @@
import type { ToolSet } from 'ai';
import { z } from 'zod';
import type {
@@ -6,6 +5,11 @@ import type {
NativeLlmStreamEvent,
NativeLlmToolDefinition,
} from '../../../native';
import type {
CopilotTool,
CopilotToolExecuteOptions,
CopilotToolSet,
} from '../tools';
export type NativeDispatchFn = (
request: NativeLlmRequest,
@@ -16,6 +20,8 @@ export type NativeToolCall = {
id: string;
name: string;
args: Record<string, unknown>;
rawArgumentsText?: string;
argumentParseError?: string;
thought?: string;
};
@@ -28,10 +34,18 @@ type ToolExecutionResult = {
callId: string;
name: string;
args: Record<string, unknown>;
rawArgumentsText?: string;
argumentParseError?: string;
output: unknown;
isError?: boolean;
};
type ParsedToolArguments = {
args: Record<string, unknown>;
rawArgumentsText?: string;
argumentParseError?: string;
};
export class ToolCallAccumulator {
readonly #states = new Map<string, ToolCallState>();
@@ -51,12 +65,20 @@ export class ToolCallAccumulator {
complete(event: Extract<NativeLlmStreamEvent, { type: 'tool_call' }>) {
const state = this.#states.get(event.call_id);
this.#states.delete(event.call_id);
const parsed =
event.arguments_text !== undefined || event.arguments_error !== undefined
? {
args: event.arguments ?? {},
rawArgumentsText: event.arguments_text ?? state?.argumentsText,
argumentParseError: event.arguments_error,
}
: event.arguments
? this.parseArgs(event.arguments, state?.argumentsText)
: this.parseJson(state?.argumentsText ?? '{}');
return {
id: event.call_id,
name: event.name || state?.name || '',
args: this.parseArgs(
event.arguments ?? this.parseJson(state?.argumentsText ?? '{}')
),
...parsed,
thought: event.thought,
} satisfies NativeToolCall;
}
@@ -70,51 +92,61 @@ export class ToolCallAccumulator {
pending.push({
id: callId,
name: state.name,
args: this.parseArgs(this.parseJson(state.argumentsText)),
...this.parseJson(state.argumentsText),
});
}
this.#states.clear();
return pending;
}
private parseJson(jsonText: string): unknown {
private parseJson(jsonText: string): ParsedToolArguments {
if (!jsonText.trim()) {
return {};
return { args: {} };
}
try {
return JSON.parse(jsonText);
} catch {
return {};
return this.parseArgs(JSON.parse(jsonText), jsonText);
} catch (error) {
return {
args: {},
rawArgumentsText: jsonText,
argumentParseError:
error instanceof Error
? error.message
: 'Invalid tool arguments JSON',
};
}
}
private parseArgs(value: unknown): Record<string, unknown> {
private parseArgs(
value: unknown,
rawArgumentsText?: string
): ParsedToolArguments {
if (value && typeof value === 'object' && !Array.isArray(value)) {
return value as Record<string, unknown>;
return {
args: value as Record<string, unknown>,
rawArgumentsText,
};
}
return {};
return {
args: {},
rawArgumentsText,
argumentParseError: 'Tool arguments must be a JSON object',
};
}
}
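The key behavioral change in `ToolCallAccumulator` above is that malformed tool-argument JSON no longer collapses silently to `{}`: the raw text and the parse error now travel with the call as `ParsedToolArguments`. A condensed, standalone restatement of that parsing chain (a sketch of the same logic, not the class itself):

```typescript
// Simplified sketch of the tolerant argument parsing introduced above.
// Malformed JSON is preserved as rawArgumentsText plus a parse error,
// so the failure can be reported back instead of vanishing into `{}`.
type ParsedToolArguments = {
  args: Record<string, unknown>;
  rawArgumentsText?: string;
  argumentParseError?: string;
};

function parseToolArguments(jsonText: string): ParsedToolArguments {
  if (!jsonText.trim()) return { args: {} };
  try {
    const value: unknown = JSON.parse(jsonText);
    // Arguments must be a plain JSON object, not an array or scalar.
    if (value && typeof value === 'object' && !Array.isArray(value)) {
      return {
        args: value as Record<string, unknown>,
        rawArgumentsText: jsonText,
      };
    }
    return {
      args: {},
      rawArgumentsText: jsonText,
      argumentParseError: 'Tool arguments must be a JSON object',
    };
  } catch (error) {
    return {
      args: {},
      rawArgumentsText: jsonText,
      argumentParseError:
        error instanceof Error ? error.message : 'Invalid tool arguments JSON',
    };
  }
}
```

Keeping the raw text around is what later lets `executeTool` short-circuit with an `isError` result that quotes the broken payload back to the model.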
export class ToolSchemaExtractor {
static extract(toolSet: ToolSet): NativeLlmToolDefinition[] {
static extract(toolSet: CopilotToolSet): NativeLlmToolDefinition[] {
return Object.entries(toolSet).map(([name, tool]) => {
const unknownTool = tool as Record<string, unknown>;
const inputSchema =
unknownTool.inputSchema ?? unknownTool.parameters ?? z.object({});
return {
name,
description:
typeof unknownTool.description === 'string'
? unknownTool.description
: undefined,
parameters: this.toJsonSchema(inputSchema),
description: tool.description,
parameters: this.toJsonSchema(tool.inputSchema ?? z.object({})),
};
});
}
private static toJsonSchema(schema: unknown): Record<string, unknown> {
static toJsonSchema(schema: unknown): Record<string, unknown> {
if (!(schema instanceof z.ZodType)) {
if (schema && typeof schema === 'object' && !Array.isArray(schema)) {
return schema as Record<string, unknown>;
@@ -228,14 +260,45 @@ export class ToolSchemaExtractor {
export class ToolCallLoop {
constructor(
private readonly dispatch: NativeDispatchFn,
private readonly tools: ToolSet,
private readonly tools: CopilotToolSet,
private readonly maxSteps = 20
) {}
private normalizeToolExecuteOptions(
signalOrOptions?: AbortSignal | CopilotToolExecuteOptions,
maybeMessages?: CopilotToolExecuteOptions['messages']
): CopilotToolExecuteOptions {
if (
signalOrOptions &&
typeof signalOrOptions === 'object' &&
'aborted' in signalOrOptions
) {
return {
signal: signalOrOptions,
messages: maybeMessages,
};
}
if (!signalOrOptions) {
return maybeMessages ? { messages: maybeMessages } : {};
}
return {
...signalOrOptions,
signal: signalOrOptions.signal,
messages: signalOrOptions.messages ?? maybeMessages,
};
}
async *run(
request: NativeLlmRequest,
signal?: AbortSignal
signalOrOptions?: AbortSignal | CopilotToolExecuteOptions,
maybeMessages?: CopilotToolExecuteOptions['messages']
): AsyncIterableIterator<NativeLlmStreamEvent> {
const toolExecuteOptions = this.normalizeToolExecuteOptions(
signalOrOptions,
maybeMessages
);
const messages = request.messages.map(message => ({
...message,
content: [...message.content],
@@ -253,7 +316,7 @@ export class ToolCallLoop {
stream: true,
messages,
},
signal
toolExecuteOptions.signal
)) {
switch (event.type) {
case 'tool_call_delta': {
@@ -291,7 +354,10 @@ export class ToolCallLoop {
throw new Error('ToolCallLoop max steps reached');
}
const toolResults = await this.executeTools(toolCalls);
const toolResults = await this.executeTools(
toolCalls,
toolExecuteOptions
);
messages.push({
role: 'assistant',
@@ -300,6 +366,8 @@ export class ToolCallLoop {
call_id: call.id,
name: call.name,
arguments: call.args,
arguments_text: call.rawArgumentsText,
arguments_error: call.argumentParseError,
thought: call.thought,
})),
});
@@ -311,6 +379,10 @@ export class ToolCallLoop {
{
type: 'tool_result',
call_id: result.callId,
name: result.name,
arguments: result.args,
arguments_text: result.rawArgumentsText,
arguments_error: result.argumentParseError,
output: result.output,
is_error: result.isError,
},
@@ -321,6 +393,8 @@ export class ToolCallLoop {
call_id: result.callId,
name: result.name,
arguments: result.args,
arguments_text: result.rawArgumentsText,
arguments_error: result.argumentParseError,
output: result.output,
is_error: result.isError,
};
@@ -328,24 +402,28 @@ export class ToolCallLoop {
}
}
private async executeTools(calls: NativeToolCall[]) {
return await Promise.all(calls.map(call => this.executeTool(call)));
private async executeTools(
calls: NativeToolCall[],
options: CopilotToolExecuteOptions
) {
return await Promise.all(
calls.map(call => this.executeTool(call, options))
);
}
private async executeTool(
call: NativeToolCall
call: NativeToolCall,
options: CopilotToolExecuteOptions
): Promise<ToolExecutionResult> {
const tool = this.tools[call.name] as
| {
execute?: (args: Record<string, unknown>) => Promise<unknown>;
}
| undefined;
const tool = this.tools[call.name] as CopilotTool | undefined;
if (!tool?.execute) {
return {
callId: call.id,
name: call.name,
args: call.args,
rawArgumentsText: call.rawArgumentsText,
argumentParseError: call.argumentParseError,
isError: true,
output: {
message: `Tool not found: ${call.name}`,
@@ -353,12 +431,30 @@ export class ToolCallLoop {
};
}
try {
const output = await tool.execute(call.args);
if (call.argumentParseError) {
return {
callId: call.id,
name: call.name,
args: call.args,
rawArgumentsText: call.rawArgumentsText,
argumentParseError: call.argumentParseError,
isError: true,
output: {
message: 'Invalid tool arguments JSON',
rawArguments: call.rawArgumentsText,
error: call.argumentParseError,
},
};
}
try {
const output = await tool.execute(call.args, options);
return {
callId: call.id,
name: call.name,
args: call.args,
rawArgumentsText: call.rawArgumentsText,
argumentParseError: call.argumentParseError,
output: output ?? null,
};
} catch (error) {
@@ -371,6 +467,8 @@ export class ToolCallLoop {
callId: call.id,
name: call.name,
args: call.args,
rawArgumentsText: call.rawArgumentsText,
argumentParseError: call.argumentParseError,
isError: true,
output: {
message: 'Tool execution failed',

View File

@@ -1,5 +1,3 @@
import type { ToolSet } from 'ai';
import {
CopilotProviderSideError,
metrics,
@@ -11,6 +9,7 @@ import {
type NativeLlmRequest,
} from '../../../native';
import type { NodeTextMiddleware } from '../config';
import type { CopilotToolSet } from '../tools';
import { buildNativeRequest, NativeProviderAdapter } from './native';
import { CopilotProvider } from './provider';
import type {
@@ -86,7 +85,7 @@ export class MorphProvider extends CopilotProvider<MorphConfig> {
}
private createNativeAdapter(
tools: ToolSet,
tools: CopilotToolSet,
nodeTextMiddleware?: NodeTextMiddleware[]
) {
return new NativeProviderAdapter(
@@ -108,12 +107,14 @@ export class MorphProvider extends CopilotProvider<MorphConfig> {
messages: PromptMessage[],
options: CopilotChatOptions = {}
): Promise<string> {
const fullCond = {
...cond,
outputType: ModelOutputType.Text,
};
await this.checkParams({ messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const fullCond = { ...cond, outputType: ModelOutputType.Text };
const model = this.selectModel(
await this.checkParams({
messages,
cond: fullCond,
options,
})
);
try {
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
@@ -127,7 +128,7 @@ export class MorphProvider extends CopilotProvider<MorphConfig> {
middleware,
});
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
return await adapter.text(request, options.signal);
return await adapter.text(request, options.signal, messages);
} catch (e: any) {
metrics.ai
.counter('chat_text_errors')
@@ -141,12 +142,14 @@ export class MorphProvider extends CopilotProvider<MorphConfig> {
messages: PromptMessage[],
options: CopilotChatOptions = {}
): AsyncIterable<string> {
const fullCond = {
...cond,
outputType: ModelOutputType.Text,
};
await this.checkParams({ messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const fullCond = { ...cond, outputType: ModelOutputType.Text };
const model = this.selectModel(
await this.checkParams({
messages,
cond: fullCond,
options,
})
);
try {
metrics.ai
@@ -162,7 +165,11 @@ export class MorphProvider extends CopilotProvider<MorphConfig> {
middleware,
});
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
for await (const chunk of adapter.streamText(request, options.signal)) {
for await (const chunk of adapter.streamText(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {

View File

@@ -1,31 +1,41 @@
import type { ToolSet } from 'ai';
import { ZodType } from 'zod';
import { CopilotPromptInvalid } from '../../../base';
import type {
NativeLlmCoreContent,
NativeLlmCoreMessage,
NativeLlmEmbeddingRequest,
NativeLlmRequest,
NativeLlmStreamEvent,
NativeLlmStructuredRequest,
NativeLlmStructuredResponse,
} from '../../../native';
import type { NodeTextMiddleware, ProviderMiddlewareConfig } from '../config';
import { NativeDispatchFn, ToolCallLoop, ToolSchemaExtractor } from './loop';
import type { CopilotChatOptions, PromptMessage, StreamObject } from './types';
import type { CopilotToolSet } from '../tools';
import {
CitationFootnoteFormatter,
inferMimeType,
TextStreamParser,
} from './utils';
const SIMPLE_IMAGE_URL_REGEX = /^(https?:\/\/|data:image\/)/;
canonicalizePromptAttachment,
type CanonicalPromptAttachment,
} from './attachments';
import { NativeDispatchFn, ToolCallLoop, ToolSchemaExtractor } from './loop';
import type {
CopilotChatOptions,
CopilotStructuredOptions,
ModelAttachmentCapability,
PromptMessage,
StreamObject,
} from './types';
import { CitationFootnoteFormatter, TextStreamParser } from './utils';
type BuildNativeRequestOptions = {
model: string;
messages: PromptMessage[];
options?: CopilotChatOptions;
tools?: ToolSet;
options?: CopilotChatOptions | CopilotStructuredOptions;
tools?: CopilotToolSet;
withAttachment?: boolean;
attachmentCapability?: ModelAttachmentCapability;
include?: string[];
reasoning?: Record<string, unknown>;
responseSchema?: unknown;
middleware?: ProviderMiddlewareConfig;
};
@@ -34,6 +44,11 @@ type BuildNativeRequestResult = {
schema?: ZodType;
};
type BuildNativeStructuredRequestResult = {
request: NativeLlmStructuredRequest;
schema: ZodType;
};
type ToolCallMeta = {
name: string;
args: Record<string, unknown>;
@@ -68,9 +83,121 @@ function roleToCore(role: PromptMessage['role']) {
}
}
function ensureAttachmentSupported(
attachment: CanonicalPromptAttachment,
attachmentCapability?: ModelAttachmentCapability
) {
if (!attachmentCapability) return;
if (!attachmentCapability.kinds.includes(attachment.kind)) {
throw new CopilotPromptInvalid(
`Native path does not support ${attachment.kind} attachments${
attachment.mediaType ? ` (${attachment.mediaType})` : ''
}`
);
}
if (
attachmentCapability.sourceKinds?.length &&
!attachmentCapability.sourceKinds.includes(attachment.sourceKind)
) {
throw new CopilotPromptInvalid(
`Native path does not support ${attachment.sourceKind} attachment sources`
);
}
if (attachment.isRemote && attachmentCapability.allowRemoteUrls === false) {
throw new CopilotPromptInvalid(
'Native path does not support remote attachment urls'
);
}
}
function resolveResponseSchema(
systemMessage: PromptMessage | undefined,
responseSchema?: unknown
): ZodType | undefined {
if (responseSchema instanceof ZodType) {
return responseSchema;
}
if (systemMessage?.responseFormat?.schema instanceof ZodType) {
return systemMessage.responseFormat.schema;
}
return systemMessage?.params?.schema instanceof ZodType
? systemMessage.params.schema
: undefined;
}
function resolveResponseStrict(
systemMessage: PromptMessage | undefined,
options?: CopilotStructuredOptions
) {
return options?.strict ?? systemMessage?.responseFormat?.strict ?? true;
}
export class StructuredResponseParseError extends Error {}
function normalizeStructuredText(text: string) {
const trimmed = text.replaceAll(/^ny\n/g, ' ').trim();
if (trimmed.startsWith('```') || trimmed.endsWith('```')) {
return trimmed
.replace(/```[\w\s-]*\n/g, '')
.replace(/\n```/g, '')
.trim();
}
return trimmed;
}
export function parseNativeStructuredOutput(
response: Pick<NativeLlmStructuredResponse, 'output_text'> & {
output_json?: unknown;
}
) {
if (response.output_json !== undefined) {
return response.output_json;
}
const normalized = normalizeStructuredText(response.output_text);
const candidates = [
() => normalized,
() => {
const objectStart = normalized.indexOf('{');
const objectEnd = normalized.lastIndexOf('}');
return objectStart !== -1 && objectEnd > objectStart
? normalized.slice(objectStart, objectEnd + 1)
: null;
},
() => {
const arrayStart = normalized.indexOf('[');
const arrayEnd = normalized.lastIndexOf(']');
return arrayStart !== -1 && arrayEnd > arrayStart
? normalized.slice(arrayStart, arrayEnd + 1)
: null;
},
];
for (const candidate of candidates) {
try {
const candidateText = candidate();
if (typeof candidateText === 'string') {
return JSON.parse(candidateText);
}
} catch {
continue;
}
}
throw new StructuredResponseParseError(
`Unexpected structured response: ${normalized.slice(0, 200)}`
);
}
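`parseNativeStructuredOutput` above prefers `output_json` when present, then falls back through a chain of text candidates: the fence-stripped text, the first `{…}` slice, and the first `[…]` slice. A condensed standalone restatement of that fallback chain (same regexes and slicing as the diff, collapsed into one function for illustration):

```typescript
// Condensed sketch of the structured-output fallback chain above.
// Markdown fences are stripped first; if the whole text is not valid
// JSON, the widest object/array slice is tried before giving up.
function extractStructuredJson(text: string): unknown {
  const trimmed = text.trim();
  const unfenced =
    trimmed.startsWith('```') || trimmed.endsWith('```')
      ? trimmed
          .replace(/```[\w\s-]*\n/g, '')
          .replace(/\n```/g, '')
          .trim()
      : trimmed;
  const candidates: string[] = [unfenced];
  const objStart = unfenced.indexOf('{');
  const objEnd = unfenced.lastIndexOf('}');
  if (objStart !== -1 && objEnd > objStart) {
    candidates.push(unfenced.slice(objStart, objEnd + 1));
  }
  const arrStart = unfenced.indexOf('[');
  const arrEnd = unfenced.lastIndexOf(']');
  if (arrStart !== -1 && arrEnd > arrStart) {
    candidates.push(unfenced.slice(arrStart, arrEnd + 1));
  }
  for (const candidate of candidates) {
    try {
      return JSON.parse(candidate);
    } catch {
      continue;
    }
  }
  throw new Error(`Unexpected structured response: ${unfenced.slice(0, 200)}`);
}
```

The slice fallbacks are what let chatty model output like `Here is the result: {"n": 2}` still round-trip into a parsed object.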
async function toCoreContents(
message: PromptMessage,
withAttachment: boolean
withAttachment: boolean,
attachmentCapability?: ModelAttachmentCapability
): Promise<NativeLlmCoreContent[]> {
const contents: NativeLlmCoreContent[] = [];
@@ -81,24 +208,12 @@ async function toCoreContents(
if (!withAttachment || !Array.isArray(message.attachments)) return contents;
for (const entry of message.attachments) {
let attachmentUrl: string;
let mediaType: string;
if (typeof entry === 'string') {
attachmentUrl = entry;
mediaType =
typeof message.params?.mimetype === 'string'
? message.params.mimetype
: await inferMimeType(entry);
} else {
attachmentUrl = entry.attachment;
mediaType = entry.mimeType;
}
if (!SIMPLE_IMAGE_URL_REGEX.test(attachmentUrl)) continue;
if (!mediaType.startsWith('image/')) continue;
contents.push({ type: 'image', source: { url: attachmentUrl } });
const normalized = await canonicalizePromptAttachment(entry, message);
ensureAttachmentSupported(normalized, attachmentCapability);
contents.push({
type: normalized.kind,
source: normalized.source,
});
}
return contents;
@@ -110,8 +225,10 @@ export async function buildNativeRequest({
options = {},
tools = {},
withAttachment = true,
attachmentCapability,
include,
reasoning,
responseSchema,
middleware,
}: BuildNativeRequestOptions): Promise<BuildNativeRequestResult> {
const copiedMessages = messages.map(message => ({
@@ -123,10 +240,7 @@ export async function buildNativeRequest({
const systemMessage =
copiedMessages[0]?.role === 'system' ? copiedMessages.shift() : undefined;
const schema =
systemMessage?.params?.schema instanceof ZodType
? systemMessage.params.schema
: undefined;
const schema = resolveResponseSchema(systemMessage, responseSchema);
const coreMessages: NativeLlmCoreMessage[] = [];
if (systemMessage?.content?.length) {
@@ -138,7 +252,11 @@ export async function buildNativeRequest({
for (const message of copiedMessages) {
if (message.role === 'system') continue;
const content = await toCoreContents(message, withAttachment);
const content = await toCoreContents(
message,
withAttachment,
attachmentCapability
);
coreMessages.push({ role: roleToCore(message.role), content });
}
@@ -153,6 +271,9 @@ export async function buildNativeRequest({
tool_choice: Object.keys(tools).length ? 'auto' : undefined,
include,
reasoning,
response_schema: schema
? ToolSchemaExtractor.toJsonSchema(schema)
: undefined,
middleware: middleware?.rust
? { request: middleware.rust.request, stream: middleware.rust.stream }
: undefined,
@@ -161,6 +282,90 @@ export async function buildNativeRequest({
};
}
export async function buildNativeStructuredRequest({
model,
messages,
options = {},
withAttachment = true,
attachmentCapability,
reasoning,
responseSchema,
middleware,
}: Omit<
BuildNativeRequestOptions,
'tools' | 'include'
>): Promise<BuildNativeStructuredRequestResult> {
const copiedMessages = messages.map(message => ({
...message,
attachments: message.attachments
? [...message.attachments]
: message.attachments,
}));
const systemMessage =
copiedMessages[0]?.role === 'system' ? copiedMessages.shift() : undefined;
const schema = resolveResponseSchema(systemMessage, responseSchema);
const strict = resolveResponseStrict(systemMessage, options);
if (!schema) {
throw new CopilotPromptInvalid('Schema is required');
}
const coreMessages: NativeLlmCoreMessage[] = [];
if (systemMessage?.content?.length) {
coreMessages.push({
role: 'system',
content: [{ type: 'text', text: systemMessage.content }],
});
}
for (const message of copiedMessages) {
if (message.role === 'system') continue;
const content = await toCoreContents(
message,
withAttachment,
attachmentCapability
);
coreMessages.push({ role: roleToCore(message.role), content });
}
return {
request: {
model,
messages: coreMessages,
schema: ToolSchemaExtractor.toJsonSchema(schema),
max_tokens: options.maxTokens ?? undefined,
temperature: options.temperature ?? undefined,
reasoning,
strict,
response_mime_type: 'application/json',
middleware: middleware?.rust
? { request: middleware.rust.request }
: undefined,
},
schema,
};
}
export function buildNativeEmbeddingRequest({
model,
inputs,
dimensions,
taskType = 'RETRIEVAL_DOCUMENT',
}: {
model: string;
inputs: string[];
dimensions?: number;
taskType?: string;
}): NativeLlmEmbeddingRequest {
return {
model,
inputs,
dimensions,
task_type: taskType,
};
}
function ensureToolResultMeta(
event: Extract<NativeLlmStreamEvent, { type: 'tool_result' }>,
toolCalls: Map<string, ToolCallMeta>
@@ -244,7 +449,7 @@ export class NativeProviderAdapter {
constructor(
dispatch: NativeDispatchFn,
tools: ToolSet,
tools: CopilotToolSet,
maxSteps = 20,
options: NativeProviderAdapterOptions = {}
) {
@@ -259,9 +464,13 @@ export class NativeProviderAdapter {
enabledNodeTextMiddlewares.has('citation_footnote');
}
async text(request: NativeLlmRequest, signal?: AbortSignal) {
async text(
request: NativeLlmRequest,
signal?: AbortSignal,
messages?: PromptMessage[]
) {
let output = '';
for await (const chunk of this.streamText(request, signal)) {
for await (const chunk of this.streamText(request, signal, messages)) {
output += chunk;
}
return output.trim();
@@ -269,7 +478,8 @@ export class NativeProviderAdapter {
async *streamText(
request: NativeLlmRequest,
signal?: AbortSignal
signal?: AbortSignal,
messages?: PromptMessage[]
): AsyncIterableIterator<string> {
const textParser = this.#enableCallout ? new TextStreamParser() : null;
const citationFormatter = this.#enableCitationFootnote
@@ -278,7 +488,7 @@ export class NativeProviderAdapter {
const toolCalls = new Map<string, ToolCallMeta>();
let streamPartId = 0;
for await (const event of this.#loop.run(request, signal)) {
for await (const event of this.#loop.run(request, signal, messages)) {
switch (event.type) {
case 'text_delta': {
if (textParser) {
@@ -364,7 +574,8 @@ export class NativeProviderAdapter {
async *streamObject(
request: NativeLlmRequest,
signal?: AbortSignal
signal?: AbortSignal,
messages?: PromptMessage[]
): AsyncIterableIterator<StreamObject> {
const toolCalls = new Map<string, ToolCallMeta>();
const citationFormatter = this.#enableCitationFootnote
@@ -373,7 +584,7 @@ export class NativeProviderAdapter {
const fallbackAttachmentFootnotes = new Map<string, AttachmentFootnote>();
let hasFootnoteReference = false;
for await (const event of this.#loop.run(request, signal)) {
for await (const event of this.#loop.run(request, signal, messages)) {
switch (event.type) {
case 'text_delta': {
if (event.text.includes('[^')) {

View File

@@ -1,4 +1,3 @@
import type { Tool, ToolSet } from 'ai';
import { z } from 'zod';
import {
@@ -12,27 +11,68 @@ import {
} from '../../../base';
import {
llmDispatchStream,
llmEmbeddingDispatch,
llmRerankDispatch,
llmStructuredDispatch,
type NativeLlmBackendConfig,
type NativeLlmEmbeddingRequest,
type NativeLlmRequest,
type NativeLlmRerankRequest,
type NativeLlmRerankResponse,
type NativeLlmStructuredRequest,
} from '../../../native';
import type { NodeTextMiddleware } from '../config';
import { buildNativeRequest, NativeProviderAdapter } from './native';
import type { CopilotTool, CopilotToolSet } from '../tools';
import { IMAGE_ATTACHMENT_CAPABILITY } from './attachments';
import {
buildNativeEmbeddingRequest,
buildNativeRequest,
buildNativeStructuredRequest,
NativeProviderAdapter,
parseNativeStructuredOutput,
} from './native';
import { CopilotProvider } from './provider';
import type {
CopilotChatOptions,
CopilotChatTools,
CopilotEmbeddingOptions,
CopilotImageOptions,
CopilotRerankRequest,
CopilotStructuredOptions,
ModelCapability,
ModelConditions,
PromptMessage,
StreamObject,
} from './types';
import { CopilotProviderType, ModelInputType, ModelOutputType } from './types';
import { chatToGPTMessage } from './utils';
import { promptAttachmentToUrl } from './utils';
export const DEFAULT_DIMENSIONS = 256;
const GPT_5_SAMPLING_UNSUPPORTED_MODELS = /^(gpt-5(?:$|[.-]))/;
export function normalizeOpenAIOptionsForModel<
T extends {
frequencyPenalty?: number | null;
presencePenalty?: number | null;
temperature?: number | null;
topP?: number | null;
},
>(options: T, model: string): T {
if (!GPT_5_SAMPLING_UNSUPPORTED_MODELS.test(model)) {
return options;
}
const normalizedOptions = { ...options };
delete normalizedOptions.frequencyPenalty;
delete normalizedOptions.presencePenalty;
delete normalizedOptions.temperature;
delete normalizedOptions.topP;
return normalizedOptions;
}
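`normalizeOpenAIOptionsForModel` above guards the gpt-5 family, which rejects the classic sampling knobs. A standalone sketch of the same guard, using the diff's regex verbatim (the anchored `gpt-5` prefix followed by end-of-string, `.`, or `-` matches `gpt-5`, `gpt-5-mini`, and `gpt-5.2`, but not `gpt-55`):

```typescript
// Sketch of the GPT-5 sampling guard above; `stripUnsupportedSampling`
// is an illustrative name, the regex is the one from the diff.
const GPT_5_SAMPLING_UNSUPPORTED = /^(gpt-5(?:$|[.-]))/;

function stripUnsupportedSampling<
  T extends {
    frequencyPenalty?: number | null;
    presencePenalty?: number | null;
    temperature?: number | null;
    topP?: number | null;
  },
>(options: T, model: string): T {
  if (!GPT_5_SAMPLING_UNSUPPORTED.test(model)) {
    return options;
  }
  // Drop the knobs the gpt-5 family rejects; other options pass through.
  const normalized = { ...options };
  delete normalized.frequencyPenalty;
  delete normalized.presencePenalty;
  delete normalized.temperature;
  delete normalized.topP;
  return normalized;
}
```

Deleting the keys (rather than setting them to `undefined`) matters if the options object is later serialized into a request body, since `JSON.stringify` would keep explicit `null`s but a deleted key never appears.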
export type OpenAIConfig = {
apiKey: string;
baseURL?: string;
@@ -61,19 +101,6 @@ const ImageResponseSchema = z.union([
}),
}),
]);
const LogProbsSchema = z.array(
z.object({
token: z.string(),
logprob: z.number(),
top_logprobs: z.array(
z.object({
token: z.string(),
logprob: z.number(),
})
),
})
);
const TRUSTED_ATTACHMENT_HOST_SUFFIXES = ['cdn.affine.pro'];
function normalizeImageFormatToMime(format?: string) {
@@ -106,6 +133,34 @@ function normalizeImageResponseData(
.filter((value): value is string => typeof value === 'string');
}
function buildOpenAIRerankRequest(
model: string,
request: CopilotRerankRequest
): NativeLlmRerankRequest {
return {
model,
query: request.query,
candidates: request.candidates.map(candidate => ({
...(candidate.id ? { id: candidate.id } : {}),
text: candidate.text,
})),
...(request.topK ? { top_n: request.topK } : {}),
};
}
function createOpenAIMultimodalCapability(
output: ModelCapability['output'],
options: Pick<ModelCapability, 'defaultForOutputType'> = {}
): ModelCapability {
return {
input: [ModelInputType.Text, ModelInputType.Image],
output,
attachments: IMAGE_ATTACHMENT_CAPABILITY,
structuredAttachments: IMAGE_ATTACHMENT_CAPABILITY,
...options,
};
}
export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
readonly type = CopilotProviderType.OpenAI;
@@ -115,10 +170,10 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
name: 'GPT 4o',
id: 'gpt-4o',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
]),
],
},
// FIXME(@darkskygit): deprecated
@@ -126,20 +181,20 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
name: 'GPT 4o 2024-08-06',
id: 'gpt-4o-2024-08-06',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
]),
],
},
{
name: 'GPT 4o Mini',
id: 'gpt-4o-mini',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
]),
],
},
// FIXME(@darkskygit): deprecated
@@ -147,153 +202,158 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
name: 'GPT 4o Mini 2024-07-18',
id: 'gpt-4o-mini-2024-07-18',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
]),
],
},
{
name: 'GPT 4.1',
id: 'gpt-4.1',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
createOpenAIMultimodalCapability(
[
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Rerank,
ModelOutputType.Structured,
],
defaultForOutputType: true,
},
{ defaultForOutputType: true }
),
],
},
{
name: 'GPT 4.1 2025-04-14',
id: 'gpt-4.1-2025-04-14',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Rerank,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 4.1 Mini',
id: 'gpt-4.1-mini',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Rerank,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 4.1 Nano',
id: 'gpt-4.1-nano',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Rerank,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 5',
id: 'gpt-5',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 5 2025-08-07',
id: 'gpt-5-2025-08-07',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 5 Mini',
id: 'gpt-5-mini',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 5.2',
id: 'gpt-5.2',
capabilities: [
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Rerank,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 5.2 2025-12-11',
id: 'gpt-5.2-2025-12-11',
capabilities: [
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT 5 Nano',
id: 'gpt-5-nano',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
ModelOutputType.Structured,
]),
],
},
{
name: 'GPT O1',
id: 'o1',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
]),
],
},
{
name: 'GPT O3',
id: 'o3',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
]),
],
},
{
name: 'GPT O4 Mini',
id: 'o4-mini',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Text, ModelOutputType.Object],
},
createOpenAIMultimodalCapability([
ModelOutputType.Text,
ModelOutputType.Object,
]),
],
},
// Embedding models
@@ -329,11 +389,9 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
{
id: 'gpt-image-1',
capabilities: [
{
input: [ModelInputType.Text, ModelInputType.Image],
output: [ModelOutputType.Image],
createOpenAIMultimodalCapability([ModelOutputType.Image], {
defaultForOutputType: true,
},
}),
],
},
];
@@ -379,7 +437,7 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
override getProviderSpecificTools(
toolName: CopilotChatTools,
_model: string
): [string, Tool?] | undefined {
): [string, CopilotTool?] | undefined {
if (toolName === 'docEdit') {
return ['doc_edit', undefined];
}
@@ -394,14 +452,18 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
};
}
private getNativeProtocol() {
return this.config.oldApiStyle ? 'openai_chat' : 'openai_responses';
}
private createNativeAdapter(
tools: ToolSet,
tools: CopilotToolSet,
nodeTextMiddleware?: NodeTextMiddleware[]
) {
return new NativeProviderAdapter(
(request: NativeLlmRequest, signal?: AbortSignal) =>
llmDispatchStream(
this.config.oldApiStyle ? 'openai_chat' : 'openai_responses',
this.getNativeProtocol(),
this.createNativeConfig(),
request,
signal
@@ -412,6 +474,27 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
);
}
protected createNativeStructuredDispatch(
backendConfig: NativeLlmBackendConfig
) {
return (request: NativeLlmStructuredRequest) =>
llmStructuredDispatch(this.getNativeProtocol(), backendConfig, request);
}
protected createNativeEmbeddingDispatch(
backendConfig: NativeLlmBackendConfig
) {
return (request: NativeLlmEmbeddingRequest) =>
llmEmbeddingDispatch(this.getNativeProtocol(), backendConfig, request);
}
protected createNativeRerankDispatch(backendConfig: NativeLlmBackendConfig) {
return (
request: NativeLlmRerankRequest
): Promise<NativeLlmRerankResponse> =>
llmRerankDispatch('openai_chat', backendConfig, request);
}
private getReasoning(
options: NonNullable<CopilotChatOptions>,
model: string
@@ -428,24 +511,34 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
options: CopilotChatOptions = {}
): Promise<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
messages,
cond: fullCond,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Text);
const normalizedOptions = normalizeOpenAIOptionsForModel(
options,
model.id
);
const { request } = await buildNativeRequest({
model: model.id,
messages,
options,
options: normalizedOptions,
tools,
attachmentCapability: cap,
include: options.webSearch ? ['citations'] : undefined,
reasoning: this.getReasoning(options, model.id),
middleware,
});
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
return await adapter.text(request, options.signal);
return await adapter.text(request, options.signal, messages);
} catch (e: any) {
metrics.ai
.counter('chat_text_errors')
@@ -463,8 +556,12 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
...cond,
outputType: ModelOutputType.Text,
};
await this.checkParams({ messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
messages,
cond: fullCond,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
@@ -472,17 +569,27 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
.add(1, this.metricLabels(model.id));
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Text);
const normalizedOptions = normalizeOpenAIOptionsForModel(
options,
model.id
);
const { request } = await buildNativeRequest({
model: model.id,
messages,
options,
options: normalizedOptions,
tools,
attachmentCapability: cap,
include: options.webSearch ? ['citations'] : undefined,
reasoning: this.getReasoning(options, model.id),
middleware,
});
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
for await (const chunk of adapter.streamText(request, options.signal)) {
for await (const chunk of adapter.streamText(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {
@@ -499,8 +606,12 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
options: CopilotChatOptions = {}
): AsyncIterable<StreamObject> {
const fullCond = { ...cond, outputType: ModelOutputType.Object };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
@@ -508,17 +619,27 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
.add(1, this.metricLabels(model.id));
const tools = await this.getTools(options, model.id);
const middleware = this.getActiveProviderMiddleware();
const cap = this.getAttachCapability(model, ModelOutputType.Object);
const normalizedOptions = normalizeOpenAIOptionsForModel(
options,
model.id
);
const { request } = await buildNativeRequest({
model: model.id,
messages,
options,
options: normalizedOptions,
tools,
attachmentCapability: cap,
include: options.webSearch ? ['citations'] : undefined,
reasoning: this.getReasoning(options, model.id),
middleware,
});
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
for await (const chunk of adapter.streamObject(request, options.signal)) {
for await (const chunk of adapter.streamObject(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {
@@ -535,27 +656,34 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
options: CopilotStructuredOptions = {}
): Promise<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Structured };
await this.checkParams({ messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
messages,
cond: fullCond,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, { model: model.id });
const tools = await this.getTools(options, model.id);
const backendConfig = this.createNativeConfig();
const middleware = this.getActiveProviderMiddleware();
const { request, schema } = await buildNativeRequest({
const cap = this.getAttachCapability(model, ModelOutputType.Structured);
const normalizedOptions = normalizeOpenAIOptionsForModel(
options,
model.id
);
const { request, schema } = await buildNativeStructuredRequest({
model: model.id,
messages,
options,
tools,
options: normalizedOptions,
attachmentCapability: cap,
reasoning: this.getReasoning(options, model.id),
responseSchema: options.schema,
middleware,
});
if (!schema) {
throw new CopilotPromptInvalid('Schema is required');
}
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
const text = await adapter.text(request, options.signal);
const parsed = JSON.parse(text);
const response =
await this.createNativeStructuredDispatch(backendConfig)(request);
const parsed = parseNativeStructuredOutput(response);
const validated = schema.parse(parsed);
return JSON.stringify(validated);
} catch (e: any) {
@@ -566,65 +694,26 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
override async rerank(
cond: ModelConditions,
chunkMessages: PromptMessage[][],
request: CopilotRerankRequest,
options: CopilotChatOptions = {}
): Promise<number[]> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ messages: [], cond: fullCond, options });
const model = this.selectModel(fullCond);
const fullCond = { ...cond, outputType: ModelOutputType.Rerank };
const normalizedCond = await this.checkParams({
messages: [],
cond: fullCond,
options,
});
const model = this.selectModel(normalizedCond);
const scores = await Promise.all(
chunkMessages.map(async messages => {
const [system, msgs] = await chatToGPTMessage(messages);
const response = await this.requestOpenAIJson(
'/chat/completions',
{
model: model.id,
messages: this.toOpenAIChatMessages(system, msgs),
temperature: 0,
max_tokens: 16,
logprobs: true,
top_logprobs: 16,
},
options.signal
);
const logprobs = response?.choices?.[0]?.logprobs?.content;
if (!Array.isArray(logprobs) || logprobs.length === 0) {
return 0;
}
const parsedLogprobs = LogProbsSchema.parse(logprobs);
const topMap = parsedLogprobs[0].top_logprobs.reduce(
(acc, { token, logprob }) => ({ ...acc, [token]: logprob }),
{} as Record<string, number>
);
const findLogProb = (token: string): number => {
// OpenAI often includes a leading space, so try matching '.yes', '_yes', ' yes' and 'yes'
return [...'_:. "-\t,(=_“'.split('').map(c => c + token), token]
.flatMap(v => [v, v.toLowerCase(), v.toUpperCase()])
.reduce<number>(
(best, key) =>
(topMap[key] ?? Number.NEGATIVE_INFINITY) > best
? topMap[key]
: best,
Number.NEGATIVE_INFINITY
);
};
const logYes = findLogProb('Yes');
const logNo = findLogProb('No');
const pYes = Math.exp(logYes);
const pNo = Math.exp(logNo);
const prob = pYes + pNo === 0 ? 0 : pYes / (pYes + pNo);
return prob;
})
);
return scores;
try {
const backendConfig = this.createNativeConfig();
const nativeRequest = buildOpenAIRerankRequest(model.id, request);
const response =
await this.createNativeRerankDispatch(backendConfig)(nativeRequest);
return response.scores;
} catch (e: any) {
throw this.handleError(e);
}
}
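Reviewer note: the removed rerank path above scored each chunk by reading the model's `'Yes'`/`'No'` token log-probabilities and normalizing them. A minimal standalone sketch of that normalization step (the helper name is hypothetical, not part of this diff):

```typescript
// Convert 'Yes'/'No' log-probabilities into a relevance score in [0, 1],
// as the removed logprob-based rerank implementation did.
// A missing token is represented by -Infinity (exp(-Infinity) === 0).
function yesNoScore(logYes: number, logNo: number): number {
  const pYes = Math.exp(logYes);
  const pNo = Math.exp(logNo);
  // If neither token appears in top_logprobs, fall back to 0.
  return pYes + pNo === 0 ? 0 : pYes / (pYes + pNo);
}
```

Equal log-probabilities yield 0.5, and when both tokens are absent the score degrades to 0, matching the old code's fallback. The new code replaces this with a native `llmRerankDispatch` call returning `response.scores` directly.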
// ====== text to image ======
@@ -826,7 +915,8 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
form.set('output_format', outputFormat);
for (const [idx, entry] of attachments.entries()) {
const url = typeof entry === 'string' ? entry : entry.attachment;
const url = promptAttachmentToUrl(entry);
if (!url) continue;
try {
const attachment = await this.fetchImage(url, maxBytes, signal);
if (!attachment) continue;
@@ -884,8 +974,12 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
options: CopilotImageOptions = {}
) {
const fullCond = { ...cond, outputType: ModelOutputType.Image };
await this.checkParams({ messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
messages,
cond: fullCond,
options,
});
const model = this.selectModel(normalizedCond);
metrics.ai
.counter('generate_images_stream_calls')
@@ -937,65 +1031,36 @@ export class OpenAIProvider extends CopilotProvider<OpenAIConfig> {
messages: string | string[],
options: CopilotEmbeddingOptions = { dimensions: DEFAULT_DIMENSIONS }
): Promise<number[][]> {
messages = Array.isArray(messages) ? messages : [messages];
const input = Array.isArray(messages) ? messages : [messages];
const fullCond = { ...cond, outputType: ModelOutputType.Embedding };
await this.checkParams({ embeddings: messages, cond: fullCond, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
embeddings: input,
cond: fullCond,
options,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
.counter('generate_embedding_calls')
.add(1, { model: model.id });
const response = await this.requestOpenAIJson('/embeddings', {
model: model.id,
input: messages,
dimensions: options.dimensions || DEFAULT_DIMENSIONS,
});
const data = Array.isArray(response?.data) ? response.data : [];
return data
.map((item: any) => item?.embedding)
.filter((embedding: unknown) => Array.isArray(embedding)) as number[][];
.add(1, this.metricLabels(model.id));
const backendConfig = this.createNativeConfig();
const response = await this.createNativeEmbeddingDispatch(backendConfig)(
buildNativeEmbeddingRequest({
model: model.id,
inputs: input,
dimensions: options.dimensions || DEFAULT_DIMENSIONS,
})
);
return response.embeddings;
} catch (e: any) {
metrics.ai
.counter('generate_embedding_errors')
.add(1, { model: model.id });
.add(1, this.metricLabels(model.id));
throw this.handleError(e);
}
}
private toOpenAIChatMessages(
system: string | undefined,
messages: Awaited<ReturnType<typeof chatToGPTMessage>>[1]
) {
const result: Array<{ role: string; content: string }> = [];
if (system) {
result.push({ role: 'system', content: system });
}
for (const message of messages) {
if (typeof message.content === 'string') {
result.push({ role: message.role, content: message.content });
continue;
}
const text = message.content
.filter(
part =>
part &&
typeof part === 'object' &&
'type' in part &&
part.type === 'text' &&
'text' in part
)
.map(part => String((part as { text: string }).text))
.join('\n');
result.push({ role: message.role, content: text || '[no content]' });
}
return result;
}
private async requestOpenAIJson(
path: string,
body: Record<string, unknown>,


@@ -1,5 +1,3 @@
import type { ToolSet } from 'ai';
import { CopilotProviderSideError, metrics } from '../../../base';
import {
llmDispatchStream,
@@ -7,6 +5,7 @@ import {
type NativeLlmRequest,
} from '../../../native';
import type { NodeTextMiddleware } from '../config';
import type { CopilotToolSet } from '../tools';
import { buildNativeRequest, NativeProviderAdapter } from './native';
import { CopilotProvider } from './provider';
import {
@@ -87,7 +86,7 @@ export class PerplexityProvider extends CopilotProvider<PerplexityConfig> {
}
private createNativeAdapter(
tools: ToolSet,
tools: CopilotToolSet,
nodeTextMiddleware?: NodeTextMiddleware[]
) {
return new NativeProviderAdapter(
@@ -110,8 +109,13 @@ export class PerplexityProvider extends CopilotProvider<PerplexityConfig> {
options: CopilotChatOptions = {}
): Promise<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
withAttachment: false,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai.counter('chat_text_calls').add(1, this.metricLabels(model.id));
@@ -128,7 +132,7 @@ export class PerplexityProvider extends CopilotProvider<PerplexityConfig> {
middleware,
});
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
return await adapter.text(request, options.signal);
return await adapter.text(request, options.signal, messages);
} catch (e: any) {
metrics.ai
.counter('chat_text_errors')
@@ -143,8 +147,13 @@ export class PerplexityProvider extends CopilotProvider<PerplexityConfig> {
options: CopilotChatOptions = {}
): AsyncIterable<string> {
const fullCond = { ...cond, outputType: ModelOutputType.Text };
await this.checkParams({ cond: fullCond, messages, options });
const model = this.selectModel(fullCond);
const normalizedCond = await this.checkParams({
cond: fullCond,
messages,
options,
withAttachment: false,
});
const model = this.selectModel(normalizedCond);
try {
metrics.ai
@@ -163,7 +172,11 @@ export class PerplexityProvider extends CopilotProvider<PerplexityConfig> {
middleware,
});
const adapter = this.createNativeAdapter(tools, middleware.node?.text);
for await (const chunk of adapter.streamText(request, options.signal)) {
for await (const chunk of adapter.streamText(
request,
options.signal,
messages
)) {
yield chunk;
}
} catch (e: any) {


@@ -51,13 +51,21 @@ const DEFAULT_MIDDLEWARE_BY_TYPE: Record<
},
},
[CopilotProviderType.Gemini]: {
rust: {
request: ['normalize_messages', 'tool_schema_rewrite'],
stream: ['stream_event_normalize', 'citation_indexing'],
},
node: {
text: ['callout'],
text: ['citation_footnote', 'callout'],
},
},
[CopilotProviderType.GeminiVertex]: {
rust: {
request: ['normalize_messages', 'tool_schema_rewrite'],
stream: ['stream_event_normalize', 'citation_indexing'],
},
node: {
text: ['callout'],
text: ['citation_footnote', 'callout'],
},
},
[CopilotProviderType.FAL]: {},


@@ -5,7 +5,7 @@ import type {
ProviderMiddlewareConfig,
} from '../config';
import { resolveProviderMiddleware } from './provider-middleware';
import { CopilotProviderType, type ModelOutputType } from './types';
import { CopilotProviderType, ModelOutputType } from './types';
const PROVIDER_ID_PATTERN = /^[a-zA-Z0-9-_]+$/;
@@ -239,8 +239,13 @@ export function resolveModel({
};
}
const defaultProviderId =
outputType && outputType !== ModelOutputType.Rerank
? registry.defaults[outputType]
: undefined;
const fallbackOrder = [
...(outputType ? [registry.defaults[outputType]] : []),
...(defaultProviderId ? [defaultProviderId] : []),
registry.defaults.fallback,
...registry.order,
].filter((id): id is string => !!id);


@@ -2,7 +2,6 @@ import { AsyncLocalStorage } from 'node:async_hooks';
import { Inject, Injectable, Logger } from '@nestjs/common';
import { ModuleRef } from '@nestjs/core';
import { Tool, ToolSet } from 'ai';
import { z } from 'zod';
import {
@@ -27,6 +26,8 @@ import {
buildDocSearchGetter,
buildDocUpdateHandler,
buildDocUpdateMetaHandler,
type CopilotTool,
type CopilotToolSet,
createBlobReadTool,
createCodeArtifactTool,
createConversationSummaryTool,
@@ -42,6 +43,7 @@ import {
createExaSearchTool,
createSectionEditTool,
} from '../tools';
import { canonicalizePromptAttachment } from './attachments';
import { CopilotProviderFactory } from './factory';
import { resolveProviderMiddleware } from './provider-middleware';
import { buildProviderRegistry } from './provider-registry';
@@ -52,12 +54,17 @@ import {
type CopilotImageOptions,
CopilotProviderModel,
CopilotProviderType,
type CopilotRerankRequest,
CopilotStructuredOptions,
EmbeddingMessage,
type ModelAttachmentCapability,
ModelCapability,
ModelConditions,
ModelFullConditions,
ModelInputType,
ModelOutputType,
type PromptAttachmentKind,
type PromptAttachmentSourceKind,
type PromptMessage,
PromptMessageSchema,
StreamObject,
@@ -163,6 +170,163 @@ export abstract class CopilotProvider<C = any> {
async refreshOnlineModels() {}
private unique<T>(values: Iterable<T>) {
return Array.from(new Set(values));
}
private attachmentKindToInputType(
kind: PromptAttachmentKind
): ModelInputType {
switch (kind) {
case 'image':
return ModelInputType.Image;
case 'audio':
return ModelInputType.Audio;
default:
return ModelInputType.File;
}
}
protected async inferModelConditionsFromMessages(
messages?: PromptMessage[],
withAttachment = true
): Promise<Partial<ModelFullConditions>> {
if (!messages?.length || !withAttachment) return {};
const attachmentKinds: PromptAttachmentKind[] = [];
const attachmentSourceKinds: PromptAttachmentSourceKind[] = [];
const inputTypes: ModelInputType[] = [];
let hasRemoteAttachments = false;
for (const message of messages) {
if (!Array.isArray(message.attachments)) continue;
for (const attachment of message.attachments) {
const normalized = await canonicalizePromptAttachment(
attachment,
message
);
attachmentKinds.push(normalized.kind);
inputTypes.push(this.attachmentKindToInputType(normalized.kind));
attachmentSourceKinds.push(normalized.sourceKind);
hasRemoteAttachments = hasRemoteAttachments || normalized.isRemote;
}
}
return {
...(attachmentKinds.length
? { attachmentKinds: this.unique(attachmentKinds) }
: {}),
...(attachmentSourceKinds.length
? { attachmentSourceKinds: this.unique(attachmentSourceKinds) }
: {}),
...(inputTypes.length ? { inputTypes: this.unique(inputTypes) } : {}),
...(hasRemoteAttachments ? { hasRemoteAttachments } : {}),
};
}
private mergeModelConditions(
cond: ModelFullConditions,
inferredCond: Partial<ModelFullConditions>
): ModelFullConditions {
return {
...inferredCond,
...cond,
inputTypes: this.unique([
...(inferredCond.inputTypes ?? []),
...(cond.inputTypes ?? []),
]),
attachmentKinds: this.unique([
...(inferredCond.attachmentKinds ?? []),
...(cond.attachmentKinds ?? []),
]),
attachmentSourceKinds: this.unique([
...(inferredCond.attachmentSourceKinds ?? []),
...(cond.attachmentSourceKinds ?? []),
]),
hasRemoteAttachments:
cond.hasRemoteAttachments ?? inferredCond.hasRemoteAttachments,
};
}
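Reviewer note: `mergeModelConditions` above follows a simple rule — explicit conditions win over inferred ones, and array-valued fields are unioned with duplicates removed. A reduced sketch of that array-merge pattern (helper names are hypothetical):

```typescript
// Deduplicate while preserving insertion order, as the provider's
// private unique() helper does.
function unique<T>(values: Iterable<T>): T[] {
  return Array.from(new Set(values));
}

// Union inferred and explicit condition arrays; explicit entries are
// appended after inferred ones, then deduplicated.
function mergeConditionArrays<T>(inferred: T[] = [], explicit: T[] = []): T[] {
  return unique([...inferred, ...explicit]);
}
```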
protected getAttachCapability(
model: CopilotProviderModel,
outputType: ModelOutputType
): ModelAttachmentCapability | undefined {
const capability =
model.capabilities.find(cap => cap.output.includes(outputType)) ??
model.capabilities[0];
if (!capability) {
return;
}
return this.resolveAttachmentCapability(capability, outputType);
}
private resolveAttachmentCapability(
cap: ModelCapability,
outputType?: ModelOutputType
): ModelAttachmentCapability | undefined {
if (outputType === ModelOutputType.Structured) {
return cap.structuredAttachments ?? cap.attachments;
}
return cap.attachments;
}
private matchesAttachCapability(
cap: ModelCapability,
cond: ModelFullConditions
) {
const {
attachmentKinds,
attachmentSourceKinds,
hasRemoteAttachments,
outputType,
} = cond;
if (
!attachmentKinds?.length &&
!attachmentSourceKinds?.length &&
!hasRemoteAttachments
) {
return true;
}
const attachmentCapability = this.resolveAttachmentCapability(
cap,
outputType
);
if (!attachmentCapability) {
return !attachmentKinds?.some(
kind => !cap.input.includes(this.attachmentKindToInputType(kind))
);
}
if (
attachmentKinds?.some(kind => !attachmentCapability.kinds.includes(kind))
) {
return false;
}
if (
attachmentSourceKinds?.length &&
attachmentCapability.sourceKinds?.length &&
attachmentSourceKinds.some(
kind => !attachmentCapability.sourceKinds?.includes(kind)
)
) {
return false;
}
if (
hasRemoteAttachments &&
attachmentCapability.allowRemoteUrls === false
) {
return false;
}
return true;
}
private findValidModel(
cond: ModelFullConditions
): CopilotProviderModel | undefined {
@@ -170,7 +334,8 @@ export abstract class CopilotProvider<C = any> {
const matcher = (cap: ModelCapability) =>
(!outputType || cap.output.includes(outputType)) &&
(!inputTypes?.length ||
inputTypes.every(type => cap.input.includes(type)));
inputTypes.every(type => cap.input.includes(type))) &&
this.matchesAttachCapability(cap, cond);
if (modelId) {
const hasOnlineModel = this.onlineModelList.includes(modelId);
@@ -213,7 +378,7 @@ export abstract class CopilotProvider<C = any> {
protected getProviderSpecificTools(
_toolName: CopilotChatTools,
_model: string
): [string, Tool?] | undefined {
): [string, CopilotTool?] | undefined {
return;
}
@@ -221,8 +386,8 @@ export abstract class CopilotProvider<C = any> {
protected async getTools(
options: CopilotChatOptions,
model: string
): Promise<ToolSet> {
const tools: ToolSet = {};
): Promise<CopilotToolSet> {
const tools: CopilotToolSet = {};
if (options?.tools?.length) {
this.logger.debug(`getTools: ${JSON.stringify(options.tools)}`);
const ac = this.moduleRef.get(AccessController, { strict: false });
@@ -377,19 +542,14 @@ export abstract class CopilotProvider<C = any> {
messages,
embeddings,
options = {},
withAttachment = true,
}: {
cond: ModelFullConditions;
messages?: PromptMessage[];
embeddings?: string[];
options?: CopilotChatOptions;
}) {
const model = this.selectModel(cond);
const multimodal = model.capabilities.some(c =>
[ModelInputType.Image, ModelInputType.Audio].some(t =>
c.input.includes(t)
)
);
options?: CopilotChatOptions | CopilotStructuredOptions;
withAttachment?: boolean;
}): Promise<ModelFullConditions> {
if (messages) {
const { requireContent = true, requireAttachment = false } = options;
@@ -402,20 +562,56 @@ export abstract class CopilotProvider<C = any> {
})
.passthrough()
.catchall(z.union([z.string(), z.number(), z.date(), z.null()]))
.refine(
m =>
!(multimodal && requireAttachment && m.role === 'user') ||
(m.attachments ? m.attachments.length > 0 : true),
{ message: 'attachments required in multimodal mode' }
)
)
.optional();
this.handleZodError(MessageSchema.safeParse(messages));
const inferredCond = await this.inferModelConditionsFromMessages(
messages,
withAttachment
);
const mergedCond = this.mergeModelConditions(cond, inferredCond);
const model = this.selectModel(mergedCond);
const multimodal = model.capabilities.some(c =>
[ModelInputType.Image, ModelInputType.Audio, ModelInputType.File].some(
t => c.input.includes(t)
)
);
if (
multimodal &&
requireAttachment &&
!messages.some(
message =>
message.role === 'user' &&
Array.isArray(message.attachments) &&
message.attachments.length > 0
)
) {
throw new CopilotPromptInvalid(
'attachments required in multimodal mode'
);
}
if (embeddings) {
this.handleZodError(EmbeddingMessage.safeParse(embeddings));
}
return mergedCond;
}
const inferredCond = await this.inferModelConditionsFromMessages(
messages,
withAttachment
);
const mergedCond = this.mergeModelConditions(cond, inferredCond);
if (embeddings) {
this.handleZodError(EmbeddingMessage.safeParse(embeddings));
}
return mergedCond;
}
abstract text(
@@ -476,7 +672,7 @@ export abstract class CopilotProvider<C = any> {
async rerank(
_model: ModelConditions,
_messages: PromptMessage[][],
_request: CopilotRerankRequest,
_options?: CopilotChatOptions
): Promise<number[]> {
throw new CopilotProviderNotSupported({


@@ -124,14 +124,97 @@ export const ChatMessageRole = Object.values(AiPromptRole) as [
'user',
];
const AttachmentUrlSchema = z.string().refine(value => {
if (value.startsWith('data:')) {
return true;
}
try {
const url = new URL(value);
return (
url.protocol === 'http:' ||
url.protocol === 'https:' ||
url.protocol === 'gs:'
);
} catch {
return false;
}
}, 'attachments must use https?://, gs:// or data: urls');
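Reviewer note: the `AttachmentUrlSchema` refinement above accepts `data:` URIs plus `http:`, `https:`, and `gs:` URLs. The same predicate as a plain function (hypothetical name, for illustration only):

```typescript
// Mirrors the AttachmentUrlSchema refinement: data: URIs pass outright;
// anything else must parse as a URL with an allowed protocol.
function isAllowedAttachmentUrl(value: string): boolean {
  if (value.startsWith('data:')) {
    return true;
  }
  try {
    const url = new URL(value); // throws on unparseable input
    return url.protocol === 'http:' || url.protocol === 'https:' || url.protocol === 'gs:';
  } catch {
    return false;
  }
}
```

Note that the WHATWG `URL` parser accepts custom schemes such as `gs:`, so `gs://bucket/object` parses with `protocol === 'gs:'`.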
export const PromptAttachmentSourceKindSchema = z.enum([
'url',
'data',
'bytes',
'file_handle',
]);
export const PromptAttachmentKindSchema = z.enum(['image', 'audio', 'file']);
const AttachmentProviderHintSchema = z
.object({
provider: z.nativeEnum(CopilotProviderType).optional(),
kind: PromptAttachmentKindSchema.optional(),
})
.strict();
const PromptAttachmentSchema = z.discriminatedUnion('kind', [
z
.object({
kind: z.literal('url'),
url: AttachmentUrlSchema,
mimeType: z.string().optional(),
fileName: z.string().optional(),
providerHint: AttachmentProviderHintSchema.optional(),
})
.strict(),
z
.object({
kind: z.literal('data'),
data: z.string(),
mimeType: z.string(),
encoding: z.enum(['base64', 'utf8']).optional(),
fileName: z.string().optional(),
providerHint: AttachmentProviderHintSchema.optional(),
})
.strict(),
z
.object({
kind: z.literal('bytes'),
data: z.string(),
mimeType: z.string(),
encoding: z.literal('base64').optional(),
fileName: z.string().optional(),
providerHint: AttachmentProviderHintSchema.optional(),
})
.strict(),
z
.object({
kind: z.literal('file_handle'),
fileHandle: z.string().trim().min(1),
mimeType: z.string().optional(),
fileName: z.string().optional(),
providerHint: AttachmentProviderHintSchema.optional(),
})
.strict(),
]);
export const ChatMessageAttachment = z.union([
z.string().url(),
AttachmentUrlSchema,
z.object({
attachment: z.string(),
attachment: AttachmentUrlSchema,
mimeType: z.string(),
}),
PromptAttachmentSchema,
]);
export const PromptResponseFormatSchema = z
.object({
type: z.literal('json_schema'),
schema: z.any(),
strict: z.boolean().optional(),
})
.strict();
export const StreamObjectSchema = z.discriminatedUnion('type', [
z.object({
type: z.literal('text-delta'),
@@ -161,6 +244,7 @@ export const PureMessageSchema = z.object({
streamObjects: z.array(StreamObjectSchema).optional().nullable(),
attachments: z.array(ChatMessageAttachment).optional().nullable(),
params: z.record(z.any()).optional().nullable(),
responseFormat: PromptResponseFormatSchema.optional().nullable(),
});
export const PromptMessageSchema = PureMessageSchema.extend({
@@ -169,6 +253,12 @@ export const PromptMessageSchema = PureMessageSchema.extend({
export type PromptMessage = z.infer<typeof PromptMessageSchema>;
export type PromptParams = NonNullable<PromptMessage['params']>;
export type StreamObject = z.infer<typeof StreamObjectSchema>;
export type PromptAttachment = z.infer<typeof ChatMessageAttachment>;
export type PromptAttachmentSourceKind = z.infer<
typeof PromptAttachmentSourceKindSchema
>;
export type PromptAttachmentKind = z.infer<typeof PromptAttachmentKindSchema>;
export type PromptResponseFormat = z.infer<typeof PromptResponseFormatSchema>;
// ========== options ==========
@@ -194,7 +284,9 @@ export type CopilotChatTools = NonNullable<
>[number];
export const CopilotStructuredOptionsSchema =
CopilotProviderOptionsSchema.merge(PromptConfigStrictSchema).optional();
CopilotProviderOptionsSchema.merge(PromptConfigStrictSchema)
.extend({ schema: z.any().optional(), strict: z.boolean().optional() })
.optional();
export type CopilotStructuredOptions = z.infer<
typeof CopilotStructuredOptionsSchema
@@ -220,10 +312,22 @@ export type CopilotEmbeddingOptions = z.infer<
typeof CopilotEmbeddingOptionsSchema
>;
export type CopilotRerankCandidate = {
id?: string;
text: string;
};
export type CopilotRerankRequest = {
query: string;
candidates: CopilotRerankCandidate[];
topK?: number;
};
export enum ModelInputType {
Text = 'text',
Image = 'image',
Audio = 'audio',
File = 'file',
}
export enum ModelOutputType {
@@ -231,12 +335,21 @@ export enum ModelOutputType {
Object = 'object',
Embedding = 'embedding',
Image = 'image',
Rerank = 'rerank',
Structured = 'structured',
}
export interface ModelAttachmentCapability {
kinds: PromptAttachmentKind[];
sourceKinds?: PromptAttachmentSourceKind[];
allowRemoteUrls?: boolean;
}
export interface ModelCapability {
input: ModelInputType[];
output: ModelOutputType[];
attachments?: ModelAttachmentCapability;
structuredAttachments?: ModelAttachmentCapability;
defaultForOutputType?: boolean;
}
@@ -248,6 +361,9 @@ export interface CopilotProviderModel {
export type ModelConditions = {
inputTypes?: ModelInputType[];
attachmentKinds?: PromptAttachmentKind[];
attachmentSourceKinds?: PromptAttachmentSourceKind[];
hasRemoteAttachments?: boolean;
modelId?: string;
};


@@ -1,34 +1,39 @@
import { GoogleVertexProviderSettings } from '@ai-sdk/google-vertex';
import { GoogleVertexAnthropicProviderSettings } from '@ai-sdk/google-vertex/anthropic';
import { Logger } from '@nestjs/common';
import {
CoreAssistantMessage,
CoreUserMessage,
FilePart,
ImagePart,
TextPart,
TextStreamPart,
} from 'ai';
import { GoogleAuth, GoogleAuthOptions } from 'google-auth-library';
import z, { ZodType } from 'zod';
import z from 'zod';
import {
bufferToArrayBuffer,
fetchBuffer,
OneMinute,
ResponseTooLargeError,
safeFetch,
SsrfBlockedError,
} from '../../../base';
import { CustomAITools } from '../tools';
import { PromptMessage, StreamObject } from './types';
import { OneMinute, safeFetch } from '../../../base';
import { PromptAttachment, StreamObject } from './types';
type ChatMessage = CoreUserMessage | CoreAssistantMessage;
export type VertexProviderConfig = {
location?: string;
project?: string;
baseURL?: string;
googleAuthOptions?: GoogleAuthOptions;
fetch?: typeof fetch;
};
export type VertexAnthropicProviderConfig = VertexProviderConfig;
type CopilotTextStreamPart =
| { type: 'text-delta'; text: string; id?: string }
| { type: 'reasoning-delta'; text: string; id?: string }
| {
type: 'tool-call';
toolCallId: string;
toolName: string;
input: Record<string, unknown>;
}
| {
type: 'tool-result';
toolCallId: string;
toolName: string;
input: Record<string, unknown>;
output: unknown;
}
| { type: 'error'; error: unknown };
const ATTACHMENT_MAX_BYTES = 20 * 1024 * 1024;
const ATTACH_HEAD_PARAMS = { timeoutMs: OneMinute / 12, maxRedirects: 3 };
const SIMPLE_IMAGE_URL_REGEX = /^(https?:\/\/|data:image\/)/;
const FORMAT_INFER_MAP: Record<string, string> = {
pdf: 'application/pdf',
mp3: 'audio/mpeg',
@@ -53,9 +58,39 @@ const FORMAT_INFER_MAP: Record<string, string> = {
flv: 'video/flv',
};
async function fetchArrayBuffer(url: string): Promise<ArrayBuffer> {
const { buffer } = await fetchBuffer(url, ATTACHMENT_MAX_BYTES);
return bufferToArrayBuffer(buffer);
function toBase64Data(data: string, encoding: 'base64' | 'utf8' = 'base64') {
return encoding === 'base64'
? data
: Buffer.from(data, 'utf8').toString('base64');
}
export function promptAttachmentToUrl(
attachment: PromptAttachment
): string | undefined {
if (typeof attachment === 'string') return attachment;
if ('attachment' in attachment) return attachment.attachment;
switch (attachment.kind) {
case 'url':
return attachment.url;
case 'data':
return `data:${attachment.mimeType};base64,${toBase64Data(
attachment.data,
attachment.encoding
)}`;
case 'bytes':
return `data:${attachment.mimeType};base64,${attachment.data}`;
case 'file_handle':
return;
}
}
export function promptAttachmentMimeType(
attachment: PromptAttachment,
fallbackMimeType?: string
): string | undefined {
if (typeof attachment === 'string') return fallbackMimeType;
if ('attachment' in attachment) return attachment.mimeType;
return attachment.mimeType ?? fallbackMimeType;
}
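As a rough illustration of the conversion these attachment helpers perform, a `data` attachment with `utf8` encoding is re-encoded to base64 before being embedded in a `data:` URL. The standalone `toDataUrl` helper below is hypothetical — the real logic lives in `toBase64Data` and `promptAttachmentToUrl` above:

```typescript
// Minimal sketch of the data-attachment -> data: URL conversion shown above.
// `toDataUrl` is a hypothetical standalone helper, not the exported API.
function toDataUrl(
  mimeType: string,
  data: string,
  encoding: 'base64' | 'utf8' = 'base64'
): string {
  // utf8 payloads are re-encoded; base64 payloads are embedded as-is
  const base64 =
    encoding === 'base64' ? data : Buffer.from(data, 'utf8').toString('base64');
  return `data:${mimeType};base64,${base64}`;
}

toDataUrl('text/plain', 'hello', 'utf8'); // 'data:text/plain;base64,aGVsbG8='
toDataUrl('image/png', 'AAAA');           // 'data:image/png;base64,AAAA'
```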
export async function inferMimeType(url: string) {
@@ -69,346 +104,21 @@ export async function inferMimeType(url: string) {
if (ext) {
return ext;
}
try {
const mimeType = await safeFetch(
url,
{ method: 'HEAD' },
ATTACH_HEAD_PARAMS
).then(res => res.headers.get('content-type'));
if (mimeType) return mimeType;
} catch {
// ignore and fallback to default
}
}
try {
const mimeType = await safeFetch(
url,
{ method: 'HEAD' },
ATTACH_HEAD_PARAMS
).then(res => res.headers.get('content-type'));
if (mimeType) return mimeType;
} catch {
// ignore and fallback to default
}
return 'application/octet-stream';
}
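The extension lookup at the top of `inferMimeType` can be sketched in isolation. The map below is a hypothetical subset of `FORMAT_INFER_MAP`, and the real helper additionally issues a `HEAD` request via `safeFetch` before falling back to the generic binary type:

```typescript
// Hypothetical subset of the extension -> MIME map used by inferMimeType.
const EXT_TO_MIME: Record<string, string> = {
  pdf: 'application/pdf',
  mp3: 'audio/mpeg',
  png: 'image/png',
};

// Infer a MIME type from the URL's file extension, falling back to a
// generic binary type when the extension is unknown.
function inferByExtension(url: string): string {
  const ext = new URL(url, 'http://localhost').pathname
    .split('.')
    .pop()
    ?.toLowerCase();
  return (ext && EXT_TO_MIME[ext]) || 'application/octet-stream';
}

inferByExtension('https://example.com/report.pdf'); // 'application/pdf'
inferByExtension('https://example.com/blob');       // 'application/octet-stream'
```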
export async function chatToGPTMessage(
messages: PromptMessage[],
// TODO(@darkskygit): move this logic during the interface refactoring
withAttachment: boolean = true,
// NOTE: some providers in vercel ai sdk are not able to handle url attachments yet
// so we need to use base64 encoded attachments instead
useBase64Attachment: boolean = false
): Promise<[string | undefined, ChatMessage[], ZodType?]> {
const hasSystem = messages[0]?.role === 'system';
const system = hasSystem ? messages[0] : undefined;
const normalizedMessages = hasSystem ? messages.slice(1) : messages;
const schema =
system?.params?.schema && system.params.schema instanceof ZodType
? system.params.schema
: undefined;
// filter redundant fields
const msgs: ChatMessage[] = [];
for (let { role, content, attachments, params } of normalizedMessages.filter(
m => m.role !== 'system'
)) {
content = content.trim();
role = role as 'user' | 'assistant';
const mimetype = params?.mimetype;
if (Array.isArray(attachments)) {
const contents: (TextPart | ImagePart | FilePart)[] = [];
if (content.length) {
contents.push({ type: 'text', text: content });
}
if (withAttachment) {
for (let attachment of attachments) {
let mediaType: string;
if (typeof attachment === 'string') {
mediaType =
typeof mimetype === 'string'
? mimetype
: await inferMimeType(attachment);
} else {
({ attachment, mimeType: mediaType } = attachment);
}
if (SIMPLE_IMAGE_URL_REGEX.test(attachment)) {
const data =
attachment.startsWith('data:') || useBase64Attachment
? await fetchArrayBuffer(attachment).catch(error => {
// Avoid leaking internal details for blocked URLs.
if (
error instanceof SsrfBlockedError ||
error instanceof ResponseTooLargeError
) {
throw new Error('Attachment URL is not allowed');
}
throw error;
})
: new URL(attachment);
if (mediaType.startsWith('image/')) {
contents.push({ type: 'image', image: data, mediaType });
} else {
contents.push({ type: 'file' as const, data, mediaType });
}
}
}
} else if (!content.length) {
// temp fix for pplx
contents.push({ type: 'text', text: '[no content]' });
}
msgs.push({ role, content: contents } as ChatMessage);
} else {
msgs.push({ role, content });
}
}
return [system?.content, msgs, schema];
}
// pattern types the callback will receive
type Pattern =
| { kind: 'index'; value: number } // [123]
| { kind: 'link'; text: string; url: string } // [text](url)
| { kind: 'wrappedLink'; text: string; url: string }; // ([text](url))
type NeedMore = { kind: 'needMore' };
type Failed = { kind: 'fail'; nextPos: number };
type Finished =
| { kind: 'ok'; endPos: number; text: string; url: string }
| { kind: 'index'; endPos: number; value: number };
type ParseStatus = Finished | NeedMore | Failed;
type PatternCallback = (m: Pattern) => string;
export class StreamPatternParser {
#buffer = '';
constructor(private readonly callback: PatternCallback) {}
write(chunk: string): string {
this.#buffer += chunk;
const output: string[] = [];
let i = 0;
while (i < this.#buffer.length) {
const ch = this.#buffer[i];
// [[[number]]] or [text](url) or ([text](url))
if (ch === '[' || (ch === '(' && this.peek(i + 1) === '[')) {
const isWrapped = ch === '(';
const startPos = isWrapped ? i + 1 : i;
const res = this.tryParse(startPos);
if (res.kind === 'needMore') break;
const { output: out, nextPos } = this.handlePattern(
res,
isWrapped,
startPos,
i
);
output.push(out);
i = nextPos;
continue;
}
output.push(ch);
i += 1;
}
this.#buffer = this.#buffer.slice(i);
return output.join('');
}
end(): string {
const rest = this.#buffer;
this.#buffer = '';
return rest;
}
// =========== helpers ===========
private peek(pos: number): string | undefined {
return pos < this.#buffer.length ? this.#buffer[pos] : undefined;
}
private tryParse(pos: number): ParseStatus {
const nestedRes = this.tryParseNestedIndex(pos);
if (nestedRes) return nestedRes;
return this.tryParseBracketPattern(pos);
}
private tryParseNestedIndex(pos: number): ParseStatus | null {
if (this.peek(pos + 1) !== '[') return null;
let i = pos;
let bracketCount = 0;
while (i < this.#buffer.length && this.#buffer[i] === '[') {
bracketCount++;
i++;
}
if (bracketCount >= 2) {
if (i >= this.#buffer.length) {
return { kind: 'needMore' };
}
let content = '';
while (i < this.#buffer.length && this.#buffer[i] !== ']') {
content += this.#buffer[i++];
}
let rightBracketCount = 0;
while (i < this.#buffer.length && this.#buffer[i] === ']') {
rightBracketCount++;
i++;
}
if (i >= this.#buffer.length && rightBracketCount < bracketCount) {
return { kind: 'needMore' };
}
if (
rightBracketCount === bracketCount &&
content.length > 0 &&
this.isNumeric(content)
) {
if (this.peek(i) === '(') {
return { kind: 'fail', nextPos: i };
}
return { kind: 'index', endPos: i, value: Number(content) };
}
}
return null;
}
private tryParseBracketPattern(pos: number): ParseStatus {
let i = pos + 1; // skip '['
if (i >= this.#buffer.length) {
return { kind: 'needMore' };
}
let content = '';
while (i < this.#buffer.length && this.#buffer[i] !== ']') {
const nextChar = this.#buffer[i];
if (nextChar === '[') {
return { kind: 'fail', nextPos: i };
}
content += nextChar;
i += 1;
}
if (i >= this.#buffer.length) {
return { kind: 'needMore' };
}
const after = i + 1;
const afterChar = this.peek(after);
if (content.length > 0 && this.isNumeric(content) && afterChar !== '(') {
// [number] pattern
return { kind: 'index', endPos: after, value: Number(content) };
} else if (afterChar !== '(') {
// [text](url) pattern
return { kind: 'fail', nextPos: after };
}
i = after + 1; // skip '('
if (i >= this.#buffer.length) {
return { kind: 'needMore' };
}
let url = '';
while (i < this.#buffer.length && this.#buffer[i] !== ')') {
url += this.#buffer[i++];
}
if (i >= this.#buffer.length) {
return { kind: 'needMore' };
}
return { kind: 'ok', endPos: i + 1, text: content, url };
}
private isNumeric(str: string): boolean {
return !Number.isNaN(Number(str)) && str.trim() !== '';
}
private handlePattern(
pattern: Finished | Failed,
isWrapped: boolean,
start: number,
current: number
): { output: string; nextPos: number } {
if (pattern.kind === 'fail') {
return {
output: this.#buffer.slice(current, pattern.nextPos),
nextPos: pattern.nextPos,
};
}
if (isWrapped) {
const afterLinkPos = pattern.endPos;
if (this.peek(afterLinkPos) !== ')') {
if (afterLinkPos >= this.#buffer.length) {
return { output: '', nextPos: current };
}
return { output: '(', nextPos: start };
}
const out =
pattern.kind === 'index'
? this.callback({ ...pattern, kind: 'index' })
: this.callback({ ...pattern, kind: 'wrappedLink' });
return { output: out, nextPos: afterLinkPos + 1 };
} else {
const out =
pattern.kind === 'ok'
? this.callback({ ...pattern, kind: 'link' })
: this.callback({ ...pattern, kind: 'index' });
return { output: out, nextPos: pattern.endPos };
}
}
}
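The key property of `StreamPatternParser` is that an incomplete pattern at the end of a chunk is held in the buffer until more input arrives, instead of being emitted half-parsed. A stripped-down sketch of that buffering for the `[number]` case only (hypothetical, and far simpler than the class above, which also handles links and nested brackets):

```typescript
// Minimal sketch of stream buffering for the `[number]` case only:
// an unterminated `[...` at the end of a chunk is retained across write() calls.
class IndexStreamSketch {
  private buffer = '';
  constructor(private readonly onIndex: (n: number) => string) {}

  write(chunk: string): string {
    this.buffer += chunk;
    let out = '';
    let i = 0;
    while (i < this.buffer.length) {
      const ch = this.buffer[i];
      if (ch !== '[') {
        out += ch;
        i++;
        continue;
      }
      const close = this.buffer.indexOf(']', i);
      if (close === -1) break; // pattern may be incomplete; wait for more input
      const inner = this.buffer.slice(i + 1, close);
      if (inner !== '' && /^\d+$/.test(inner)) {
        out += this.onIndex(Number(inner));
      } else {
        out += this.buffer.slice(i, close + 1); // not numeric: pass through
      }
      i = close + 1;
    }
    this.buffer = this.buffer.slice(i); // keep the unconsumed tail
    return out;
  }

  end(): string {
    const rest = this.buffer;
    this.buffer = '';
    return rest;
  }
}

const sketch = new IndexStreamSketch(n => `[^${n}]`);
const a = sketch.write('see [1');  // 'see ' ("[1" is held in the buffer)
const b = sketch.write('] done');  // '[^1] done' once the bracket closes
```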
export class CitationParser {
private readonly citations: string[] = [];
private readonly parser = new StreamPatternParser(p => {
switch (p.kind) {
case 'index': {
if (p.value <= this.citations.length) {
return `[^${p.value}]`;
}
return `[${p.value}]`;
}
case 'wrappedLink': {
const index = this.citations.indexOf(p.url);
if (index === -1) {
this.citations.push(p.url);
return `[^${this.citations.length}]`;
}
return `[^${index + 1}]`;
}
case 'link': {
return `[${p.text}](${p.url})`;
}
}
});
public push(citation: string) {
this.citations.push(citation);
}
public parse(content: string) {
return this.parser.write(content);
}
public end() {
return this.parser.end() + '\n' + this.getFootnotes();
}
private getFootnotes() {
const footnotes = this.citations.map((citation, index) => {
return `[^${index + 1}]: {"type":"url","url":"${encodeURIComponent(
citation
)}"}`;
});
return footnotes.join('\n');
}
}
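The footnote lines `CitationParser` emits at `end()` follow a JSON-in-footnote convention: each collected URL becomes a numbered footnote whose body is a JSON payload with the URL percent-encoded. The formatting alone can be sketched as a hypothetical standalone helper:

```typescript
// Hypothetical standalone version of the footnote formatting used by
// CitationParser.end(): one numbered footnote per collected URL, with the
// URL percent-encoded inside a JSON payload.
function formatFootnotes(citations: string[]): string {
  return citations
    .map(
      (url, index) =>
        `[^${index + 1}]: {"type":"url","url":"${encodeURIComponent(url)}"}`
    )
    .join('\n');
}

formatFootnotes(['https://example.com/a?x=1']);
// [^1]: {"type":"url","url":"https%3A%2F%2Fexample.com%2Fa%3Fx%3D1"}
```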
export type CitationIndexedEvent = {
type CitationIndexedEvent = {
type: 'citation';
index: number;
url: string;
@@ -436,7 +146,7 @@ export class CitationFootnoteFormatter {
}
}
type ChunkType = TextStreamPart<CustomAITools>['type'];
type ChunkType = CopilotTextStreamPart['type'];
export function toError(error: unknown): Error {
if (typeof error === 'string') {
@@ -458,6 +168,14 @@ type DocEditFootnote = {
intent: string;
result: string;
};
function asRecord(value: unknown): Record<string, unknown> | null {
if (value && typeof value === 'object' && !Array.isArray(value)) {
return value as Record<string, unknown>;
}
return null;
}
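`asRecord` gives the parser a safe way to read fields off untyped tool output without unchecked casts; the values below are illustrative:

```typescript
// Same narrowing helper as above, reproduced so the example is self-contained:
// objects pass through typed as records, arrays and primitives become null.
function asRecord(value: unknown): Record<string, unknown> | null {
  if (value && typeof value === 'object' && !Array.isArray(value)) {
    return value as Record<string, unknown>;
  }
  return null;
}

asRecord({ result: [1, 2] })?.result; // [1, 2]
asRecord(['not', 'a', 'record']);     // null (arrays are rejected)
asRecord('plain string');             // null
```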
export class TextStreamParser {
private readonly logger = new Logger(TextStreamParser.name);
private readonly CALLOUT_PREFIX = '\n[!]\n';
@@ -468,7 +186,7 @@ export class TextStreamParser {
private readonly docEditFootnotes: DocEditFootnote[] = [];
public parse(chunk: TextStreamPart<CustomAITools>) {
public parse(chunk: CopilotTextStreamPart) {
let result = '';
switch (chunk.type) {
case 'text-delta': {
@@ -517,7 +235,7 @@ export class TextStreamParser {
}
case 'doc_edit': {
this.docEditFootnotes.push({
intent: chunk.input.instructions,
intent: String(chunk.input.instructions ?? ''),
result: '',
});
break;
@@ -533,14 +251,12 @@ export class TextStreamParser {
result = this.addPrefix(result);
switch (chunk.toolName) {
case 'doc_edit': {
const array =
chunk.output && typeof chunk.output === 'object'
? chunk.output.result
: undefined;
const output = asRecord(chunk.output);
const array = output?.result;
if (Array.isArray(array)) {
result += array
.map(item => {
return `\n${item.changedContent}\n`;
return `\n${String(asRecord(item)?.changedContent ?? '')}\n`;
})
.join('');
this.docEditFootnotes[this.docEditFootnotes.length - 1].result =
@@ -557,8 +273,11 @@ export class TextStreamParser {
} else if (typeof output === 'string') {
result += `\n${output}\n`;
} else {
const message = asRecord(output)?.message;
this.logger.warn(
`Unexpected result type for doc_semantic_search: ${output?.message || 'Unknown error'}`
`Unexpected result type for doc_semantic_search: ${
typeof message === 'string' ? message : 'Unknown error'
}`
);
}
break;
@@ -572,9 +291,11 @@ export class TextStreamParser {
break;
}
case 'doc_compose': {
const output = chunk.output;
if (output && typeof output === 'object' && 'title' in output) {
result += `\nDocument "${output.title}" created successfully with ${output.wordCount} words.\n`;
const output = asRecord(chunk.output);
if (output && typeof output.title === 'string') {
result += `\nDocument "${output.title}" created successfully with ${String(
output.wordCount ?? 0
)} words.\n`;
}
break;
}
@@ -654,7 +375,7 @@ export class TextStreamParser {
}
export class StreamObjectParser {
public parse(chunk: TextStreamPart<CustomAITools>) {
public parse(chunk: CopilotTextStreamPart) {
switch (chunk.type) {
case 'reasoning-delta': {
return { type: 'reasoning' as const, textDelta: chunk.text };
@@ -747,9 +468,7 @@ function normalizeUrl(baseURL?: string) {
}
}
export function getVertexAnthropicBaseUrl(
options: GoogleVertexAnthropicProviderSettings
) {
export function getVertexAnthropicBaseUrl(options: VertexProviderConfig) {
const normalizedBaseUrl = normalizeUrl(options.baseURL);
if (normalizedBaseUrl) return normalizedBaseUrl;
const { location, project } = options;
@@ -758,7 +477,7 @@ export function getVertexAnthropicBaseUrl(
}
export async function getGoogleAuth(
options: GoogleVertexAnthropicProviderSettings | GoogleVertexProviderSettings,
options: VertexProviderConfig,
publisher: 'anthropic' | 'google'
) {
function getBaseUrl() {
@@ -777,7 +496,7 @@ export async function getGoogleAuth(
}
const auth = new GoogleAuth({
scopes: ['https://www.googleapis.com/auth/cloud-platform'],
...(options.googleAuthOptions as GoogleAuthOptions),
...options.googleAuthOptions,
});
const client = await auth.getClient();
const token = await client.getAccessToken();

View File

@@ -31,6 +31,7 @@ import { SubscriptionPlan, SubscriptionStatus } from '../payment/types';
import { ChatMessageCache } from './message';
import { ChatPrompt } from './prompt/chat-prompt';
import { PromptService } from './prompt/service';
import { promptAttachmentHasSource } from './providers/attachments';
import { CopilotProviderFactory } from './providers/factory';
import { buildProviderRegistry } from './providers/provider-registry';
import {
@@ -38,6 +39,7 @@ import {
type PromptMessage,
type PromptParams,
} from './providers/types';
import { promptAttachmentToUrl } from './providers/utils';
import {
type ChatHistory,
type ChatMessage,
@@ -272,11 +274,7 @@ export class ChatSession implements AsyncDisposable {
lastMessage.attachments || [],
]
.flat()
.filter(v =>
typeof v === 'string'
? !!v.trim()
: v && v.attachment.trim() && v.mimeType
);
.filter(v => promptAttachmentHasSource(v));
// insert all previous user message content before the first user message
finished.splice(firstUserMessageIndex, 0, ...messages);
@@ -466,8 +464,8 @@ export class ChatSessionService {
messages: preload.concat(messages).map(m => ({
...m,
attachments: m.attachments
?.map(a => (typeof a === 'string' ? a : a.attachment))
.filter(a => !!a),
?.map(a => promptAttachmentToUrl(a))
.filter((a): a is string => !!a),
})),
};
} else {

View File

@@ -1,9 +1,9 @@
import { Logger } from '@nestjs/common';
import { tool } from 'ai';
import { z } from 'zod';
import { AccessController } from '../../../core/permission';
import { toolError } from './error';
import { defineTool } from './tool';
import type { ContextSession, CopilotChatOptions } from './types';
const logger = new Logger('ContextBlobReadTool');
@@ -58,7 +58,7 @@ export const createBlobReadTool = (
chunk?: number
) => Promise<object | undefined>
) => {
return tool({
return defineTool({
description:
'Return the content and basic metadata of a single attachment identified by blobId; prefer search tools over this tool when possible.',
inputSchema: z.object({

View File

@@ -1,8 +1,8 @@
import { Logger } from '@nestjs/common';
import { tool } from 'ai';
import { z } from 'zod';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotProviderFactory, PromptService } from './types';
const logger = new Logger('CodeArtifactTool');
@@ -16,7 +16,7 @@ export const createCodeArtifactTool = (
promptService: PromptService,
factory: CopilotProviderFactory
) => {
return tool({
return defineTool({
description:
'Generate a single-file HTML snippet (with inline <style> and <script>) that accomplishes the requested functionality. The final HTML should be runnable when saved as an .html file and opened in a browser. Do NOT reference external resources (CSS, JS, images) except through data URIs.',
inputSchema: z.object({

View File

@@ -1,8 +1,8 @@
import { Logger } from '@nestjs/common';
import { tool } from 'ai';
import { z } from 'zod';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotProviderFactory, PromptService } from './types';
const logger = new Logger('ConversationSummaryTool');
@@ -12,7 +12,7 @@ export const createConversationSummaryTool = (
promptService: PromptService,
factory: CopilotProviderFactory
) => {
return tool({
return defineTool({
description:
'Create a concise, AI-generated summary of the conversation so far—capturing key topics, decisions, and critical details. Use this tool whenever the context becomes lengthy to preserve essential information that might otherwise be lost to truncation in future turns.',
inputSchema: z.object({

View File

@@ -1,8 +1,8 @@
import { Logger } from '@nestjs/common';
import { tool } from 'ai';
import { z } from 'zod';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotProviderFactory, PromptService } from './types';
const logger = new Logger('DocComposeTool');
@@ -11,7 +11,7 @@ export const createDocComposeTool = (
promptService: PromptService,
factory: CopilotProviderFactory
) => {
return tool({
return defineTool({
description:
'Write a new document with markdown content. This tool creates structured markdown content for documents including titles, sections, and formatting.',
inputSchema: z.object({

View File

@@ -1,8 +1,8 @@
import { tool } from 'ai';
import { z } from 'zod';
import { DocReader } from '../../../core/doc';
import { AccessController } from '../../../core/permission';
import { defineTool } from './tool';
import type {
CopilotChatOptions,
CopilotProviderFactory,
@@ -50,7 +50,7 @@ export const createDocEditTool = (
prompt: PromptService,
getContent: (targetId?: string) => Promise<string | undefined>
) => {
return tool({
return defineTool({
description: `
Use this tool to propose an edit to a structured Markdown document with identifiable blocks.
Each block begins with a comment like <!-- block_id=... -->, and represents a unit of editable content such as a heading, paragraph, list, or code snippet.

View File

@@ -1,9 +1,9 @@
import { tool } from 'ai';
import { z } from 'zod';
import type { AccessController } from '../../../core/permission';
import type { IndexerService, SearchDoc } from '../../indexer';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotChatOptions } from './types';
export const buildDocKeywordSearchGetter = (
@@ -37,7 +37,7 @@ export const buildDocKeywordSearchGetter = (
export const createDocKeywordSearchTool = (
searchDocs: (query: string) => Promise<SearchDoc[] | undefined>
) => {
return tool({
return defineTool({
description:
'Fuzzy search all workspace documents for the exact keyword or phrase supplied and return passages ranked by textual match. Use this tool by default whenever a straightforward term-based or keyword-based lookup is sufficient.',
inputSchema: z.object({

View File

@@ -1,11 +1,11 @@
import { Logger } from '@nestjs/common';
import { tool } from 'ai';
import { z } from 'zod';
import { DocReader } from '../../../core/doc';
import { AccessController } from '../../../core/permission';
import { Models, publicUserSelect } from '../../../models';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotChatOptions } from './types';
const logger = new Logger('DocReadTool');
@@ -72,7 +72,7 @@ export const buildDocContentGetter = (
export const createDocReadTool = (
getDoc: (targetId?: string) => Promise<object | undefined>
) => {
return tool({
return defineTool({
description:
'Return the complete text and basic metadata of a single document identified by docId; use this when the user needs the full content of a specific file rather than a search result.',
inputSchema: z.object({

View File

@@ -1,4 +1,3 @@
import { tool } from 'ai';
import { omit } from 'lodash-es';
import { z } from 'zod';
@@ -9,6 +8,7 @@ import {
type Models,
} from '../../../models';
import { toolError } from './error';
import { defineTool } from './tool';
import type {
ContextSession,
CopilotChatOptions,
@@ -24,7 +24,7 @@ export const buildDocSearchGetter = (
const searchDocs = async (
options: CopilotChatOptions,
query?: string,
abortSignal?: AbortSignal
signal?: AbortSignal
) => {
if (!options || !query?.trim() || !options.user || !options.workspace) {
return `Invalid search parameters.`;
@@ -36,8 +36,8 @@ export const buildDocSearchGetter = (
if (!canAccess)
return 'You do not have permission to access this workspace.';
const [chunks, contextChunks] = await Promise.all([
context.matchWorkspaceAll(options.workspace, query, 10, abortSignal),
docContext?.matchFiles(query, 10, abortSignal) ?? [],
context.matchWorkspaceAll(options.workspace, query, 10, signal),
docContext?.matchFiles(query, 10, signal) ?? [],
]);
const docChunks = await ac
@@ -100,10 +100,10 @@ export const buildDocSearchGetter = (
export const createDocSemanticSearchTool = (
searchDocs: (
query: string,
abortSignal?: AbortSignal
signal?: AbortSignal
) => Promise<ChunkSimilarity[] | string | undefined>
) => {
return tool({
return defineTool({
description:
'Retrieve conceptually related passages by performing vector-based semantic similarity search across embedded documents; use this tool only when exact keyword search fails or the user explicitly needs meaning-level matches (e.g., paraphrases, synonyms, broader concepts, recent documents).',
inputSchema: z.object({
@@ -115,7 +115,7 @@ export const createDocSemanticSearchTool = (
}),
execute: async ({ query }, options) => {
try {
return await searchDocs(query, options.abortSignal);
return await searchDocs(query, options.signal);
} catch (e: any) {
return toolError('Doc Semantic Search Failed', e.message);
}

View File

@@ -1,10 +1,10 @@
import { Logger } from '@nestjs/common';
import { tool } from 'ai';
import { z } from 'zod';
import { DocWriter } from '../../../core/doc';
import { AccessController } from '../../../core/permission';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotChatOptions } from './types';
const logger = new Logger('DocWriteTool');
@@ -141,7 +141,7 @@ export const buildDocUpdateMetaHandler = (
export const createDocCreateTool = (
createDoc: (title: string, content: string) => Promise<object>
) => {
return tool({
return defineTool({
description:
'Create a new document in the workspace with the given title and markdown content. Returns the ID of the created document. This tool does not support inserting or updating database blocks and images yet.',
inputSchema: z.object({
@@ -164,7 +164,7 @@ export const createDocCreateTool = (
export const createDocUpdateTool = (
updateDoc: (docId: string, content: string) => Promise<object>
) => {
return tool({
return defineTool({
description:
'Update an existing document with new markdown content (body only). Uses structural diffing to apply minimal changes. This does NOT update the document title. This tool does not support inserting or updating database blocks and images yet.',
inputSchema: z.object({
@@ -189,7 +189,7 @@ export const createDocUpdateTool = (
export const createDocUpdateMetaTool = (
updateDocMeta: (docId: string, title: string) => Promise<object>
) => {
return tool({
return defineTool({
description: 'Update document metadata (currently title only).',
inputSchema: z.object({
doc_id: z.string().describe('The ID of the document to update'),

View File

@@ -1,12 +1,12 @@
import { tool } from 'ai';
import Exa from 'exa-js';
import { z } from 'zod';
import { Config } from '../../../base';
import { toolError } from './error';
import { defineTool } from './tool';
export const createExaCrawlTool = (config: Config) => {
return tool({
return defineTool({
description: 'Crawl the web url for information',
inputSchema: z.object({
url: z

View File

@@ -1,12 +1,12 @@
import { tool } from 'ai';
import Exa from 'exa-js';
import { z } from 'zod';
import { Config } from '../../../base';
import { toolError } from './error';
import { defineTool } from './tool';
export const createExaSearchTool = (config: Config) => {
return tool({
return defineTool({
description: 'Search the web for information',
inputSchema: z.object({
query: z.string().describe('The query to search the web for.'),

View File

@@ -1,39 +1,3 @@
import { ToolSet } from 'ai';
import { createBlobReadTool } from './blob-read';
import { createCodeArtifactTool } from './code-artifact';
import { createConversationSummaryTool } from './conversation-summary';
import { createDocComposeTool } from './doc-compose';
import { createDocEditTool } from './doc-edit';
import { createDocKeywordSearchTool } from './doc-keyword-search';
import { createDocReadTool } from './doc-read';
import { createDocSemanticSearchTool } from './doc-semantic-search';
import {
createDocCreateTool,
createDocUpdateMetaTool,
createDocUpdateTool,
} from './doc-write';
import { createExaCrawlTool } from './exa-crawl';
import { createExaSearchTool } from './exa-search';
import { createSectionEditTool } from './section-edit';
export interface CustomAITools extends ToolSet {
blob_read: ReturnType<typeof createBlobReadTool>;
code_artifact: ReturnType<typeof createCodeArtifactTool>;
conversation_summary: ReturnType<typeof createConversationSummaryTool>;
doc_edit: ReturnType<typeof createDocEditTool>;
doc_semantic_search: ReturnType<typeof createDocSemanticSearchTool>;
doc_keyword_search: ReturnType<typeof createDocKeywordSearchTool>;
doc_read: ReturnType<typeof createDocReadTool>;
doc_create: ReturnType<typeof createDocCreateTool>;
doc_update: ReturnType<typeof createDocUpdateTool>;
doc_update_meta: ReturnType<typeof createDocUpdateMetaTool>;
doc_compose: ReturnType<typeof createDocComposeTool>;
section_edit: ReturnType<typeof createSectionEditTool>;
web_search_exa: ReturnType<typeof createExaSearchTool>;
web_crawl_exa: ReturnType<typeof createExaCrawlTool>;
}
export * from './blob-read';
export * from './code-artifact';
export * from './conversation-summary';
@@ -47,3 +11,4 @@ export * from './error';
export * from './exa-crawl';
export * from './exa-search';
export * from './section-edit';
export * from './tool';

View File

@@ -1,8 +1,8 @@
import { Logger } from '@nestjs/common';
import { tool } from 'ai';
import { z } from 'zod';
import { toolError } from './error';
import { defineTool } from './tool';
import type { CopilotProviderFactory, PromptService } from './types';
const logger = new Logger('SectionEditTool');
@@ -11,7 +11,7 @@ export const createSectionEditTool = (
promptService: PromptService,
factory: CopilotProviderFactory
) => {
return tool({
return defineTool({
description:
'Intelligently edit and modify a specific section of a document based on user instructions, with full document context awareness. This tool can refine, rewrite, translate, restructure, or enhance any part of markdown content while preserving formatting, maintaining contextual coherence, and ensuring consistency with the entire document. Perfect for targeted improvements that consider the broader document context.',
inputSchema: z.object({

View File

@@ -0,0 +1,33 @@
import type { ZodTypeAny } from 'zod';
import { z } from 'zod';
import type { PromptMessage } from '../providers/types';
export type CopilotToolExecuteOptions = {
signal?: AbortSignal;
messages?: PromptMessage[];
};
export type CopilotTool = {
description?: string;
inputSchema?: ZodTypeAny | Record<string, unknown>;
execute?: {
bivarianceHack: (
args: Record<string, unknown>,
options: CopilotToolExecuteOptions
) => Promise<unknown> | unknown;
}['bivarianceHack'];
};
export type CopilotToolSet = Record<string, CopilotTool>;
export function defineTool<TSchema extends ZodTypeAny, TResult>(tool: {
description?: string;
inputSchema: TSchema;
execute: (
args: z.infer<TSchema>,
options: CopilotToolExecuteOptions
) => Promise<TResult> | TResult;
}): CopilotTool {
return tool;
}
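At runtime `defineTool` is an identity function; its value is that the generics tie the `execute()` argument type to the schema's inferred type. The shape can be illustrated without zod using a hypothetical minimal schema type (`defineMiniTool` and `MiniSchema` are stand-ins, not the exported API):

```typescript
// Hypothetical minimal stand-in for a zod schema: just enough to show how
// defineTool pins execute()'s argument type to the schema's inferred type.
type MiniSchema<T> = { parse: (input: unknown) => T };

type MiniToolExecuteOptions = { signal?: AbortSignal };

function defineMiniTool<TArgs, TResult>(tool: {
  description?: string;
  inputSchema: MiniSchema<TArgs>;
  execute: (
    args: TArgs,
    options: MiniToolExecuteOptions
  ) => Promise<TResult> | TResult;
}) {
  return tool; // identity at runtime; the generics do the typing work
}

const echoTool = defineMiniTool({
  description: 'Echo the input back',
  inputSchema: { parse: input => ({ text: String(input) }) },
  // `text` is inferred as string from the schema above
  execute: ({ text }) => `echo: ${text}`,
});

echoTool.execute({ text: 'hi' }, {}); // 'echo: hi'
```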

View File

@@ -224,11 +224,10 @@ export class CopilotTranscriptionService {
const config = Object.assign({}, prompt.config);
if (schema) {
const provider = await this.getProvider(prompt.model, true, prefer);
return provider.structure(
cond,
[...prompt.finish({ schema }), msg],
config
);
return provider.structure(cond, [...prompt.finish({}), msg], {
...config,
schema,
});
} else {
const provider = await this.getProvider(prompt.model, false);
return provider.text(cond, [...prompt.finish({}), msg], config);

View File

@@ -37,7 +37,9 @@ const OIDCUserInfoSchema = z
preferred_username: z.string().optional(),
email: z.string().email(),
name: z.string().optional(),
email_verified: z.boolean().optional(),
email_verified: z
.union([z.boolean(), z.enum(['true', 'false', '1', '0', 'yes', 'no'])])
.optional(),
groups: z.array(z.string()).optional(),
})
.passthrough();
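The widened `email_verified` schema accepts providers that return a boolean as well as those that return a string flag. Downstream code still needs a plain boolean, which could be normalized along these lines (a hypothetical helper, not part of the patch):

```typescript
// Hypothetical normalizer for the widened email_verified values: some OIDC
// providers return a boolean, others a string flag like "true" or "1".
function normalizeEmailVerified(
  value: boolean | 'true' | 'false' | '1' | '0' | 'yes' | 'no' | undefined
): boolean {
  if (typeof value === 'boolean') return value;
  return value === 'true' || value === '1' || value === 'yes';
}

normalizeEmailVerified(true);      // true
normalizeEmailVerified('1');       // true
normalizeEmailVerified('no');      // false
normalizeEmailVerified(undefined); // false
```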

View File

@@ -12,10 +12,10 @@
},
"sideEffects": false,
"devDependencies": {
"@graphql-codegen/add": "^5.0.3",
"@graphql-codegen/cli": "^5.0.7",
"@graphql-codegen/typescript": "^4.1.6",
"@graphql-codegen/typescript-operations": "^4.6.1",
"@graphql-codegen/add": "^6.0.0",
"@graphql-codegen/cli": "^6.1.3",
"@graphql-codegen/typescript": "^5.0.9",
"@graphql-codegen/typescript-operations": "^5.0.9",
"@types/lodash-es": "^4.17.12",
"prettier": "^3.7.4",
"vitest": "^4.0.18"

View File

@@ -113,8 +113,8 @@
"kind" : "remoteSourceControl",
"location" : "https://github.com/apple/swift-collections",
"state" : {
"revision" : "7b847a3b7008b2dc2f47ca3110d8c782fb2e5c7e",
"version" : "1.3.0"
"revision" : "8d9834a6189db730f6264db7556a7ffb751e99ee",
"version" : "1.4.0"
}
},
{

View File

@@ -17,7 +17,7 @@ let package = Package(
.package(path: "../AffineGraphQL"),
.package(path: "../AffineResources"),
.package(url: "https://github.com/apollographql/apollo-ios.git", from: "1.23.0"),
.package(url: "https://github.com/apple/swift-collections.git", from: "1.3.0"),
.package(url: "https://github.com/apple/swift-collections.git", from: "1.4.0"),
.package(url: "https://github.com/SnapKit/SnapKit.git", from: "5.7.1"),
.package(url: "https://github.com/SwifterSwift/SwifterSwift.git", from: "6.2.0"),
.package(url: "https://github.com/Recouse/EventSource.git", from: "0.1.7"),

View File

@@ -26,10 +26,17 @@ import {
ThinkingIcon,
} from '@blocksuite/icons/lit';
import { ShadowlessElement } from '@blocksuite/std';
import { autoPlacement, offset, shift } from '@floating-ui/dom';
import { computed } from '@preact/signals-core';
import { css, html } from 'lit';
import { property } from 'lit/decorators.js';
const modelSubMenuMiddleware = [
autoPlacement({ allowedPlacements: ['right-start', 'left-start'] }),
offset({ mainAxis: 4, crossAxis: 0 }),
shift({ crossAxis: true, padding: 8 }),
];
export class ChatInputPreference extends SignalWatcher(
WithDisposable(ShadowlessElement)
) {
@@ -140,6 +147,7 @@ export class ChatInputPreference extends SignalWatcher(
menu.subMenu({
name: 'Model',
prefix: AiOutlineIcon(),
middleware: modelSubMenuMiddleware,
postfix: html`
<span class="ai-active-model-name"> ${this.model.value?.name} </span>
`,

View File

@@ -99,4 +99,69 @@ describe('markdownToMindmap: convert markdown list to a mind map tree', () => {
expect(nodes).toEqual(null);
});
test('accepts leading plain text before the markdown list', () => {
const markdown = `Here is the regenerated mind map:
- Text A
- Text B`;
const collection = new TestWorkspace();
collection.meta.initialize();
const doc = collection.createDoc().getStore();
const nodes = markdownToMindmap(markdown, doc, provider);
expect(nodes).toEqual({
text: 'Text A',
children: [
{
text: 'Text B',
children: [],
},
],
});
});
test('accepts markdown lists wrapped in a code block', () => {
const markdown = `\`\`\`markdown
- Text A
- Text B
\`\`\``;
const collection = new TestWorkspace();
collection.meta.initialize();
const doc = collection.createDoc().getStore();
const nodes = markdownToMindmap(markdown, doc, provider);
expect(nodes).toEqual({
text: 'Text A',
children: [
{
text: 'Text B',
children: [],
},
],
});
});
test('keeps inline markdown content inside node labels', () => {
const markdown = `
- Root with [link](https://example.com) and [^1]
- Child with \`code\`
[^1]: footnote
`;
const collection = new TestWorkspace();
collection.meta.initialize();
const doc = collection.createDoc().getStore();
const nodes = markdownToMindmap(markdown, doc, provider);
expect(nodes).toEqual({
text: 'Root with link and',
children: [
{
text: 'Child with code',
children: [],
},
],
});
});
});
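The "keeps inline markdown content" test above relies on flattening inline nodes (links, inline code, footnote refs) down to plain label text. A minimal standalone sketch of that recursion, using simplified stand-in node shapes rather than real mdast types (the shapes here are assumptions for illustration):

```typescript
// Sketch of flattening inline markdown nodes to plain text, as the
// mindmap adapter does for node labels. Node shapes are simplified
// stand-ins, not the real mdast types.
type Inline =
  | { value: string }
  | { alt: string }
  | { children: Inline[] };

function toPlainText(node: Inline): string {
  // Leaf text (plain text, inline code) carries a `value`.
  if ('value' in node && typeof node.value === 'string') return node.value;
  // Images fall back to their alt text.
  if ('alt' in node && typeof node.alt === 'string') return node.alt;
  // Container nodes (links, emphasis) concatenate their children.
  if ('children' in node && Array.isArray(node.children)) {
    return node.children.map(toPlainText).join('');
  }
  return '';
}

// A paragraph like "Root with [link](https://example.com) and `code`":
const paragraph: Inline = {
  children: [
    { value: 'Root with ' },
    { children: [{ value: 'link' }] }, // link node keeps its text children
    { value: ' and ' },
    { value: 'code' },                 // inline code carries its value
  ],
};
console.log(toPlainText(paragraph)); // "Root with link and code"
```

This is why the test expects `'Root with link and'`: the link text survives, while footnote references contribute no text of their own.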

View File

@@ -19,7 +19,7 @@ import { css, html, LitElement, nothing } from 'lit';
import { property, query } from 'lit/decorators.js';
import { repeat } from 'lit/directives/repeat.js';
import { styleMap } from 'lit/directives/style-map.js';
import type { Root } from 'mdast';
import type { Root, RootContent } from 'mdast';
import { Doc as YDoc } from 'yjs';
import { MiniMindmapSchema, MiniMindmapSpecs } from './spec.js';
@@ -234,19 +234,68 @@ type Node = {
children: Node[];
};
type MarkdownNode =
| RootContent
| { alt?: string | null; children?: MarkdownNode[]; value?: string };
export const markdownToMindmap = (
answer: string,
doc: Store,
provider: ServiceProvider
) => {
let result: Node | null = null;
const transformer = doc.getTransformer();
const markdown = new MarkdownAdapter(transformer, provider);
const ast: Root = markdown['_markdownToAst'](answer);
const astToMindmap = (ast: Root): Node | null => {
const findList = (
nodes: Root['children']
): Unpacked<Root['children']> | null => {
for (const node of nodes) {
if (node.type === 'list') {
return node;
}
if (node.type === 'code' && node.value) {
const nestedAst: Root = markdown['_markdownToAst'](node.value);
const nestedList = findList(nestedAst.children);
if (nestedList) {
return nestedList;
}
}
}
return null;
};
const list = findList(ast.children);
if (!list) {
return null;
}
return traverse(list, true);
};
const traverse = (
markdownNode: Unpacked<(typeof ast)['children']>,
markdownNode: Unpacked<Root['children']>,
firstLevel = false
): Node | null => {
const toPlainText = (node: MarkdownNode): string => {
if ('value' in node && typeof node.value === 'string') {
return node.value;
}
if ('alt' in node && typeof node.alt === 'string') {
return node.alt;
}
if ('children' in node && Array.isArray(node.children)) {
return node.children
.map((child: MarkdownNode) => toPlainText(child))
.join('');
}
return '';
};
switch (markdownNode.type) {
case 'list':
{
@@ -267,11 +316,11 @@ export const markdownToMindmap = (
children: [],
};
if (
paragraph?.type === 'paragraph' &&
paragraph.children[0]?.type === 'text'
) {
node.text = paragraph.children[0].value;
if (paragraph?.type === 'paragraph') {
node.text = paragraph.children
.map((child: MarkdownNode) => toPlainText(child))
.join('')
.trim();
}
if (list?.type === 'list') {
@@ -287,9 +336,5 @@ export const markdownToMindmap = (
return null;
};
if (ast?.children?.[0]?.type === 'list') {
result = traverse(ast.children[0], true);
}
return result;
return astToMindmap(markdown['_markdownToAst'](answer));
};
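The `findList` change above searches top-level nodes for a list and, when it meets a code block, re-parses the block's contents so a list wrapped in a ```markdown fence is still found. A self-contained sketch of that idea, with a toy parser and simplified node shapes standing in for the adapter's real `_markdownToAst` and mdast types:

```typescript
// Minimal sketch of "find the first list, even inside a fenced code
// block". parseMarkdown is a toy line-based parser for this demo only.
type MdNode =
  | { type: 'paragraph'; text: string }
  | { type: 'code'; value: string }
  | { type: 'list'; items: string[] };

function parseMarkdown(src: string): MdNode[] {
  const nodes: MdNode[] = [];
  const lines = src.split('\n');
  let i = 0;
  while (i < lines.length) {
    const line = lines[i];
    if (line.startsWith('```')) {
      // Capture the fence body as a single code node.
      const body: string[] = [];
      i++;
      while (i < lines.length && !lines[i].startsWith('```')) {
        body.push(lines[i]);
        i++;
      }
      i++; // skip the closing fence
      nodes.push({ type: 'code', value: body.join('\n') });
    } else if (line.startsWith('- ')) {
      // Group consecutive "- " lines into one flat list.
      const items: string[] = [];
      while (i < lines.length && lines[i].startsWith('- ')) {
        items.push(lines[i].slice(2));
        i++;
      }
      nodes.push({ type: 'list', items });
    } else {
      nodes.push({ type: 'paragraph', text: line });
      i++;
    }
  }
  return nodes;
}

// Same recursion pattern as the adapter's findList: scan top-level
// nodes, and re-parse code blocks so nested lists still match.
function findList(nodes: MdNode[]): MdNode | null {
  for (const node of nodes) {
    if (node.type === 'list') return node;
    if (node.type === 'code' && node.value) {
      const nested = findList(parseMarkdown(node.value));
      if (nested) return nested;
    }
  }
  return null;
}

const wrapped = 'Here is the mind map:\n```\n- Text A\n- Text B\n```';
const list = findList(parseMarkdown(wrapped));
console.log(list); // the list node with items ['Text A', 'Text B']
```

This mirrors why both new tests pass: leading plain text is skipped as a paragraph, and a fenced list is recovered by re-parsing the code node's value.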

View File

@@ -12,6 +12,7 @@ import {
ThemeService,
} from '@blocksuite/affine/shared/services';
import { BlockViewExtension, FlavourExtension } from '@blocksuite/affine/std';
import { ToolController } from '@blocksuite/affine/std/gfx';
import type { BlockSchema, ExtensionType } from '@blocksuite/affine/store';
import { literal } from 'lit/static-html.js';
import type { z } from 'zod';
@@ -24,6 +25,7 @@ export const MiniMindmapSpecs: ExtensionType[] = [
ThemeService,
FlavourExtension('affine:page'),
MindmapService,
ToolController,
BlockViewExtension('affine:page', literal`mini-mindmap-root-block`),
FlavourExtension('affine:surface'),
MindMapView,

View File

@@ -4,7 +4,7 @@ import {
type UploadFileResponse,
} from '@google/generative-ai/server';
const DEFAULT_MODEL = 'gemini-2.0-flash';
const DEFAULT_MODEL = 'gemini-2.5-pro';
export interface TranscriptionResult {
title: string;
@@ -75,7 +75,7 @@ Output in JSON format with the following structure:
export async function gemini(
audioFilePath: string,
options?: {
model?: 'gemini-2.0-flash' | 'gemini-1.5-flash';
model?: 'gemini-2.5-flash' | 'gemini-2.5-pro';
mode?: 'transcript' | 'summary';
}
) {

View File

@@ -3,8 +3,13 @@
*/
import { beforeEach, describe, expect, test, vi } from 'vitest';
const sendTelemetryEvent = vi.fn().mockResolvedValue({ queued: true });
const setTelemetryContext = vi.fn();
import { resetTrackerState } from '../state';
import { tracker } from '../tracker';
const { sendTelemetryEvent, setTelemetryContext } = vi.hoisted(() => ({
sendTelemetryEvent: vi.fn().mockResolvedValue({ queued: true }),
setTelemetryContext: vi.fn(),
}));
vi.mock('../telemetry', () => ({
sendTelemetryEvent,
@@ -27,17 +32,11 @@ beforeEach(() => {
sendTelemetryEvent.mockClear();
setTelemetryContext.mockClear();
vi.useRealTimers();
vi.resetModules();
resetTrackerState();
});
async function loadTracker() {
return await import('../tracker');
}
describe('tracker session signals', () => {
test('sends first_visit and session_start on first event', async () => {
const { tracker } = await loadTracker();
test('sends first_visit and session_start on first event', () => {
tracker.track('test_event');
const events = sendTelemetryEvent.mock.calls.map(call => call[0]);
@@ -48,14 +47,12 @@ describe('tracker session signals', () => {
]);
const firstVisit = events[0];
expect(typeof (firstVisit.params as any).session_id).toBe('number');
expect((firstVisit.params as any).session_number).toBe(1);
expect((firstVisit.params as any).engagement_time_msec).toBe(1);
expect(typeof firstVisit.params?.session_id).toBe('number');
expect(firstVisit.params?.session_number).toBe(1);
expect(firstVisit.params?.engagement_time_msec).toBe(1);
});
test('does not repeat first_visit for later events', async () => {
const { tracker } = await loadTracker();
test('does not repeat first_visit for later events', () => {
tracker.track('event_a');
tracker.track('event_b');
@@ -64,10 +61,9 @@ describe('tracker session signals', () => {
expect(names.filter(name => name === 'session_start')).toHaveLength(1);
});
test('increments session_number after idle timeout', async () => {
test('increments session_number after idle timeout', () => {
vi.useFakeTimers();
vi.setSystemTime(new Date('2024-01-01T00:00:00Z'));
const { tracker } = await loadTracker();
tracker.track('event_a');
sendTelemetryEvent.mockClear();

View File

@@ -0,0 +1,99 @@
import { nanoid } from 'nanoid';
export type TrackProperties = Record<string, unknown> | undefined;
export type Middleware = (
name: string,
properties?: TrackProperties
) => Record<string, unknown>;
type TrackerState = {
enabled: boolean;
clientStorage: Storage | null;
clientId: string;
pendingFirstVisit: boolean;
sessionId: number;
sessionNumber: number;
lastActivityMs: number;
sessionStartSent: boolean;
engagementTrackingEnabled: boolean;
visibleSinceMs: number | null;
pendingEngagementMs: number;
visibilityChangeHandler: (() => void) | null;
pageHideHandler: (() => void) | null;
userId: string | undefined;
userProperties: Record<string, unknown>;
middlewares: Set<Middleware>;
};
const CLIENT_ID_KEY = 'affine_telemetry_client_id';
export let trackerState = createTrackerState();
export function resetTrackerState() {
cleanupTrackerState(trackerState);
trackerState = createTrackerState();
}
function createTrackerState(): TrackerState {
const clientStorage = localStorageSafe();
const hasClientId = !!clientStorage?.getItem(CLIENT_ID_KEY);
return {
enabled: true,
clientStorage,
clientId: readPersistentId(CLIENT_ID_KEY, clientStorage),
pendingFirstVisit: !hasClientId,
sessionId: 0,
sessionNumber: 0,
lastActivityMs: 0,
sessionStartSent: false,
engagementTrackingEnabled: false,
visibleSinceMs: null,
pendingEngagementMs: 0,
visibilityChangeHandler: null,
pageHideHandler: null,
userId: undefined,
userProperties: {},
middlewares: new Set<Middleware>(),
};
}
function cleanupTrackerState(state: TrackerState) {
if (state.visibilityChangeHandler && typeof document !== 'undefined') {
document.removeEventListener(
'visibilitychange',
state.visibilityChangeHandler
);
}
if (state.pageHideHandler && typeof window !== 'undefined') {
window.removeEventListener('pagehide', state.pageHideHandler);
}
}
function readPersistentId(key: string, storage: Storage | null, renew = false) {
if (!storage) {
return nanoid();
}
if (!renew) {
const existing = storage.getItem(key);
if (existing) {
return existing;
}
}
const id = nanoid();
try {
storage.setItem(key, id);
} catch {
return id;
}
return id;
}
function localStorageSafe(): Storage | null {
try {
return typeof localStorage === 'undefined' ? null : localStorage;
} catch {
return null;
}
}

View File

@@ -1,43 +1,20 @@
import { DebugLogger } from '@affine/debug';
import { nanoid } from 'nanoid';
import { type Middleware, trackerState, type TrackProperties } from './state';
import type { TelemetryEvent } from './telemetry';
import { sendTelemetryEvent, setTelemetryContext } from './telemetry';
const logger = new DebugLogger('telemetry');
type TrackProperties = Record<string, unknown> | undefined;
type RawTrackProperties = Record<string, unknown> | object | undefined;
type Middleware = (
name: string,
properties?: TrackProperties
) => Record<string, unknown>;
const CLIENT_ID_KEY = 'affine_telemetry_client_id';
const SESSION_ID_KEY = 'affine_telemetry_session_id';
const SESSION_NUMBER_KEY = 'affine_telemetry_session_number';
const SESSION_NUMBER_CURRENT_KEY = 'affine_telemetry_session_number_current';
const LAST_ACTIVITY_KEY = 'affine_telemetry_last_activity_ms';
const SESSION_TIMEOUT_MS = 30 * 60 * 1000;
let enabled = true;
const clientStorage = localStorageSafe();
const hasClientId = clientStorage?.getItem(CLIENT_ID_KEY);
let clientId = readPersistentId(CLIENT_ID_KEY, clientStorage);
let pendingFirstVisit = !hasClientId;
let sessionId = 0;
let sessionNumber = 0;
let lastActivityMs = 0;
let sessionStartSent = false;
let engagementTrackingEnabled = false;
let visibleSinceMs: number | null = null;
let pendingEngagementMs = 0;
let userId: string | undefined;
let userProperties: Record<string, unknown> = {};
const middlewares = new Set<Middleware>();
export const tracker = {
init() {
this.register({
@@ -51,29 +28,32 @@ export const tracker = {
},
register(props: Record<string, unknown>) {
userProperties = {
...userProperties,
trackerState.userProperties = {
...trackerState.userProperties,
...props,
};
setTelemetryContext({ userProperties });
setTelemetryContext({ userProperties: trackerState.userProperties });
},
reset() {
userId = undefined;
userProperties = {};
trackerState.userId = undefined;
trackerState.userProperties = {};
startNewSession(Date.now(), sessionStorageSafe());
setTelemetryContext(
{ userId, userProperties },
{
userId: trackerState.userId,
userProperties: trackerState.userProperties,
},
{ replaceUserProperties: true }
);
this.init();
},
track(eventName: string, properties?: RawTrackProperties) {
if (!enabled) {
if (!trackerState.enabled) {
return;
}
const middlewareProperties = Array.from(middlewares).reduce(
const middlewareProperties = Array.from(trackerState.middlewares).reduce(
(acc, middleware) => {
return middleware(eventName, acc);
},
@@ -84,10 +64,10 @@ export const tracker = {
},
track_pageview(properties?: { location?: string; [key: string]: unknown }) {
if (!enabled) {
if (!trackerState.enabled) {
return;
}
const middlewareProperties = Array.from(middlewares).reduce(
const middlewareProperties = Array.from(trackerState.middlewares).reduce(
(acc, middleware) => {
return middleware('track_pageview', acc);
},
@@ -108,41 +88,41 @@ export const tracker = {
},
middleware(cb: Middleware): () => void {
middlewares.add(cb);
trackerState.middlewares.add(cb);
return () => {
middlewares.delete(cb);
trackerState.middlewares.delete(cb);
};
},
opt_out_tracking() {
enabled = false;
trackerState.enabled = false;
},
opt_in_tracking() {
enabled = true;
trackerState.enabled = true;
},
has_opted_in_tracking() {
return enabled;
return trackerState.enabled;
},
has_opted_out_tracking() {
return !enabled;
return !trackerState.enabled;
},
identify(nextUserId?: string) {
userId = nextUserId ? String(nextUserId) : undefined;
setTelemetryContext({ userId });
trackerState.userId = nextUserId ? String(nextUserId) : undefined;
setTelemetryContext({ userId: trackerState.userId });
},
get people() {
return {
set: (props: Record<string, unknown>) => {
userProperties = {
...userProperties,
trackerState.userProperties = {
...trackerState.userProperties,
...props,
};
setTelemetryContext({ userProperties });
setTelemetryContext({ userProperties: trackerState.userProperties });
},
};
},
@@ -193,45 +173,62 @@ function prepareSession(now: number) {
if (expired) {
startNewSession(now, sessionStorage);
} else {
sessionId = storedSessionId;
sessionNumber = readCurrentSessionNumber(sessionStorage, clientStorage);
trackerState.sessionId = storedSessionId;
trackerState.sessionNumber = readCurrentSessionNumber(
sessionStorage,
trackerState.clientStorage
);
updateLastActivity(now, sessionStorage);
}
} else {
const expired =
!sessionId ||
!lastActivityMs ||
now - lastActivityMs > SESSION_TIMEOUT_MS;
!trackerState.sessionId ||
!trackerState.lastActivityMs ||
now - trackerState.lastActivityMs > SESSION_TIMEOUT_MS;
if (expired) {
startNewSession(now, null);
} else {
lastActivityMs = now;
if (!sessionNumber) {
sessionNumber = 1;
trackerState.lastActivityMs = now;
if (!trackerState.sessionNumber) {
trackerState.sessionNumber = 1;
}
}
}
const preEvents: TelemetryEvent[] = [];
if (pendingFirstVisit) {
pendingFirstVisit = false;
if (trackerState.pendingFirstVisit) {
trackerState.pendingFirstVisit = false;
preEvents.push(
buildEvent(
'first_visit',
mergeSessionParams({}, sessionId, sessionNumber, 1)
mergeSessionParams(
{},
trackerState.sessionId,
trackerState.sessionNumber,
1
)
)
);
}
if (!sessionStartSent) {
sessionStartSent = true;
if (!trackerState.sessionStartSent) {
trackerState.sessionStartSent = true;
preEvents.push(
buildEvent(
'session_start',
mergeSessionParams({}, sessionId, sessionNumber, 1)
mergeSessionParams(
{},
trackerState.sessionId,
trackerState.sessionNumber,
1
)
)
);
}
return { sessionId, sessionNumber, preEvents };
return {
sessionId: trackerState.sessionId,
sessionNumber: trackerState.sessionNumber,
preEvents,
};
}
function mergeSessionParams(
@@ -256,62 +253,76 @@ function mergeSessionParams(
}
function startNewSession(now: number, sessionStorage: Storage | null) {
sessionId = Math.floor(now / 1000);
sessionNumber = incrementSessionNumber(clientStorage, sessionStorage);
trackerState.sessionId = Math.floor(now / 1000);
trackerState.sessionNumber = incrementSessionNumber(
trackerState.clientStorage,
sessionStorage
);
updateLastActivity(now, sessionStorage);
writeNumber(sessionStorage, SESSION_ID_KEY, sessionId);
sessionStartSent = false;
writeNumber(sessionStorage, SESSION_ID_KEY, trackerState.sessionId);
trackerState.sessionStartSent = false;
resetEngagementState(now);
}
function updateLastActivity(now: number, sessionStorage: Storage | null) {
lastActivityMs = now;
trackerState.lastActivityMs = now;
writeNumber(sessionStorage, LAST_ACTIVITY_KEY, now);
}
function consumeEngagementTime(now: number) {
initEngagementTracking(now);
if (visibleSinceMs !== null) {
pendingEngagementMs += now - visibleSinceMs;
visibleSinceMs = now;
if (trackerState.visibleSinceMs !== null) {
trackerState.pendingEngagementMs += now - trackerState.visibleSinceMs;
trackerState.visibleSinceMs = now;
}
const engagementMs = Math.max(0, Math.round(pendingEngagementMs));
pendingEngagementMs = 0;
const engagementMs = Math.max(
0,
Math.round(trackerState.pendingEngagementMs)
);
trackerState.pendingEngagementMs = 0;
return engagementMs;
}
function resetEngagementState(now: number) {
pendingEngagementMs = 0;
visibleSinceMs = isDocumentVisible() ? now : null;
trackerState.pendingEngagementMs = 0;
trackerState.visibleSinceMs = isDocumentVisible() ? now : null;
}
function initEngagementTracking(now: number) {
if (engagementTrackingEnabled || typeof document === 'undefined') {
if (
trackerState.engagementTrackingEnabled ||
typeof document === 'undefined'
) {
return;
}
engagementTrackingEnabled = true;
trackerState.engagementTrackingEnabled = true;
resetEngagementState(now);
document.addEventListener('visibilitychange', () => {
trackerState.visibilityChangeHandler = () => {
const now = Date.now();
if (visibleSinceMs !== null) {
pendingEngagementMs += now - visibleSinceMs;
if (trackerState.visibleSinceMs !== null) {
trackerState.pendingEngagementMs += now - trackerState.visibleSinceMs;
}
visibleSinceMs = isDocumentVisible() ? now : null;
trackerState.visibleSinceMs = isDocumentVisible() ? now : null;
if (!isDocumentVisible()) {
dispatchUserEngagement(now);
}
});
};
document.addEventListener(
'visibilitychange',
trackerState.visibilityChangeHandler
);
if (typeof window !== 'undefined') {
window.addEventListener('pagehide', () => {
trackerState.pageHideHandler = () => {
dispatchUserEngagement(Date.now());
});
};
window.addEventListener('pagehide', trackerState.pageHideHandler);
}
}
function dispatchUserEngagement(now: number) {
if (!enabled) {
if (!trackerState.enabled) {
return;
}
const engagementMs = consumeEngagementTime(now);
@@ -377,7 +388,7 @@ function readCurrentSessionNumber(
const fallback = localStorage
? (readPositiveNumber(localStorage, SESSION_NUMBER_KEY) ?? 1)
: sessionNumber || 1;
: trackerState.sessionNumber || 1;
writeNumber(sessionStorage, SESSION_NUMBER_CURRENT_KEY, fallback);
if (localStorage && !readPositiveNumber(localStorage, SESSION_NUMBER_KEY)) {
@@ -391,7 +402,7 @@ function incrementSessionNumber(
sessionStorage: Storage | null
) {
if (!localStorage) {
const next = (sessionNumber || 0) + 1;
const next = (trackerState.sessionNumber || 0) + 1;
writeNumber(sessionStorage, SESSION_NUMBER_CURRENT_KEY, next);
return next;
}
@@ -410,10 +421,10 @@ function buildEvent(
schemaVersion: 1,
eventName,
params,
userId,
userProperties,
clientId,
sessionId,
userId: trackerState.userId,
userProperties: trackerState.userProperties,
clientId: trackerState.clientId,
sessionId: trackerState.sessionId,
eventId: nanoid(),
timestampMicros: Date.now() * 1000,
context: buildContext(),
@@ -445,33 +456,6 @@ function normalizeProperties(properties?: RawTrackProperties): TrackProperties {
return properties as Record<string, unknown>;
}
function readPersistentId(key: string, storage: Storage | null, renew = false) {
if (!storage) {
return nanoid();
}
if (!renew) {
const existing = storage.getItem(key);
if (existing) {
return existing;
}
}
const id = nanoid();
try {
storage.setItem(key, id);
} catch {
return id;
}
return id;
}
function localStorageSafe(): Storage | null {
try {
return typeof localStorage === 'undefined' ? null : localStorage;
} catch {
return null;
}
}
function sessionStorageSafe(): Storage | null {
try {
return typeof sessionStorage === 'undefined' ? null : sessionStorage;

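The session handling refactored above follows one rollover rule: a session expires after 30 minutes of inactivity, and a new session takes `sessionId = floor(now / 1000)` with an incremented `sessionNumber`. A hedged sketch of just that rule, with purely in-memory state (the real tracker also persists to session/local storage):

```typescript
// Sketch of the session-rollover rule used by prepareSession above.
// State is an in-memory object here; persistence is omitted.
const SESSION_TIMEOUT_MS = 30 * 60 * 1000;

type Session = {
  sessionId: number;
  sessionNumber: number;
  lastActivityMs: number;
};

function touchSession(state: Session, now: number): Session {
  const expired =
    !state.sessionId ||
    !state.lastActivityMs ||
    now - state.lastActivityMs > SESSION_TIMEOUT_MS;
  if (expired) {
    // New session: id derived from wall clock, number incremented.
    return {
      sessionId: Math.floor(now / 1000),
      sessionNumber: state.sessionNumber + 1,
      lastActivityMs: now,
    };
  }
  // Still inside the session window: only refresh the activity stamp.
  return { ...state, lastActivityMs: now };
}

let s: Session = { sessionId: 0, sessionNumber: 0, lastActivityMs: 0 };
s = touchSession(s, 1_000_000);               // first event: session #1
s = touchSession(s, 1_000_000 + 60_000);      // 1 min later: same session
s = touchSession(s, 1_000_000 + 40 * 60_000); // 40 min idle: session #2
console.log(s.sessionNumber); // 2
```

Keeping the state in one replaceable object, as the `trackerState` module does, is what lets the tests reset between cases with `resetTrackerState()` instead of `vi.resetModules()`.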
View File

@@ -14,11 +14,13 @@ test.describe('AIAction/CheckCodeError', () => {
}) => {
const { checkCodeError } = await utils.editor.askAIWithCode(
page,
'consloe.log("Hello,World!");',
'console.log("Hello,World!"',
'javascript'
);
const { answer, responses } = await checkCodeError();
await expect(answer).toHaveText(/console/);
const answerText = await answer.innerText();
expect(answerText).toMatch(/syntax|parenthesis|unexpected|missing/i);
expect(answerText).not.toMatch(/No syntax errors were found/i);
await expect(responses).toEqual(
new Set(['insert-below', 'replace-selection'])
);

View File

@@ -33,6 +33,6 @@ test.describe('expand mindmap node', () => {
await expect(async () => {
const newChild = await utils.editor.getMindMapNode(page, id!, [0, 0, 0]);
expect(newChild).toBeDefined();
}).toPass({ timeout: 20000 });
}).toPass({ timeout: 60000 });
});
});

View File

@@ -17,7 +17,10 @@ test.describe('AIAction/ExplainSelection', () => {
'LLM(AI)'
);
const { answer, responses } = await explainSelection();
await expect(answer).toHaveText(/Large Language Model/, { timeout: 20000 });
await expect(answer).toHaveText(
/Large Language Model|LLM|artificial intelligence/i,
{ timeout: 20000 }
);
expect(responses).toEqual(new Set(['insert-below', 'replace-selection']));
});
@@ -33,7 +36,10 @@ test.describe('AIAction/ExplainSelection', () => {
);
const { answer, responses } = await explainSelection();
await expect(answer).toHaveText(/Large Language Model/, { timeout: 20000 });
await expect(answer).toHaveText(
/Large Language Model|LLM|artificial intelligence/i,
{ timeout: 20000 }
);
expect(responses).toEqual(new Set(['insert-below']));
});
@@ -49,7 +55,10 @@ test.describe('AIAction/ExplainSelection', () => {
);
const { answer, responses } = await explainSelection();
await expect(answer).toHaveText(/Large Language Model/, { timeout: 20000 });
await expect(answer).toHaveText(
/Large Language Model|LLM|artificial intelligence/i,
{ timeout: 20000 }
);
expect(responses).toEqual(new Set(['insert-below']));
});

View File

@@ -3,6 +3,8 @@ import { expect } from '@playwright/test';
import { test } from '../base/base-test';
test.describe('AIAction/GeneratePresentation', () => {
test.describe.configure({ timeout: 240000 });
test.beforeEach(async ({ loggedInPage: page, utils }) => {
await utils.testUtils.setupTestEnvironment(page);
await utils.chatPanel.openChatPanel(page);

View File

@@ -3,6 +3,8 @@ import { expect } from '@playwright/test';
import { test } from '../base/base-test';
test.describe('AIAction/MakeItReal', () => {
test.describe.configure({ timeout: 180000 });
test.beforeEach(async ({ loggedInPage: page, utils }) => {
await utils.testUtils.setupTestEnvironment(page);
await utils.chatPanel.openChatPanel(page);

View File

@@ -74,13 +74,13 @@ test.describe('AIChatWith/Attachments', () => {
buffer: buffer2,
},
],
`What is Attachment${randomStr1}? What is Attachment${randomStr2}?`
`Which animal is Attachment${randomStr1} and which animal is Attachment${randomStr2}? Answer with both attachment names.`
);
await utils.chatPanel.waitForHistory(page, [
{
role: 'user',
content: `What is Attachment${randomStr1}? What is Attachment${randomStr2}?`,
content: `Which animal is Attachment${randomStr1} and which animal is Attachment${randomStr2}? Answer with both attachment names.`,
},
{
role: 'assistant',
@@ -89,14 +89,11 @@ test.describe('AIChatWith/Attachments', () => {
]);
await expect(async () => {
const { content, message } =
await utils.chatPanel.getLatestAssistantMessage(page);
const { content } = await utils.chatPanel.getLatestAssistantMessage(page);
expect(content).toMatch(new RegExp(`Attachment${randomStr1}`));
expect(content).toMatch(new RegExp(`Attachment${randomStr2}`));
const footnoteCount = await message
.locator('affine-footnote-node')
.count();
expect(footnoteCount > 0 || /sources?/i.test(content)).toBe(true);
expect(content).toMatch(/cat/i);
expect(content).toMatch(/dog/i);
}).toPass({ timeout: 20000 });
});
});

View File

@@ -4,21 +4,39 @@ import { expect } from '@playwright/test';
import { test } from '../base/base-test';
type MindmapSnapshot = {
childCount: number;
count: number;
id: string | null;
};
test.describe('AIChatWith/EdgelessMindMap', () => {
test.describe.configure({ timeout: 180000 });
test.beforeEach(async ({ loggedInPage: page, utils }) => {
await utils.testUtils.setupTestEnvironment(page);
await utils.chatPanel.openChatPanel(page);
});
test('should support replace mindmap with the regenerated one', async ({
test('should preview the regenerated mindmap before replacing it', async ({
loggedInPage: page,
utils,
}) => {
let id: string;
let originalChildCount: number;
const { regenerateMindMap } = await utils.editor.askAIWithEdgeless(
page,
async () => {
id = await utils.editor.createMindmap(page);
originalChildCount = await page.evaluate(mindmapId => {
const edgelessBlock = document.querySelector(
'affine-edgeless-root'
) as EdgelessRootBlockComponent;
const mindmap = edgelessBlock.gfx.getElementById(mindmapId) as {
tree: { children?: unknown[] };
} | null;
return mindmap?.tree.children?.length ?? 0;
}, id);
},
async () => {
const { id: rootId } = await utils.editor.getMindMapNode(
@@ -30,22 +48,134 @@ test.describe('AIChatWith/EdgelessMindMap', () => {
}
);
const { answer } = await regenerateMindMap();
await expect(answer.locator('mini-mindmap-preview')).toBeVisible();
const replace = answer.getByTestId('answer-replace');
await replace.click();
const { answer, responses } = await regenerateMindMap();
expect(responses).toEqual(new Set(['replace-selection']));
await expect
.poll(
async () => {
return answer
.locator('mini-mindmap-preview')
.evaluate(async preview => {
// Expect original mindmap to be replaced
const mindmaps = await page.evaluate(() => {
const walk = (root: ParentNode): Element[] => {
const results: Element[] = [];
for (const element of root.querySelectorAll('*')) {
results.push(element);
if (element.shadowRoot) {
results.push(...walk(element.shadowRoot));
}
}
return results;
};
await customElements.whenDefined('mini-mindmap-preview');
const previewElement =
preview instanceof HTMLElement
? (preview as HTMLElement & {
updateComplete?: Promise<unknown>;
})
: null;
await previewElement?.updateComplete;
await new Promise(resolve =>
requestAnimationFrame(() => resolve(null))
);
const shadowRoot = previewElement?.shadowRoot ?? null;
const descendants = walk(shadowRoot ?? preview);
const surface = descendants.find(
element =>
element instanceof HTMLElement &&
element.classList.contains('affine-mini-mindmap-surface')
) as HTMLElement | undefined;
const surfaceRect = surface?.getBoundingClientRect();
return {
hasShadowRoot: !!shadowRoot,
hasRootBlock: descendants.some(
element =>
element.tagName.toLowerCase() === 'mini-mindmap-root-block'
),
hasSurfaceBlock: descendants.some(
element =>
element.tagName.toLowerCase() ===
'mini-mindmap-surface-block'
),
surfaceReady:
!!surface &&
(surfaceRect?.width ?? 0) > 0 &&
(surfaceRect?.height ?? 0) > 0,
};
});
},
{ timeout: 15_000 }
)
.toEqual({
hasShadowRoot: true,
hasRootBlock: true,
hasSurfaceBlock: true,
surfaceReady: true,
});
const replace = answer.getByTestId('answer-replace');
await expect(replace).toBeVisible();
await replace.click({ force: true });
await expect
.poll(
async () => {
return page.evaluate<MindmapSnapshot>(() => {
const edgelessBlock = document.querySelector(
'affine-edgeless-root'
) as EdgelessRootBlockComponent;
const mindmaps = edgelessBlock?.gfx.gfxElements.filter(
(el: GfxModel) => 'type' in el && el.type === 'mindmap'
) as unknown as Array<{
id: string;
tree: {
children?: unknown[];
element: { text?: { toString(): string } };
};
}>;
const mindmap = mindmaps?.[0];
return {
count: mindmaps?.length ?? 0,
id: mindmap?.id ?? null,
childCount: mindmap?.tree.children?.length ?? 0,
};
});
},
{ timeout: 15_000 }
)
.toMatchObject({
count: 1,
});
const replacedMindmap = await page.evaluate<MindmapSnapshot>(() => {
const edgelessBlock = document.querySelector(
'affine-edgeless-root'
) as EdgelessRootBlockComponent;
const mindmaps = edgelessBlock?.gfx.gfxElements
.filter((el: GfxModel) => 'type' in el && el.type === 'mindmap')
.map((el: GfxModel) => el.id);
return mindmaps;
const mindmaps = edgelessBlock?.gfx.gfxElements.filter(
(el: GfxModel) => 'type' in el && el.type === 'mindmap'
) as unknown as Array<{
id: string;
tree: {
children?: unknown[];
element: { text?: { toString(): string } };
};
}>;
const mindmap = mindmaps?.[0];
return {
count: mindmaps?.length ?? 0,
id: mindmap?.id ?? null,
childCount: mindmap?.tree.children?.length ?? 0,
};
});
expect(mindmaps).toHaveLength(1);
expect(mindmaps?.[0]).not.toBe(id!);
expect(replacedMindmap.childCount).toBeGreaterThan(originalChildCount!);
expect(replacedMindmap.childCount).toBeGreaterThan(0);
});
});

View File

@@ -189,20 +189,13 @@ test.describe('AISettings/Embedding', () => {
await utils.settings.closeSettingsPanel(page);
await utils.chatPanel.makeChat(
page,
`What is Workspace${randomStr1}? What is Workspace${randomStr2}?`
);
const query = `Use semantic search across workspace and attached files, then tell me whether Workspace${randomStr1} is a cat or dog and whether Workspace${randomStr2} is a cat or dog. Answer with citations.`;
await utils.chatPanel.makeChat(page, query);
await utils.chatPanel.waitForHistory(page, [
{
role: 'user',
content: `What is Workspace${randomStr1}? What is Workspace${randomStr2}?`,
},
{
role: 'assistant',
status: 'success',
},
{ role: 'user', content: query },
{ role: 'assistant', status: 'success' },
]);
await expect(async () => {

View File

@@ -90,17 +90,34 @@ export class EditorUtils {
return answer;
}
private static createAction(page: Page, action: () => Promise<void>) {
private static createAction(
page: Page,
action: () => Promise<void>,
options?: { responseTimeoutMs?: number }
) {
return async () => {
const responseTimeoutMs = options?.responseTimeoutMs ?? 60000;
await action();
await this.waitForAiAnswer(page);
await page.getByTestId('ai-generating').waitFor({
state: 'hidden',
timeout: 2 * 60000,
});
const responses = new Set<string>();
const answer = await this.waitForAiAnswer(page);
const responsesMenu = answer.getByTestId('answer-responses');
await responsesMenu.isVisible();
await responsesMenu.scrollIntoViewIfNeeded({ timeout: 60000 });
await responsesMenu.waitFor({
state: 'visible',
timeout: responseTimeoutMs,
});
await responsesMenu.scrollIntoViewIfNeeded({
timeout: responseTimeoutMs,
});
await responsesMenu
.getByTestId('answer-insert-below-loading')
.waitFor({ state: 'hidden' });
.waitFor({ state: 'hidden', timeout: responseTimeoutMs });
if (await responsesMenu.getByTestId('answer-insert-below').isVisible()) {
responses.add('insert-below');
@@ -458,8 +475,10 @@ export class EditorUtils {
generateOutline: this.createAction(page, () =>
page.getByTestId('action-generate-outline').click()
),
generatePresentation: this.createAction(page, () =>
page.getByTestId('action-generate-presentation').click()
generatePresentation: this.createAction(
page,
() => page.getByTestId('action-generate-presentation').click(),
{ responseTimeoutMs: 120000 }
),
imageProcessing: this.createAction(page, () =>
page.getByTestId('action-image-processing').click()
@@ -634,8 +653,10 @@ export class EditorUtils {
generateOutline: this.createAction(page, () =>
page.getByTestId('action-generate-outline').click()
),
generatePresentation: this.createAction(page, () =>
page.getByTestId('action-generate-presentation').click()
generatePresentation: this.createAction(
page,
() => page.getByTestId('action-generate-presentation').click(),
{ responseTimeoutMs: 120000 }
),
imageProcessing: this.createAction(page, () =>
page.getByTestId('action-image-processing').click()

View File

@@ -0,0 +1,197 @@
import { test } from '@affine-test/kit/playwright';
import {
type CanvasRendererPerfSnapshot,
deleteEdgelessElements,
getCanvasRendererPerfSnapshot,
resetCanvasRendererPerfMetrics,
seedEdgelessPerfScene,
} from '@affine-test/kit/utils/edgeless-perf';
import {
clickEdgelessModeButton,
dragView,
fitViewportToContent,
getEdgelessSelectedIds,
getSelectedXYWH,
locateEditorContainer,
setEdgelessTool,
setViewportZoom,
} from '@affine-test/kit/utils/editor';
import { openHomePage } from '@affine-test/kit/utils/load-page';
import {
clickNewPageButton,
waitForEditorLoad,
} from '@affine-test/kit/utils/page-logic';
import { expect } from '@playwright/test';
const PERF_ENV = 'AFFINE_RUN_PERF_E2E';
const perfEnabled = process.env[PERF_ENV] === '1';
const modKey = process.platform === 'darwin' ? 'Meta' : 'Control';
type PerfScenarioResult = {
name: string;
snapshot: CanvasRendererPerfSnapshot;
};
test.describe.serial('canvas renderer perf probes', () => {
test.skip(!perfEnabled, `Set ${PERF_ENV}=1 to run manual perf probes`);
test.beforeEach(async ({ page }) => {
await openHomePage(page);
await waitForEditorLoad(page);
await clickNewPageButton(page);
await clickEdgelessModeButton(page);
await locateEditorContainer(page).click();
});
test('collect metrics for common edgeless canvas scenarios', async ({
page,
}, testInfo) => {
test.slow();
const results: PerfScenarioResult[] = [];
let addedShapeIds: string[] = [];
const selectWholePerfScene = async () => {
await setEdgelessTool(page, 'default');
await dragView(page, [80, 140], [2300, 1500]);
await expect
.poll(async () => (await getEdgelessSelectedIds(page)).length)
.toBeGreaterThan(0);
};
const recordScenario = async (
name: string,
action: () => Promise<void>
) => {
await resetCanvasRendererPerfMetrics(page);
await action();
await page.waitForTimeout(400);
const snapshot = await getCanvasRendererPerfSnapshot(page);
results.push({ name, snapshot });
console.log(
`[canvas-perf] ${name}: ${JSON.stringify(snapshot.metrics, null, 2)}`
);
};
const initial = await seedEdgelessPerfScene(page, {
shapeCount: 120,
rowLength: 12,
startX: 120,
startY: 180,
width: 160,
height: 120,
});
addedShapeIds = initial.shapeIds;
await fitViewportToContent(page);
await page.waitForTimeout(500);
await recordScenario('add-shapes', async () => {
const seeded = await seedEdgelessPerfScene(page, {
shapeCount: 40,
rowLength: 10,
startX: 160,
startY: 1720,
width: 160,
height: 120,
});
addedShapeIds = addedShapeIds.concat(seeded.shapeIds);
await fitViewportToContent(page);
});
await recordScenario('delete-shapes', async () => {
await deleteEdgelessElements(page, addedShapeIds.slice(-20));
});
await recordScenario('box-select', async () => {
await selectWholePerfScene();
});
await recordScenario('group-selection', async () => {
await selectWholePerfScene();
await page.keyboard.press(`${modKey}+g`);
});
await recordScenario('ungroup-selection', async () => {
await page.keyboard.press(`${modKey}+Shift+g`);
});
await recordScenario('large-drag-selection', async () => {
await selectWholePerfScene();
const [x, y, w, h] = await getSelectedXYWH(page);
const center: [number, number] = [x + w / 2, y + h / 2];
await dragView(page, center, [center[0] + 1200, center[1] + 900]);
});
await recordScenario('large-pan', async () => {
await setEdgelessTool(page, 'pan');
await dragView(page, [1200, 900], [200, 180]);
});
await recordScenario('large-zoom', async () => {
await setViewportZoom(page, 0.25);
await page.waitForTimeout(200);
await setViewportZoom(page, 2.2);
await page.waitForTimeout(200);
await fitViewportToContent(page);
});
const finalSnapshot = await getCanvasRendererPerfSnapshot(page);
expect(finalSnapshot.rendererType).toBe('CanvasRenderer');
expect(results.length).toBeGreaterThanOrEqual(7);
await testInfo.attach('canvas-renderer-perf-scenarios.json', {
body: JSON.stringify(results, null, 2),
contentType: 'application/json',
});
});
test('collect metrics for interleaved block and canvas layers', async ({
page,
}, testInfo) => {
test.slow();
await seedEdgelessPerfScene(page, {
interleaved: true,
noteCount: 21,
shapeCount: 20,
rowLength: 1,
startX: 120,
startY: 180,
width: 180,
height: 120,
});
await fitViewportToContent(page);
await page.waitForTimeout(500);
const snapshot = await getCanvasRendererPerfSnapshot(page);
const metrics = snapshot.metrics as {
canvasMemoryMegabytes?: number;
lastRenderMetrics?: {
renderByBoundCallCount?: number;
};
stackingCanvasCount?: number;
visibleStackingCanvasCount?: number;
} | null;
console.log(
`[canvas-perf] interleaved-layers: ${JSON.stringify(snapshot, null, 2)}`
);
expect(snapshot.rendererType).toBe('CanvasRenderer');
expect(metrics).not.toBeNull();
expect(metrics?.stackingCanvasCount ?? 0).toBeGreaterThan(0);
expect(
metrics?.lastRenderMetrics?.renderByBoundCallCount ?? 0
).toBeGreaterThan(1);
expect(metrics?.visibleStackingCanvasCount ?? 0).toBeGreaterThan(0);
expect(metrics?.canvasMemoryMegabytes ?? 0).toBeLessThan(5);
await testInfo.attach('canvas-renderer-layering.json', {
body: JSON.stringify(snapshot, null, 2),
contentType: 'application/json',
});
});
});
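As a sketch outside the diff, the reset → act → snapshot loop that `recordScenario` implements can be shown in isolation. Everything below is illustrative rather than part of the test kit: a plain counter stands in for the real canvas renderer probes, and the actions are synchronous here, whereas the real helper is async and lets the page settle before snapshotting.

```typescript
// Illustrative stand-in for the reset → act → snapshot measurement pattern.
// A plain counter replaces the real renderer metrics (assumption, not the
// actual edgeless-perf API).
type Snapshot = { metrics: { ops: number } };

let ops = 0;
const resetMetrics = () => { ops = 0; };
const takeSnapshot = (): Snapshot => ({ metrics: { ops } });

const results: { name: string; snapshot: Snapshot }[] = [];

function recordScenario(name: string, action: () => void) {
  resetMetrics(); // measure each scenario from zero
  action(); // the interaction under test
  results.push({ name, snapshot: takeSnapshot() });
}

recordScenario('add-shapes', () => { ops += 3; });
recordScenario('delete-shapes', () => { ops += 1; });
console.log(results.map(r => `${r.name}=${r.snapshot.metrics.ops}`).join(','));
// prints "add-shapes=3,delete-shapes=1"
```

Because each scenario resets the metric source first, the snapshots stay independent of run order, which is why the spec above can attach the whole `results` array as one JSON artifact.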


@@ -280,6 +280,27 @@ export async function loginUserDirectly(
}
}
async function dismissBlockingModal(page: Page) {
const modal = page.locator('modal-transition-container [data-modal="true"]');
if (
!(await modal
.first()
.isVisible()
.catch(() => false))
) {
return;
}
const closeButton = page.getByTestId('modal-close-button').last();
if (await closeButton.isVisible().catch(() => false)) {
await closeButton.click({ timeout: 5000 });
} else {
await page.keyboard.press('Escape');
}
await expect(modal.first()).toBeHidden({ timeout: 10000 });
}
export async function enableCloudWorkspace(page: Page) {
await clickSideBarSettingButton(page);
await page.getByTestId('workspace-setting:preference').click();
@@ -288,6 +309,7 @@ export async function enableCloudWorkspace(page: Page) {
// wait for upload and delete local workspace
await page.waitForTimeout(2000);
await waitForAllPagesLoad(page);
await dismissBlockingModal(page);
await clickNewPageButton(page);
}
@@ -303,6 +325,7 @@ export async function enableCloudWorkspaceFromShareButton(page: Page) {
// wait for upload and delete local workspace
await page.waitForTimeout(2000);
await waitForEditorLoad(page);
await dismissBlockingModal(page);
await clickNewPageButton(page);
}

Some files were not shown because too many files have changed in this diff.