Commit f7bc6d2
feat: update origin 20.2.0 (#1048)
* fix: migrate new files

* fix: migrate untranslated files

* fix: migrate translated file changes

  - Add Design Patterns navigation item to sub-navigation-data.ts
  - Remove AI patterns streaming section from ai/overview.md to match English source

  🤖 Generated with [Claude Code](https://claude.ai/code)

* feat: update origin 20.2.0

  Update origin submodule and English source files for version 20.2.0

  🤖 Generated with [Claude Code](https://claude.ai/code)
1 parent b030792 commit f7bc6d2

File tree: 8 files changed, +187 −111 lines changed

adev-ja/src/app/sub-navigation-data.en.ts

Lines changed: 5 additions & 0 deletions
```diff
@@ -685,6 +685,11 @@ const DOCS_SUB_NAVIGATION_DATA: NavigationItem[] = [
       path: 'ai/develop-with-ai',
       contentPath: 'ai/develop-with-ai',
     },
+    {
+      label: 'Design Patterns',
+      path: 'ai/design-patterns',
+      contentPath: 'ai/design-patterns',
+    },
     {
       label: 'Angular CLI MCP Server setup',
       path: 'ai/mcp',
```

adev-ja/src/app/sub-navigation-data.ts

Lines changed: 5 additions & 0 deletions
```diff
@@ -685,6 +685,11 @@ const DOCS_SUB_NAVIGATION_DATA: NavigationItem[] = [
       path: 'ai/develop-with-ai',
       contentPath: 'ai/develop-with-ai',
     },
+    {
+      label: '設計パターン',
+      path: 'ai/design-patterns',
+      contentPath: 'ai/design-patterns',
+    },
     {
       label: 'Angular CLI MCPサーバーセットアップ',
       path: 'ai/mcp',
```
adev-ja/src/content/ai/design-patterns.en.md

Lines changed: 172 additions & 0 deletions

@@ -0,0 +1,172 @@
# Design patterns for AI SDKs and signal APIs

Interacting with AI and Large Language Model (LLM) APIs introduces unique challenges, such as managing asynchronous operations, handling streaming data, and designing a responsive user experience for potentially slow or unreliable network requests. Angular [signals](guide/signals) and the [`resource`](guide/signals/resource) API provide powerful tools to solve these problems elegantly.

## Triggering requests with signals

A common pattern when working with user-provided prompts is to separate the user's live input from the submitted value that triggers the API call.
1. Store the user's raw input in one signal as they type.
2. When the user submits (e.g., by clicking a button), update a second signal with the contents of the first signal.
3. Use the second signal in the **`params`** field of your `resource`.
This setup ensures the resource's **`loader`** function only runs when the user explicitly submits their prompt, not on every keystroke. You can use additional signal parameters, like a `sessionId` or `userId` (which can be useful for creating persistent LLM sessions), in the `loader` field. This way, the request always uses these parameters' current values without re-triggering the asynchronous function defined in the `loader` field.
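As a minimal sketch of the first two steps (the component structure and the `submitPrompt` name are illustrative, not part of the Angular API):

```ts
import { Component, signal } from '@angular/core';

@Component({
  selector: 'app-story',
  template: `
    <input #prompt (input)="userInput.set(prompt.value)" />
    <button (click)="submitPrompt()">Generate</button>
  `,
})
export class StoryComponent {
  // Step 1: the user's raw input, updated on every keystroke.
  readonly userInput = signal('');
  // Step 2: the submitted value; the resource's params reads this signal.
  readonly storyInput = signal('');

  submitPrompt() {
    // Copy the live value across only on submit, so the resource's
    // loader runs once per submission rather than on every keystroke.
    this.storyInput.set(this.userInput());
  }
}
```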
Many AI SDKs provide helper methods for making API calls. For example, the Genkit client library exposes a `runFlow` method for calling Genkit flows, which you can call from a resource's `loader`. For other APIs, you can use the [`httpResource`](guide/signals/resource#reactive-data-fetching-with-httpresource).
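For illustration, a minimal `httpResource` version of the same trigger pattern might look like the following sketch; the `/api/generate` endpoint and the response shape are hypothetical:

```ts
import { httpResource } from '@angular/common/http';

// Hypothetical endpoint and response shape; substitute your own API.
story = httpResource<{storyParts: string[]}>(() => ({
  url: '/api/generate',
  method: 'POST',
  // Re-evaluated reactively: a new request fires when storyInput changes.
  body: {prompt: this.storyInput()},
}));
```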
The following example shows a `resource` that fetches parts of an AI-generated story. The `loader` is triggered only when the `storyInput` signal changes.
```ts
// A resource that fetches three parts of an AI generated story
storyResource = resource({
  // The default value to use before the first request or on error
  defaultValue: DEFAULT_STORY,
  // The loader is re-triggered when this signal changes
  params: () => this.storyInput(),
  // The async function to fetch data
  loader: ({params}): Promise<StoryData> => {
    // The params value is the current value of the storyInput signal
    const url = this.endpoint();
    return runFlow({ url, input: {
      userInput: params,
      sessionId: this.storyService.sessionId() // Read from another signal
    }});
  }
});
```
## Preparing LLM data for templates

You can configure LLM APIs to return structured data. Strongly typing your `resource` to match the expected output from the LLM provides better type safety and editor autocompletion.
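For example, the `StoryData` type used by `storyResource` above could be modeled as an interface mirroring the structured output you request from the model (the exact fields here are illustrative):

```ts
// Illustrative shape; align it with the structured output schema
// you configure for the LLM call.
interface StoryData {
  title: string;
  storyParts: string[];
}
```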
To manage state derived from a resource, use a `computed` signal or `linkedSignal`. Because `linkedSignal` [provides access to prior values](guide/signals/linked-signal), it can serve a variety of AI-related use cases, including:

* building a chat history
* preserving or customizing data that templates display while LLMs generate content
In the example below, `storyParts` is a `linkedSignal` that appends the latest story parts returned from `storyResource` to the existing array of story parts.
```ts
storyParts = linkedSignal<string[], string[]>({
  // The source signal that triggers the computation
  source: () => this.storyResource.value().storyParts,
  // The computation function
  computation: (newStoryParts, previous) => {
    // Get the previous value of this linkedSignal, or an empty array
    const existingStoryParts = previous?.value || [];
    // Return a new array with the old and new parts
    return [...existingStoryParts, ...newStoryParts];
  }
});
```
61+
62+
## Performance and user experience
63+
64+
LLM APIs may be slower and more error-prone than conventional, more deterministic APIs. You can use several Angular features to build a performant and user-friendly interface.
* **Scoped Loading:** place the `resource` in the component that directly uses the data. This helps limit change detection cycles (especially in zoneless applications) and prevents blocking other parts of your application. If data needs to be shared across multiple components, provide the `resource` from a service (see the service sketch after this list).
* **SSR and Hydration:** use Server-Side Rendering (SSR) with incremental hydration to render the initial page content quickly. You can show a placeholder for the AI-generated content and defer fetching the data until the component hydrates on the client (see the template sketch after the image example below).
* **Loading State:** use the `resource` `LOADING` [status](guide/signals/resource#resource-status) to show an indicator, like a spinner, while the request is in flight. This status covers both initial loads and reloads.
* **Error Handling and Retries:** use the `resource` [**`reload()`**](guide/signals/resource#reloading) method as a simple way for users to retry failed requests, which may be more common when relying on AI-generated content.
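As a sketch of the service approach (the names are illustrative, and `fetchStory` is a hypothetical helper, not an Angular API):

```ts
import { Injectable, resource, signal } from '@angular/core';

@Injectable({providedIn: 'root'})
export class StoryService {
  readonly storyInput = signal('');

  // A single resource instance shared by every component that injects
  // this service, instead of one resource per component.
  readonly storyResource = resource({
    params: () => this.storyInput(),
    loader: ({params}) => fetchStory(params), // hypothetical fetch helper
  });
}
```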
The following example demonstrates how to create a responsive UI that dynamically displays an AI-generated image with loading and retry functionality.
```angular-html
<!-- Display a loading spinner while the LLM generates the image -->
@if (imgResource.isLoading()) {
  <div class="img-placeholder">
    <mat-spinner [diameter]="50" />
  </div>
<!-- Dynamically populates the src attribute with the generated image URL -->
} @else if (imgResource.hasValue()) {
  <img [src]="imgResource.value()" />
<!-- Provides a retry option if the request fails -->
} @else {
  <div class="img-placeholder" (click)="imgResource.reload()">
    <mat-icon fontIcon="refresh" />
    <p>Failed to load image. Click to retry.</p>
  </div>
}
```
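For the SSR and hydration bullet above, a minimal sketch might defer hydration of the component that owns the resource. This assumes incremental hydration is enabled via `provideClientHydration(withIncrementalHydration())`, and `<story-display>` is a hypothetical component:

```angular-html
<!-- The block is server-rendered; it hydrates (and becomes interactive)
     only when it scrolls into view. -->
@defer (hydrate on viewport) {
  <story-display [story]="storyResource.value()" />
} @placeholder {
  <p>Your story will appear here.</p>
}
```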
## AI patterns in action: streaming chat responses

Interfaces often display partial results from LLM-based APIs incrementally as response data arrives. Angular's resource API provides the ability to stream responses to support this type of pattern. The `stream` property of `resource` accepts an asynchronous function you can use to apply updates to a signal value over time. The signal being updated represents the data being streamed.
94+
95+
```ts
96+
characters = resource({
97+
stream: async () => {
98+
const data = signal<ResourceStreamItem<string>>({value: ''});
99+
// Calls a Genkit streaming flow using the streamFlow method
100+
// expose by the Genkit client SDK
101+
const response = streamFlow({
102+
url: '/streamCharacters',
103+
input: 10
104+
});
105+
106+
(async () => {
107+
for await (const chunk of response.stream) {
108+
data.update((prev) => {
109+
if ('value' in prev) {
110+
return { value: `${prev.value} ${chunk}` };
111+
} else {
112+
return { error: chunk as unknown as Error };
113+
}
114+
});
115+
}
116+
})();
117+
118+
return data;
119+
}
120+
});
121+
```
The `characters` member is updated asynchronously and can be displayed in the template.
124+
125+
```angular-html
126+
@if (characters.isLoading()) {
127+
<p>Loading...</p>
128+
} @else if (characters.hasValue()) {
129+
<p>{{characters.value()}}</p>
130+
} @else {
131+
<p>{{characters.error()}}</p>
132+
}
133+
```
On the server side, in `server.ts` for example, the defined endpoint sends the data to be streamed to the client. The following code uses Gemini with the Genkit framework, but this technique is applicable to other APIs that support streaming responses from LLMs:
```ts
import { startFlowServer } from '@genkit-ai/express';
// z is Genkit's re-exported Zod instance, needed for the schemas below
import { genkit, z } from "genkit/beta";
import { googleAI, gemini20Flash } from "@genkit-ai/googleai";

const ai = genkit({ plugins: [googleAI()] });

export const streamCharacters = ai.defineFlow({
    name: 'streamCharacters',
    inputSchema: z.number(),
    outputSchema: z.string(),
    streamSchema: z.string(),
  },
  async (count, { sendChunk }) => {
    const { response, stream } = ai.generateStream({
      model: gemini20Flash,
      config: {
        temperature: 1,
      },
      prompt: `Generate ${count} different RPG game characters.`,
    });

    (async () => {
      for await (const chunk of stream) {
        sendChunk(chunk.content[0].text!);
      }
    })();

    return (await response).text;
  });

startFlowServer({
  flows: [streamCharacters],
});
```

adev-ja/src/content/ai/mcp-server-setup.en.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -136,4 +136,4 @@ The Angular CLI MCP server provides several tools to assist you in your developm
 
 ## Feedback and New Ideas
 
-The Angular team welcomes your feedback on the existing MCP capabilities and any ideas you have for new tools or features. Please share your thoughts by opening an issue on the [angular/angular GitHub repository](https://github.com/angular/angular/issues).
+The Angular team welcomes your feedback on the existing MCP capabilities and any ideas you have for new tools or features. Please share your thoughts by opening an issue on the [angular/angular GitHub repository](https://github.com/angular/angular/issues).
```

adev-ja/src/content/ai/overview.en.md

Lines changed: 0 additions & 54 deletions
````diff
@@ -60,60 +60,6 @@ The [Gemini API](https://ai.google.dev/gemini-api/docs) provides access to state
 
 * [AI Chatbot app template](https://github.com/FirebaseExtended/firebase-framework-tools/tree/main/starters/angular/ai-chatbot) - This template starts with a chatbot user interface that communicates with the Gemini API via HTTP.
 
-## AI patterns in action: Streaming chat responses
-Having text appear as the response is received from the model is a common UI pattern for web apps using AI. You can achieve this asynchronous task with Angular's `resource` API. The `stream` property of `resource` accepts an asynchronous function you can use to apply updates to a signal value over time. The signal being updated represents the data being streamed.
-
-```ts
-characters = resource({
-  stream: async () => {
-    const data = signal<{ value: string } | { error: unknown }>({
-      value: "",
-    });
-
-    fetch(this.url).then(async (response) => {
-      if (!response.body) return;
-
-      for await (const chunk of response.body) {
-        const chunkText = this.decoder.decode(chunk);
-        data.update((prev) => {
-          if ("value" in prev) {
-            return { value: `${prev.value} ${chunkText}` };
-          } else {
-            return { error: chunkText };
-          }
-        });
-      }
-    });
-
-    return data;
-  },
-});
-```
-
-The `characters` member is updated asynchronously and can be displayed in the template.
-
-```html
-<p>{{ characters.value() }}</p>
-```
-
-On the server side, in `server.ts` for example, the defined endpoint sends the data to be streamed to the client. The following code uses the Gemini API but this technique is applicable to other tools and frameworks that support streaming responses from LLMs:
-
-```ts
-app.get("/api/stream-response", async (req, res) => {
-  ai.models.generateContentStream({
-    model: "gemini-2.0-flash",
-    contents: "Explain how AI works",
-  }).then(async (response) => {
-    for await (const chunk of response) {
-      res.write(chunk.text);
-    }
-  });
-});
-```
-
-This example connects to the Gemini API but other APIs that support streaming responses can be used here as well. [You can find the complete example on the Angular Github](https://github.com/angular/examples/tree/main/streaming-example).
-
 ## Best Practices
 ### Connecting to model providers and keeping your API Credentials Secure
 When connecting to model providers, it is important to keep your API secrets safe. *Never put your API key in a file that ships to the client, such as `environments.ts`*.
````

adev-ja/src/content/ai/overview.md

Lines changed: 0 additions & 54 deletions
````diff
@@ -60,60 +60,6 @@ Firebase AI LogicとAngularで構築する方法の例を次に示します。
 
 * [AIチャットボットアプリテンプレート](https://github.com/FirebaseExtended/firebase-framework-tools/tree/main/starters/angular/ai-chatbot) - このテンプレートは、HTTP経由でGemini APIと通信するチャットボットユーザーインターフェースから始まります。
 
-## AIパターン実践: チャット応答のストリーミング {#ai-patterns-in-action-streaming-chat-responses}
-モデルから応答が受信されるにつれてテキストが表示されるのは、AIを使用するWebアプリケーションで一般的なUIパターンです。この非同期タスクはAngularの`resource` APIで実現できます。`resource`の`stream`プロパティは、時間の経過とともにシグナル値に更新を適用するために使用できる非同期関数を受け入れます。更新されるシグナルは、ストリーミングされるデータを表します。
-
-```ts
-characters = resource({
-  stream: async () => {
-    const data = signal<{ value: string } | { error: unknown }>({
-      value: "",
-    });
-
-    fetch(this.url).then(async (response) => {
-      if (!response.body) return;
-
-      for await (const chunk of response.body) {
-        const chunkText = this.decoder.decode(chunk);
-        data.update((prev) => {
-          if ("value" in prev) {
-            return { value: `${prev.value} ${chunkText}` };
-          } else {
-            return { error: chunkText };
-          }
-        });
-      }
-    });
-
-    return data;
-  },
-});
-```
-
-`characters`メンバーは非同期で更新され、テンプレートに表示できます。
-
-```html
-<p>{{ characters.value() }}</p>
-```
-
-サーバー側では、例えば`server.ts`で、定義されたエンドポイントがクライアントにストリーミングされるデータを送信します。以下のコードはGemini APIを使用していますが、この手法はLLMからのストリーミング応答をサポートする他のツールやフレームワークにも適用可能です。
-
-```ts
-app.get("/api/stream-response", async (req, res) => {
-  ai.models.generateContentStream({
-    model: "gemini-2.0-flash",
-    contents: "Explain how AI works",
-  }).then(async (response) => {
-    for await (const chunk of response) {
-      res.write(chunk.text);
-    }
-  });
-});
-```
-
-この例はGemini APIに接続していますが、ストリーミング応答をサポートする他のAPIもここで使用できます。[完全な例はAngularのGithubで見つけることができます](https://github.com/angular/examples/tree/main/streaming-example)。
-
 ## ベストプラクティス
 ### モデルプロバイダーへの接続とAPI認証情報の保護 {#connecting-to-model-providers-and-keeping-your-api-credentials-secure}
 モデルプロバイダーに接続する際は、APIシークレットを安全に保つことが重要です。*APIキーを`environments.ts`のようなクライアントに配布されるファイルに決して含めないでください*。
````

adev-ja/src/content/guide/routing/data-resolvers.md

Lines changed: 3 additions & 1 deletion
````diff
@@ -266,8 +266,10 @@ While data resolvers prevent loading states within components, they introduce a
 To improve user experience during resolver execution, you can listen to router events and show loading indicators:
 
 ```angular-ts
-import { Component, inject, computed } from '@angular/core';
+import { Component, inject } from '@angular/core';
 import { Router } from '@angular/router';
+import { toSignal } from '@angular/core/rxjs-interop';
+import { map } from 'rxjs';
 
 @Component({
   selector: 'app-root',
````
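The added imports suggest converting the router's event stream into a signal with `toSignal`. The snippet below is a plausible completion of the truncated component, offered as an illustrative sketch rather than the repository's actual code (it also uses `filter`, which the visible hunk does not add):

```ts
import { Component, inject } from '@angular/core';
import {
  Router,
  RouterOutlet,
  NavigationStart,
  NavigationEnd,
  NavigationCancel,
  NavigationError,
} from '@angular/router';
import { toSignal } from '@angular/core/rxjs-interop';
import { filter, map } from 'rxjs';

@Component({
  selector: 'app-root',
  imports: [RouterOutlet],
  template: `
    @if (isNavigating()) {
      <div class="loading-indicator">Loading...</div>
    }
    <router-outlet />
  `,
})
export class AppComponent {
  private readonly router = inject(Router);

  // True from NavigationStart until the navigation settles, which covers
  // the time spent running route data resolvers.
  readonly isNavigating = toSignal(
    this.router.events.pipe(
      map((event) => {
        if (event instanceof NavigationStart) return true;
        if (
          event instanceof NavigationEnd ||
          event instanceof NavigationCancel ||
          event instanceof NavigationError
        ) {
          return false;
        }
        return null; // Intermediate events: leave the state unchanged
      }),
      filter((value): value is boolean => value !== null),
    ),
    { initialValue: false },
  );
}
```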
