Commit eecc5da — Fix build suggestions (#42787)

1 parent c0f54c6 commit eecc5da

9 files changed: +156 −158 lines

docs/ai/conceptual/understanding-tokens.md (+1 −1)

````diff
@@ -81,7 +81,7 @@ As training continues, the model adds any new tokens in the training text to its
 
 The semantic relationships between the tokens can be analyzed by using these token ID sequences. Multi-valued numeric vectors, known as [embeddings](embeddings.md), are used to represent these relationships. An embedding is assigned to each token based on how commonly it's used together with, or in similar contexts to, the other tokens.
 
-After it's trained, a model can calculate an embedding for text that contains multiple tokens. The model tokenizes the text, then calculates an overall embeddings value based on the learned embeddings of the individual tokens. This technique can be used for semantic document searches or adding [memories](/semantic-kernel/memories/) to an AI.
+After it's trained, a model can calculate an embedding for text that contains multiple tokens. The model tokenizes the text, then calculates an overall embeddings value based on the learned embeddings of the individual tokens. This technique can be used for semantic document searches or adding [vector stores](/semantic-kernel/concepts/vector-store-connectors/) to an AI.
 
 During output generation, the model predicts a vector value for the next token in the sequence. The model then selects the next token from it's vocabulary based on this vector value. In practice, the model calculates multiple vectors by using various elements of the previous tokens' embeddings. The model then evaluates all potential tokens from these vectors and selects the most probable one to continue the sequence.
 
````
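The paragraph being edited above describes deriving an overall embedding for multi-token text from the learned per-token embeddings. A minimal, library-free sketch of that idea follows; the toy vocabulary, whitespace tokenization, and three-dimensional vectors are illustrative assumptions, not a real model's tokenizer or embedding space:

```python
# Toy sketch: derive a text embedding by averaging learned token embeddings.
# The vocabulary and 3-dimensional vectors below are invented for illustration.

token_embeddings = {
    "the": [0.1, 0.0, 0.2],
    "cat": [0.9, 0.3, 0.1],
    "sat": [0.2, 0.8, 0.4],
}

def embed_text(text: str) -> list[float]:
    """Tokenize by whitespace and average the tokens' embedding vectors."""
    vectors = [token_embeddings[tok] for tok in text.lower().split()]
    dims = len(vectors[0])
    return [sum(v[d] for v in vectors) / len(vectors) for d in range(dims)]

print(embed_text("the cat sat"))  # one vector representing the whole text
```

Real models combine token embeddings in far more sophisticated ways, but the averaging step conveys why a single vector can stand in for a whole passage during semantic search.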

docs/ai/how-to/app-service-db-auth.md (+5 −5)

````diff
@@ -13,7 +13,7 @@ zone_pivot_groups: azure-interface
 
 This article demonstrates how to manage the connection between your App Service .NET application and a [vector database solution](../conceptual/vector-databases.md). It covers using Microsoft Entra managed identities for supported services and securely storing connection strings for others.
 
-By adding a vector database to your application, you can enable [semantic memories](/semantic-kernel/memories/) for your AI. The [Semantic Kernel SDK](/semantic-kernel/overview) for .NET enables you to easily implement memory storage and recall using your preferred vector database solution.
+By adding a vector database to your application, you can enable [semantic memories or *vector stores*]([vector stores](/semantic-kernel/concepts/vector-store-connectors/)) for your AI. The [Semantic Kernel SDK](/semantic-kernel/overview) for .NET enables you to easily implement memory storage and recall using your preferred vector database solution.
 
 ## Prerequisites
 
@@ -22,7 +22,7 @@ By adding a vector database to your application, you can enable [semantic memori
 * [`Microsoft.SemanticKernel` NuGet package](https://www.nuget.org/packages/Microsoft.SemanticKernel)
 * [`Microsoft.SemanticKernel.Plugins.Memory` NuGet package](https://www.nuget.org/packages/Microsoft.SemanticKernel.Plugins.Memory)
 * [Create and deploy a .NET application to App Service](/azure/app-service/quickstart-dotnetcore)
-* [Create and deploy a vector database solution](/semantic-kernel/memories/vector-db)
+* [Create and deploy a vector database solution](/semantic-kernel/concepts/ai-services/integrations#vector-database-solutions)
 
 ## Use Microsoft Entra managed identity for authentication
 
@@ -190,7 +190,7 @@ Before following these steps, retrieve a connection string for your vector datab
 > [!IMPORTANT]
 > Before following these steps, ensure you have [created a Key Vault using the Azure CLI](/azure/key-vault/general/quick-create-cli).
 
-1. Grant your user account permissions to your key vault through Role-Based Access Control (RBAC), assign a role using the Azure CLI command [`az role assignment create`](/cli/azure/role/assignment?view=azure-cli-latest#az-role-assignment-create):
+1. Grant your user account permissions to your key vault through Role-Based Access Control (RBAC), assign a role using the Azure CLI command [`az role assignment create`](/cli/azure/role/assignment#az-role-assignment-create):
 
 ```azurecli
 az role assignment create \
@@ -199,7 +199,7 @@ Before following these steps, retrieve a connection string for your vector datab
 --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>/providers/Microsoft.KeyVault/vaults/<keyVaultName>"
 ```
 
-1. Add the connection string to Key Vault using the Azure CLI command [`az keyvault secret set`](/cli/azure/keyvault/secret?view=azure-cli-latest#az-keyvault-secret-set):
+1. Add the connection string to Key Vault using the Azure CLI command [`az keyvault secret set`](/cli/azure/keyvault/secret#az-keyvault-secret-set):
 
 ```azurecli
 az keyvault secret set \
@@ -271,7 +271,7 @@ Before following these steps, retrieve a connection string for your vector datab
 
 :::zone target="docs" pivot="azure-cli"
 
-Add or edit an app setting with the Azure CLI command [`az webapp config connection-string set`](/cli/azure/webapp/config/connection-string?view=azure-cli-latest#az-webapp-config-connection-string-set):
+Add or edit an app setting with the Azure CLI command [`az webapp config connection-string set`](/cli/azure/webapp/config/connection-string#az-webapp-config-connection-string-set):
 
 ```azurecli
 az webapp config connection-string set \
````

docs/ai/how-to/use-redis-for-memory.md (+1 −1)

````diff
@@ -13,7 +13,7 @@ ms.date: 04/17/2024
 
 This article demonstrates how to integrate a Redis database with the RediSearch module into the [Semantic Kernel SDK](/semantic-kernel/overview) and use it for memory storage and retrieval.
 
-[Memories](/semantic-kernel/memories/) represent text information that has been stored alongside a precomputed embedding vector for the whole text. When an LLM is prompted to recall a memory, it uses these precomputed embeddings to efficiently evaluate whether a memory is relevant to the prompt. After the LLM finds a matching memory, it uses the memory's text information as context for the next steps in the prompt completion.
+[Vector stores](/semantic-kernel/concepts/vector-store-connectors/) represent text information that has been stored alongside a precomputed embedding vector for the whole text. When an LLM is prompted to recall a memory, it uses these precomputed embeddings to efficiently evaluate whether a memory is relevant to the prompt. After the LLM finds a matching memory, it uses the memory's text information as context for the next steps in the prompt completion.
 
 Memory storage that's added to the Semantic Kernel SDK provides a broader context for your requests. It also enables you to store data in the same manner as you store a traditional database, but query it by using natural language.
 
````
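The paragraph this hunk edits (only the link changes) describes recalling stored text by comparing precomputed embeddings against the prompt's embedding. A rough, dependency-free sketch of that recall step follows; the stored texts, three-dimensional vectors, and relevance threshold are invented for illustration, and a real store would hold model-generated embeddings in Redis/RediSearch:

```python
import math

# Toy memory store: text stored alongside a precomputed embedding vector.
# Vectors here are made up; a real store holds embedding-model output.
memories = [
    ("Contoso invoices are due within 30 days.", [0.9, 0.1, 0.0]),
    ("The cafeteria closes at 3 PM on Fridays.", [0.0, 0.8, 0.6]),
]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def recall(prompt_embedding, min_relevance=0.5):
    """Return the stored text most similar to the prompt's embedding."""
    scored = [(cosine(prompt_embedding, vec), text) for text, vec in memories]
    best_score, best_text = max(scored)
    return best_text if best_score >= min_relevance else None

print(recall([0.95, 0.05, 0.0]))  # recalls the invoice memory
```

Because the embeddings are precomputed at storage time, recall is just a similarity scan (or an index lookup in RediSearch), which is why the evaluation is described as efficient.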

docs/ai/semantic-kernel-dotnet-overview.md (+10 −10)

````diff
@@ -15,7 +15,7 @@ In this article, you explore [Semantic Kernel](/semantic-kernel/overview) core c
 - How to add semantic kernel to your project
 - Semantic Kernel core concepts
 
-The sections ahead serve as an introductory overview of Semantic Kernel specifically in the context of .NET. For more comprehensive information and training about Semantic Kernel, see the following resources:
+This article serves as an introductory overview of Semantic Kernel specifically in the context of .NET. For more comprehensive information and training about Semantic Kernel, see the following resources:
 
 - [Semantic Kernel documentation](/semantic-kernel/overview)
 - [Semantic Kernel training](/training/paths/develop-ai-agents-azure-open-ai-semantic-kernel-sdk/)
@@ -115,7 +115,7 @@ string skPrompt = @"Summarize the provided unstructured text in a sentence that
 
 // Register the function
 kernel.CreateSemanticFunction(
-promptTemplate: skPrompt,
+promptTemplate: skPrompt,
 functionName: "SummarizeText",
 pluginName: "SemanticFunctions"
 );
@@ -132,9 +132,9 @@ The following code snippet defines and registers a native function:
 public class NativeFunctions {
 
 [SKFunction, Description("Retrieve content from local file")]
-public async Task<string> RetrieveLocalFile(string fileName, int maxSize = 5000)
+public async Task<string> RetrieveLocalFile(string fileName, int maxSize = 5000)
 {
-string content = await File.ReadAllTextAsync(fileName);
+string content = await File.ReadAllTextAsync(fileName);
 if (content.Length <= maxSize) return content;
 return content.Substring(0, maxSize);
 }
@@ -159,7 +159,7 @@ Consider the following pseudo-code snippet:
 
 // Configure and create the plan
 string planDefinition = "Read content from a local file and summarize the content.";
-SequentialPlanner sequentialPlanner = new SequentialPlanner(kernel);
+SequentialPlanner sequentialPlanner = new SequentialPlanner(kernel);
 
 string assetsFolder = @"../../assets";
 string fileName = Path.Combine(assetsFolder,"docs","06_SemanticKernel", "aci_documentation.txt");
@@ -178,21 +178,21 @@ The preceding code creates an executable, sequential plan to read content from a
 
 ### Memory
 
-Semantic Kernel's [Memory](/semantic-kernel/memories) provides abstractions over embedding models, vector databases, and other data to simplify context management for AI applications. Memory is agnostic to the underlying LLM or Vector DB, offering a uniform developer experience. You can configure memory features to store data in a variety of sources or service, including Azure AI Search, Azure Cache for Redis, and more.
+Semantic Kernel's [Vector stores](/semantic-kernel/concepts/vector-store-connectors/) provide abstractions over embedding models, vector databases, and other data to simplify context management for AI applications. Vector stores are agnostic to the underlying LLM or Vector database, offering a uniform developer experience. You can configure memory features to store data in a variety of sources or service, including Azure AI Search and Azure Cache for Redis.
 
 Consider the following code snippet:
 
 ```csharp
 var facts = new Dictionary<string,string>();
 facts.Add(
-"Azure Machine Learning; https://docs.microsoft.com/en-us/azure/machine-learning/",
-@"Azure Machine Learning is a cloud service for accelerating and
-managing the machine learning project lifecycle. Machine learning professionals,
+"Azure Machine Learning; https://learn.microsoft.com/en-us/azure/machine-learning/",
+@"Azure Machine Learning is a cloud service for accelerating and
+managing the machine learning project lifecycle. Machine learning professionals,
 data scientists, and engineers can use it in their day-to-day workflows"
 );
 
 facts.Add(
-"Azure SQL Service; https://docs.microsoft.com/en-us/azure/azure-sql/",
+"Azure SQL Service; https://learn.microsoft.com/en-us/azure/azure-sql/",
 @"Azure SQL is a family of managed, secure, and intelligent products
 that use the SQL Server database engine in the Azure cloud."
 );
````
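The `facts` dictionary in this hunk pairs a short description with explanatory text, and the surrounding article notes that such stored data can be queried with natural language. A toy sketch of that lookup idea follows, using simple word overlap as a stand-in for real embedding similarity; the scoring scheme and trimmed fact texts are illustrative assumptions, not Semantic Kernel's actual retrieval logic:

```python
# Toy natural-language lookup over stored facts. Word overlap stands in
# for embedding similarity; entries mirror the C# facts dictionary above.
facts = {
    "Azure Machine Learning": "a cloud service for accelerating and managing "
        "the machine learning project lifecycle",
    "Azure SQL Service": "a family of managed secure and intelligent products "
        "that use the SQL Server database engine in the Azure cloud",
}

def query(question: str) -> str:
    """Return the fact key whose key+text shares the most words with the question."""
    q_words = set(question.lower().split())

    def score(item):
        key, text = item
        return len(q_words & set((key + " " + text).lower().split()))

    best_key, _ = max(facts.items(), key=score)
    return best_key

print(query("Which service uses the SQL Server database engine?"))
```

A real implementation would embed both the question and each fact with a model and rank by vector similarity; the overlap score here only demonstrates the store-then-rank shape of the retrieval.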
