Content Developer II at Microsoft, working remotely in PA, TechBash conference organizer, former Microsoft MVP, Husband, Dad and Geek.

Learn Azure Together in Microsoft Learn Learning Room


In the summer of 2023, Microsoft Learn introduced the Microsoft Learn Learning Room, a new addition that allows community members worldwide to interact online and learn about technology. The Learning Room is part of the Microsoft Learn Community, and each Learning Room is managed by Microsoft Learn experts. It serves as an online community where users can connect with experts to gain deeper insight into Microsoft products and services.

 

One of the Microsoft Learn experts is Hamid Sadeghpour Saleh, a Microsoft Azure MVP from Azerbaijan. He manages the Learning Room alongside fellow Microsoft Learn experts Saeid Dahl and Mohsen Akhavan. We spoke with Hamid about the significance of participating in the Microsoft Zero to Hero Community Learning Room. Whether you are already a member of the Learning Room or have yet to join, we invite you to read this article and explore the new learning opportunities this community offers.

Hamid Sadeghpour Saleh.jpg

 

----------

Introduction of Microsoft Zero to Hero Community

The Microsoft Zero to Hero Community is a Learning Room on Microsoft Learn. We cover all aspects of Microsoft Cloud services, from Azure to Microsoft AI and Data. True to our room title, "Zero to Hero," we are here to grow together as passionate Microsoft Cloud experts.

 

Who are the members in the Microsoft Zero to Hero Community?

We have a variety of members in the Learning Room: students, learners, and new starters as well as experienced professionals, MVPs, Regional Directors, and Microsoft FTEs. Everyone contributes through AMA sessions, Q&A, blogging, and weekly and monthly sessions, and that shared learning is the main idea of the Learning Room.

 

What topics are discussed in the Microsoft Zero to Hero Community?

Since we cover Microsoft Cloud technologies in general, we host sessions ranging from Microsoft Azure services to Microsoft AI, Data, Security, and more. All of our activities and channels focus on technology in different ways and areas, which helps learners land on their favorite learning path, specialize, and get certified.

 

How do you support the skill development of members?

As an MVP and learning room owner, I support the skill development of members by facilitating interactive discussions, organizing workshops, sharing resources, and providing mentorship. I create a supportive environment where members can learn from each other's expertise, collaborate on projects, and stay updated on the latest technologies and best practices. Additionally, I offer personalized guidance and feedback to help members overcome challenges and achieve their learning goals.

 

Invitation for those who are interested in participating in your Learning Room

Join Microsoft Zero to Hero Learning Room!

Unlock endless opportunities for growth and collaboration. Engage in interactive discussions, attend workshops, and access resources curated for your success. Let's learn and excel together!

 

Resources for learners who want to join

- Link to join the Learning Room: https://aka.ms/JoinZerotoHero

- Learning Room LinkedIn: https://www.linkedin.com/company/azure-zero-to-hero/

- Learning Room YouTube Channel: https://www.youtube.com/@azurezerotohero

----------

 

For more insights into Azure skills development as discussed by Hamid, please refer to the article on the Microsoft Learn Blog: Discover Microsoft Azure learning, training, certifications, and career path opportunities - Microsoft Community Hub

 

In addition to the Microsoft Zero to Hero Community featured in this article, Microsoft Learn operates various Learning Rooms for different products, services, and languages. We encourage you to explore the Microsoft Learn - Learning Rooms Directory, find a Learning Room that matches your interests, connect with new peers, and deepen your technical knowledge as part of this learning community.

Learning Room Directory.jpg

 

Read the whole story
alvinashcraft
5 hours ago
reply
West Grove, PA
Share this story
Delete

Train a simple Recommendation Engine using the new Azure AI Studio


Hi, everyone! I am Paschal Alaemezie, a Gold Microsoft Learn Student Ambassador. I am a student at the Federal University of Technology, Owerri (FUTO). I am interested in Artificial Intelligence, Software Engineering, and Emerging technologies, and how to apply the knowledge from these technologies in writing and building cool solutions to the challenges we face. Feel free to connect with me on LinkedIn and GitHub or follow me on X (Twitter).

 

In my previous article, I wrote about Recommendation Engines and gave a walkthrough on how to train a simple recommendation engine using the Azure Machine Learning Designer via the Azure portal. In this article, I will give a walkthrough on how to replicate this training using Azure Machine Learning Designer via the new Azure AI Studio.

 

The new Azure AI Studio is a comprehensive platform designed to facilitate the development, management, and deployment of AI applications. It offers a user-friendly interface with drag-and-drop capabilities for model creation, alongside advanced features for model management and scalability. The platform supports automated machine learning to optimize model selection and tuning. It is suitable for creating custom AI solutions, including chatbots and other AI-driven applications, with a focus on collaboration, efficiency, and responsible AI practices. Azure AI Studio is available in public preview, providing a glimpse into the future of AI development tools.

 

An Azure subscription is required to carry out the activities in this article. If you are a student, you can use your university or school email to sign up for a free Azure for Students account and start building on the Azure cloud with a free $100 Azure credit.

 

Activity 1: Create a New Training Pipeline

Step 1: Setting up your Azure AI Studio workspace

  1. Open your web browser and go to ai.azure.com to open the new Azure AI Studio.

palaemezie_0-1713956744085.png

 

  2. Go to the Build tab in Azure AI Studio and click it to open the Build environment. Then click the + New project button to open the Create a project environment.

palaemezie_1-1713956744094.png

 

Step 2: Creating your project 

  1. For the Project details section:
    1. At Hub name, enter your preferred name for your project’s hub in the input box provided.
    2. At Subscription, select your existing subscription from the drop-down menu.
    3. Select your Resource group. If you have an existing resource group, select it from the drop-down menu. Otherwise, click Create new to create a new resource group, then click OK.
    4. At Location, select your location from the drop-down menu. Then click the Next button at the bottom of the screen to go to Review and finish.

palaemezie_2-1713956744099.png

 

  2. At the Review and finish section, click the Create a project button at the bottom of the screen to provision your workspace on Azure AI Studio.

palaemezie_3-1713956744106.png

 

  3. Your provisioned workspace will display the window below. Go to All Azure AI at the upper right of the screen and select Azure Machine Learning Studio from the drop-down menu.

 

palaemezie_4-1713956744112.png

 

 

  4. In the Azure Machine Learning studio, select Designer from the navigation pane on the left-hand side. This opens the Designer environment, where you can create a new pipeline if no pipeline exists.

palaemezie_5-1713956744137.png

 

  5. In the Designer environment, select the Classic prebuilt components tab. Then click Create a new pipeline using classic prebuilt components. This opens a visual pipeline authoring editor.

palaemezie_6-1713956744147.png

 

Step 3: Add Sample Datasets

  1. In the left navigation pane of the Authoring editor, click the Asset library and go to the Component section. Under Component, click on Sample data.

palaemezie_7-1713956744153.png

 

  2. In the Sample data list, scroll down to Movie Ratings and IMDB Movie Titles. Drag and drop both datasets onto the canvas.

palaemezie_8-1713956744159.png

 

Step 4: Join the two datasets on Movie ID

  1. Close the Sample data drop-down menu. From the Data Transformation section in the left navigation, select the Join Data prebuilt module, and drag and drop it onto the canvas.
    1. Connect the output of the Movie Ratings module to the first input of the Join Data module.
    2. Connect the output of the IMDB Movie Titles module to the second input of the Join Data module.

palaemezie_9-1713956744162.png

 

  2. Select the Join Data module. Click the navigation button at the upper right of the canvas to open the Join Data module window.

palaemezie_10-1713956744164.png

 

  3. Select the Edit column link to open the Join key columns for the left dataset editor. Select the MovieId column in the Enter column name field and click Save.

palaemezie_11-1713956744167.png

 

  4. Select the Edit column link to open the Join key columns for the right dataset editor. Select the Movie ID column in the Enter column name field and click Save. Then close the Join Data window.

palaemezie_12-1713956744170.png
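For readers who want to reproduce this step outside the Designer, the Join Data configuration above is equivalent to an inner merge on the two ID columns. A small pandas sketch (the column names follow the datasets in this walkthrough; the sample values are made up for illustration):

```python
import pandas as pd

# Toy stand-ins for the Movie Ratings and IMDB Movie Titles datasets.
ratings = pd.DataFrame({"UserId": [1, 2], "MovieId": [10, 20], "Rating": [5, 3]})
titles = pd.DataFrame({"Movie ID": [10, 20], "Movie Name": ["Inception", "Up"]})

# Join on MovieId (left dataset) and Movie ID (right dataset).
joined = ratings.merge(titles, left_on="MovieId", right_on="Movie ID", how="inner")
```

Each row of the result pairs a user's rating with the movie's title, which is what the downstream modules consume.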

 

Step 5: Select Columns UserId, Movie Name, and Rating using a Python script

  1. From the Python Language section in the left navigation, select the Execute Python Script prebuilt module. Drag and drop the selected module onto the canvas. Then, connect the Join Data output to the input of the Execute Python Script module.

palaemezie_13-1713956744174.png

 

  2. Select Edit code to open the Python script editor, clear the existing code, and then enter code that selects the UserId, Movie Name, and Rating columns from the joined dataset. Make sure the lines inside the function body stay indented.

palaemezie_14-1713956744180.png
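The script itself appears only in the screenshot above. A minimal version, assuming the Designer's standard azureml_main entry point and the column names used in this walkthrough, might look like this:

```python
import pandas as pd

# Entry point expected by the Execute Python Script module:
# dataframe1 receives the module's first input (the joined dataset).
def azureml_main(dataframe1=None, dataframe2=None):
    # Keep only the columns the recommender needs.
    df = dataframe1[["UserId", "Movie Name", "Rating"]]
    return df,
```

The module calls azureml_main for you and routes the returned tuple to the module's output ports.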

 

Step 6: Remove duplicate rows with the same Movie Name and UserId

  1. From the Data Transformation section in the left navigation pane, select the Remove Duplicate Rows prebuilt module, and drag and drop it onto the canvas.
    1. Connect the first output of the Execute Python Script module to the input of the Remove Duplicate Rows module.

palaemezie_15-1713956744182.png

 

    2. Click the navigation button at the upper right of the canvas to open the Remove Duplicate Rows module window. Then select the Edit column link to open the Select columns editor.

palaemezie_16-1713956744185.png

 

    3. Enter the following list of columns to be included in the output dataset: Movie Name, UserId. Then, click Save.

palaemezie_17-1713956744186.png
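In pandas terms (a hedged equivalent for illustration, not what the module runs internally), this step corresponds to dropping duplicates on the (Movie Name, UserId) pair:

```python
import pandas as pd

df = pd.DataFrame({
    "UserId": [1, 1, 2],
    "Movie Name": ["Up", "Up", "Up"],
    "Rating": [5, 4, 3],
})

# Keep the first row for each (Movie Name, UserId) combination.
deduped = df.drop_duplicates(subset=["Movie Name", "UserId"])
```

Only the first rating a user gave a movie survives; repeat ratings are discarded.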

 

Step 7: Split the dataset into a training set (0.5) and a test set (0.5)

  1. From the Data Transformation section in the left navigation, select the Split Data prebuilt module and drag and drop it onto the canvas. Then connect the Remove Duplicate Rows output to the Split Data module.

 

  2. Click the navigation button at the upper right of the canvas to open the Split Data module window. Ensure that Fraction of rows in the first output dataset is set to 0.5.

 

palaemezie_18-1713956744188.png
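Outside the Designer, a 0.5/0.5 random split like the one configured above can be sketched with pandas (illustrative only; the module's sampling details may differ):

```python
import pandas as pd

df = pd.DataFrame({
    "UserId": [1, 1, 2, 2],
    "Movie Name": ["Up", "Coco", "Up", "Brave"],
    "Rating": [5, 3, 4, 2],
})

# First output: a random half of the rows; second output: the rest.
train = df.sample(frac=0.5, random_state=42)
test = df.drop(train.index)
```

The two halves are disjoint, which is what lets the second output serve as an honest test set later.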

 

Step 8: Initialize Recommendation Module

  1.  From the Recommendation section in the left navigation pane, select the Train SVD Recommender prebuilt module and drag and drop the selected module onto the canvas. Then, connect the first output of the Split Data module to the input of the Train SVD Recommender module.

palaemezie_19-1713956744190.png

 

    1. Click the navigation button at the upper right of the canvas to open the Train SVD Recommender module window. Set Number of factors to 200. This option specifies the number of factors to use with the recommender.
    2. Set Number of recommendation algorithm iterations to 30. This number indicates how many times the algorithm should process the input data. The default value is 30.
    3. Set the Learning rate to 0.001. The learning rate defines the step size for learning.

palaemezie_20-1713956744192.png
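These parameters map onto a matrix-factorization view of the data: an SVD recommender approximates the user-movie rating matrix with a low-rank factorization, and Number of factors is the rank of that approximation. A toy NumPy sketch of the idea (not the module's actual training algorithm, which learns iteratively using the learning rate above):

```python
import numpy as np

# Toy user x movie rating matrix.
R = np.array([[5.0, 3.0, 1.0],
              [4.0, 2.0, 1.0],
              [1.0, 1.0, 5.0]])

k = 2  # number of factors (the walkthrough uses 200 on the real data)
U, s, Vt = np.linalg.svd(R, full_matrices=False)

# Rank-k approximation; its entries serve as predicted ratings.
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
```

More factors capture more structure in the ratings at the cost of more computation and a greater risk of overfitting.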

 

Step 9: Select Columns UserId, Movie Name from the test set

  1. From the Data Transformation section in the left navigation pane, select the Select Columns in Dataset prebuilt module and drag and drop the selected module onto the canvas. Then, connect the Split Data second output to the input of the Select columns in Dataset module.

palaemezie_21-1713956744195.png

 

  2. Click the navigation button at the upper right of the canvas to open the Select Columns in Dataset module window. Select the Edit column link to open the Select columns editor.

palaemezie_22-1713956744196.png

 

  3. Enter the following list of columns to be included in the output dataset: UserId, Movie Name. Then click Save.

palaemezie_23-1713956744198.png

 

Step 10: Configure the Score SVD Recommender

  1. From the Recommendation section in the left navigation pane, select the Score SVD Recommender prebuilt module and drag and drop it onto the canvas.
    1. Connect the output of the Train SVD Recommender module to the first input of the Score SVD Recommender module, which is the Trained SVD recommendation input.
    2. Connect the output of the Select Columns in Dataset module to the second input of the Score SVD Recommender module, which is the Dataset to score input.

palaemezie_24-1713956744202.png

 

    1. Open the Score SVD Recommender module on the canvas by clicking the navigation button at the upper right of the canvas. Set the Recommender prediction kind to Rating Prediction. For this option, no other parameters are required.

palaemezie_25-1713956744203.png

 

Step 11: Setup Evaluate Recommender Module

  1. From the Recommendation section in the left navigation pane, select the Evaluate Recommender prebuilt module and drag and drop the selected module onto the canvas.
    1. Connect the Score SVD Recommender module to the second input of the Evaluate Recommender module, which is the Scored dataset input.
    2. Connect the second output of the Split Data module (the test set) to the first input of the Evaluate Recommender module, which is the Test dataset input.

palaemezie_26-1713956744207.png

 

Activity 2: Submit Training Pipeline

  1. In the Authoring editor, ensure that you have AutoSave enabled. Then click on Configure & Submit at the upper right-hand side of your screen.

 

 

  2. In the Set up pipeline job window, in the Basics section, click the Create new button under Experiment name. Type your new experiment name and click the Next button at the bottom of the screen.

palaemezie_28-1713956744212.png

 

  3. In the Inputs & outputs section, click the Next button at the bottom of the screen.

palaemezie_29-1713956744214.png

 

  4. In the Runtime settings section, skip the Default compute. Go to Select compute type and select Compute instance from the drop-down menu. Under Select Azure ML compute instance, click Create Azure ML compute instance. The Create compute instance page opens in another environment.

 

palaemezie_30-1713956744220.png

 

 

  5. In the Create compute instance window, type your compute name under the Compute name tab. Then select the CPU button under Virtual machine type.

palaemezie_31-1713956744225.png

 

  6. While authoring this article, I had to select my virtual machine first to enable the Compute name tab; you may or may not encounter this. I selected the Standard_D2_v2 virtual machine for this training. After that, click the Review + Create button at the bottom of the screen to return to the Runtime settings window.

palaemezie_32-1713956744229.png

  7. Back in the Runtime settings window, at Select Azure ML compute instance, select the compute instance you created; here, I selected the movie instance from the drop-down menu. Note that your newly created compute instance will take some time to be provisioned and appear in the drop-down menu. Go to Advanced settings and ensure that the Continue on step failure box is checked. Then click the Review + Submit button at the bottom of the screen.

palaemezie_33-1713956744233.png

  8. In the Review + Submit section, ensure that the details you provided are correct. Then click the Submit button at the bottom of the screen.

palaemezie_34-1713956744237.png

 

 

Activity 3: Visualize Scoring Results

Step 1: When your pipeline has been submitted and your model training has completed, go to Jobs under Assets in the left navigation pane and click the name of your completed pipeline.

palaemezie_35-1713956744253.png

 

Step 2: Visualize the Scored dataset

  1. Go to the Score SVD Recommender module on the canvas and right-click on it. Select Preview data and click on Scored dataset.

palaemezie_36-1713956744279.png

 

  2. Observe the predicted values under the Rating column.

palaemezie_37-1713956744296.png

 

Step 3: Visualize the Evaluation Results

  1. Go to the Evaluate Recommender module on the canvas and right-click on it. Select Preview data and click on Metric.

palaemezie_38-1713956744313.png

 

  2. Evaluate the model performance by reviewing the various evaluation metrics, such as Mean Absolute Error and Root Mean Squared Error.

palaemezie_39-1713956744323.png
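To interpret those two metrics: MAE is the average absolute difference between actual and predicted ratings, and RMSE is the square root of the average squared difference. A quick standard-library check with made-up numbers:

```python
import math

actual = [4.0, 3.0, 5.0, 2.0]
predicted = [3.5, 3.0, 4.0, 2.5]

n = len(actual)
mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
```

RMSE penalizes large errors more heavily than MAE, so a gap between the two usually signals a few badly mispredicted ratings.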

 

 Next step

Congratulations on making it this far. Stay tuned for my next blog on the amazing solutions you can build using Azure AI Studio.

 

For enthusiasts and professionals alike, you can leverage these resources to stay informed and inspired as you embark on your AI journey:


Some more C# 12


In the previous article on C# 12, you learned about collection expressions and primary constructors. In this article, we’ll take a look at some advanced features that are part of the latest C# version: inline arrays, optional parameters and params in lambda expressions, ref readonly parameters, aliasing any type, and the UnsafeAccessorAttribute.

Inline arrays

A regular C# array is a reference type that lives on the heap. Like other reference types, the garbage collector (GC) keeps track of whether the array is still referenced, and it frees the memory when the array is no longer in use.

To avoid GC overhead in performance-sensitive code, a small array that is local to a function can be allocated on the stack using stackalloc. Thanks to the Span<T> type introduced in .NET Core 2.1, we can use such arrays without resorting to "unsafe" code.

int[] bufferOnHeap = new int[1024];
Span<int> bufferOnStack = stackalloc int[128];

C# also allows us to allocate memory for an array as part of a struct. This can be interesting for performance, and also for interop to match a native type’s layout. Before C# 12, such arrays were declared using the fixed keyword, were limited to primitive numeric types, and required unsafe code. The following code compiles and does not flag the illegal out-of-bounds access at compile time or run time.

Foo();

unsafe void Foo() {
  MyStruct s = default;
  s.Buffer[15] = 20; // Out-of-bounds access not caught.
}

unsafe struct MyStruct {
  public fixed byte Buffer[10];
}

C# 12 improves the situation, and allows declaring inline arrays and accessing them in a safe way. The buffer must be declared as a struct type with a single field for the element type and an InlineArray attribute with the length. The element type is also no longer limited to primitive numeric types. When we update the previous example to C# 12, at compile time, we get an error for the out-of-bounds access.

void Foo() {
  MyStruct s = default;
  s.Buffer[15] = 20; // CS9166: out-of-bounds access
}

struct MyStruct {
  public MyBuffer Buffer;
}

[InlineArray(10)]
struct MyBuffer {
  private byte _element;
}

As shown in the example, the buffer type supports indexing using an int. Indexing using an Index or Range type also works.

The buffer type also converts implicitly to Span<T> and ReadOnlySpan<T>, and it can be used in a foreach.

You can add members to the buffer type that operate on the stored data.

Optional parameters and params in lambda expressions

C# 12 allows lambda expressions to have default parameters as shown in the next example.

var prefixer = (string value, string prefix = "_")
                  => $"{prefix}{value}";

Console.WriteLine(prefixer("name"));
Console.WriteLine(prefixer("name", "$"));

We’ve used the var keyword as the target type of the lambda expression. Under the hood, the compiler will define a delegate type that stores the optional parameter values as shown in this expanded example.

// The optional values are captured in the delegate type.
delegate string PrefixerDelegate(string value, string prefixer = "_");

PrefixerDelegate prefixer = (string value, string prefix)
                               => $"{prefix}{value}";

C# 12 also allows the use of params in lambda expressions.

var adder = (params int[] numbers)
                => numbers.Sum();

int sum = adder(1, 2, 3);

As with optional parameters, the params modifier is captured in the delegate type.

Ref readonly parameters

C# 7.2 introduced the in keyword which enables passing a value by reference while not allowing the value to be modified.

MyStruct s = new MyStruct { I = 1 };

Foo(s); // or: Foo(in s);

void Foo(in MyStruct value) {
  value.I = 10; // CS8332: value is readonly
}

struct MyStruct {
  public int I;
}

As shown in the previous example, the caller is not required to use the in keyword when passing the variable. in arguments are also not limited to passing variables. As shown in the following example, we can pass temporary values that are not in scope before/after the call.

Foo(Bar());
Foo(default(MyStruct));

MyStruct Bar() { .. }

C# 12 introduces passing values as ref readonly. In contrast to in, the caller is required to specify the ref keyword. This means the latter examples are no longer allowed because the temporary values passed in are not referenceable. This allows us to better capture the semantics of some APIs, like when calling ReadOnlySpan(ref readonly T reference) as shown in the next example.

MyStruct value = default;

// Calling ReadOnlySpan(ref readonly T reference)
// allows passing a referenceable value:
var span = new ReadOnlySpan<MyStruct>(ref value);
MyStruct first = span[0]; // reads the referenced value

// and disallows passing a non-referenceable value:
var span2 = new ReadOnlySpan<MyStruct>(ref CreateMyStruct()); // CS1510: ref must be an assignable variable

MyStruct CreateMyStruct() => default;

struct MyStruct
{ }

Alias any type

C# type aliases were restricted to using the full type names:

using Int = System.Int32;
using TupleOfInts = System.ValueTuple<int, int>;

While C# 12 allows us to use any C# type declarations:

using Int = int;
using TupleOfInts = (int, int);
using unsafe Pointer = int*;

UnsafeAccessorAttribute

Serializers sometimes need access to inaccessible members of types. Previously, this was only achievable using reflection. .NET 8 introduces the UnsafeAccessorAttribute, which allows doing this without reflection. This improves performance, enables source generators to access these members, and works well with Native AOT.

The inaccessible members are made accessible by declaring an extern method declaration with an appropriate signature and adding the UnsafeAccessorAttribute to identify the member. The runtime will provide the implementation for these methods. If the member is not found, calling the method will throw MissingFieldException or MissingMethodException.

The following example shows calling a private constructor, and calling a private property getter.

using System.Runtime.CompilerServices;

MyClass instance = Ctor(1);
int value = GetPrivateProperty(instance);

[UnsafeAccessor(UnsafeAccessorKind.Constructor)]
extern static MyClass Ctor(int i);

[UnsafeAccessor(UnsafeAccessorKind.Method, Name = "get_PrivateProperty")]
extern static int GetPrivateProperty(MyClass c);

public class MyClass {
   MyClass(int i) { PrivateProperty = i; }
   int PrivateProperty { get; }
}

The UnsafeAccessorAttribute documentation provides a full overview on how to access different members. Support for generic parameters is added as part of .NET 9.

Conclusion

In this second and final article on C# 12, we looked at inline arrays, optional params and params in lambda expressions, ref readonly parameters, aliasing any type, and the UnsafeAccessorAttribute. These new features improve C# for specific use cases.

The post Some more C# 12 appeared first on Red Hat Developer.


New MSTest SDK: Usage of MSTest Runner Extensions, Support Running Tests in Native AOT


Microsoft announced the new MSTest SDK, built on top of the MSBuild Project SDK system. This SDK improves the MSTest testing experience with features such as easier usage of MSTest Runner extensions, support for running tests in Native AOT mode, and better defaults.

By Robert Krzaczyński

PowerShell Universal v5.0 Beta 3

Today, we are happy to announce the third beta of PowerShell Universal v5. Information about the previous betas can be found here: v5 Beta 1, v5 Beta 2. You can download the latest version of PowerShell Universal from our website.

Permissions

We’ve added granular permissions to PowerShell Universal. This replaces the previous access control system and builds on the existing role-based access. Permissions can be assigned to identities and roles to control access to specific features of PowerShell Universal.

Filters in Semantic Kernel


It’s important to understand how an application behaves and to be able to override that behavior at runtime based on certain conditions. For example, we don’t want to send malicious prompts to an LLM, and we don’t want to expose more information than necessary to end users.

A couple of months ago, we added the ability to handle such scenarios in Semantic Kernel using Filters. This feature is an improvement over the previous implementation based on event handlers, and today we want to introduce even more improvements based on feedback we received from SK users!

Let’s start with the current version of Filters to understand its issues. After that, we will look at how the new filters resolve those issues and walk through some usage examples.

Overview of current version

Here is an example of a function filter, which is executed before and after function invocation:

public class MyFilter : IFunctionFilter
{
    public void OnFunctionInvoking(FunctionInvokingContext context)
    {
        // Method which is executed before function invocation.
    }

    public void OnFunctionInvoked(FunctionInvokedContext context)
    {
        // Method which is executed after function invocation.
    }
}

First, the current IFunctionFilter interface does not support asynchronous methods. This is important because filters should be able to perform additional asynchronous operations, such as calling another kernel function, making a request to a database, or caching an LLM result.

Another limitation is that it’s not possible to handle an exception that occurred during function execution and override the result. This would be especially useful during automatic function invocation, when the LLM wants to execute several functions: in case of an exception, it would be possible to handle it and override the result for the LLM with some default value.

While it’s good to have a separate method for each function invocation event (as in the IFunctionFilter interface), this approach has a disadvantage: the methods are not connected to each other, so any shared state must be stored at the class level. This is not necessarily a bad thing, but consider an example where we want to measure how long a function takes to execute: we want to start the measurement in the OnFunctionInvoking method and stop it, sending the results to a telemetry tool, in the OnFunctionInvoked method. In this case, we are forced to store a System.Diagnostics.Stopwatch instance at the class level, which is not a common pattern.

New version

We are excited to announce that the new version of Filters resolves the problems described above.

Existing filters were renamed to use more specific naming. The new naming works better alongside the new type of filter, which we present later in this article. The new names for the existing filters are the following:

•	IFunctionFilter -> IFunctionInvocationFilter
•	IPromptFilter -> IPromptRenderFilter

Also, the interface for function and prompt filters has changed: instead of two separate methods, there is now only one, which makes it easier to implement.

Function invocation filter

Here is an example of function invocation filter:

public class MyFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        // Perform some actions before function invocation
        await next(context);
        // Perform some actions after function invocation
    }
}

The method is asynchronous, which makes it easy to call other asynchronous operations using async/await pattern.

Together with the context, there is also a next delegate, which executes the next filter in the pipeline (in case multiple filters are registered) or the function itself. If the next delegate is not invoked, the subsequent filters and the function won’t be invoked either. This provides more control and is useful when there are reasons to avoid executing the function (e.g. a malicious prompt or malicious function arguments).

Another benefit of the next delegate is exception handling. With this approach, it’s possible to handle exceptions in a .NET-friendly way using a try/catch block:

public class ExceptionHandlingFilterExample(ILogger logger) : IFunctionInvocationFilter
{
    private readonly ILogger _logger = logger;

    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        try
        {
            await next(context);
        }
        catch (Exception exception)
        {
            this._logger.LogError(exception, "Something went wrong during function invocation");

            // Example: override function result value
            context.Result = new FunctionResult(context.Result, "Friendly message instead of exception");

            // Example: Rethrow another type of exception if needed
            // throw new InvalidOperationException("New exception");
        }
    }
}

The same set of features is available for streaming scenarios. Here is an example of how to override a function’s streaming result using IFunctionInvocationFilter:

public class StreamingFilterExample : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        await next(context);

        // In streaming scenario, async enumerable is available in context result object.
        // To override data: get async enumerable from context result, override data and set new async enumerable in context result:
        var enumerable = context.Result.GetValue<IAsyncEnumerable<int>>();
        context.Result = new FunctionResult(context.Result, OverrideStreamingDataAsync(enumerable!));
    }

    private async IAsyncEnumerable<int> OverrideStreamingDataAsync(IAsyncEnumerable<int> data)
    {
        await foreach (var item in data)
        {
            // Example: override streaming data
            yield return item * 2;
        }
    }
}

Prompt render filter

Prompt render filters have a similar signature:

public class PromptFilterExample : IPromptRenderFilter
{
    public async Task OnPromptRenderAsync(PromptRenderContext context, Func<PromptRenderContext, Task> next)
    {
        // Example: get function information
        var functionName = context.Function.Name;

        await next(context);

        // Example: override rendered prompt before sending it to AI
        context.RenderedPrompt = "Safe prompt";
    }
}

This filter is executed before the prompt rendering operation, and the next delegate executes the next prompt filter in the pipeline or the prompt rendering operation itself. After the next delegate has executed, it's possible to observe the rendered prompt and override it, for example to inject additional information (e.g. in RAG scenarios) or to remove sensitive information from it.
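Filters only take effect once they are registered with the kernel. Here is a minimal sketch, assuming the dependency-injection registration pattern exposed by the kernel builder (the filter class names are taken from the examples above):

```csharp
IKernelBuilder builder = Kernel.CreateBuilder();

// Register filters via dependency injection; they are picked up
// when the kernel is built.
builder.Services.AddSingleton<IFunctionInvocationFilter, StreamingFilterExample>();
builder.Services.AddSingleton<IPromptRenderFilter, PromptFilterExample>();

Kernel kernel = builder.Build();

// Alternatively, add filters to an existing kernel instance:
kernel.PromptRenderFilters.Add(new PromptFilterExample());
```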

Auto function invocation filter

This is a new type of filter for the automatic function invocation scenario (also known as function calling).
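As a reminder, automatic function invocation is enabled through prompt execution settings, which is the scenario in which this filter runs. A minimal sketch, assuming the OpenAI connector; the prompt text is illustrative:

```csharp
// Ask the model to call kernel functions and invoke them automatically.
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

// Each function invoked during this request passes through
// any registered IAutoFunctionInvocationFilter.
var result = await kernel.InvokePromptAsync(
    "What is the weather in Seattle?",
    new(settings));
```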

This filter is similar to IFunctionInvocationFilter, but it is executed in a different scope that has more information about the execution. This means the context model also carries more information, including:

  • Function name and metadata.
  • Chat history.
  • List of all functions that should be executed.
  • Request sequence index – identifies how many requests to the LLM have already been performed.
  • Function sequence index – identifies how many functions have already been invoked as part of a single request.
  • Function count – total number of functions to be executed as part of a single request.

Here is a full overview of the API that IAutoFunctionInvocationFilter provides:

public class AutoFunctionInvocationFilter(ILogger logger) : IAutoFunctionInvocationFilter
{
    private readonly ILogger _logger = logger;

    public async Task OnAutoFunctionInvocationAsync(AutoFunctionInvocationContext context, Func<AutoFunctionInvocationContext, Task> next)
    {
        // Example: get function information
        var functionName = context.Function.Name;

        // Example: get chat history
        var chatHistory = context.ChatHistory;

        // Example: get information about all functions which will be invoked
        var functionCalls = FunctionCallContent.GetFunctionCalls(context.ChatHistory.Last());

        // Example: get request sequence index
        this._logger.LogDebug("Request sequence index: {RequestSequenceIndex}", context.RequestSequenceIndex);

        // Example: get function sequence index
        this._logger.LogDebug("Function sequence index: {FunctionSequenceIndex}", context.FunctionSequenceIndex);

        // Example: get total number of functions which will be called
        this._logger.LogDebug("Total number of functions: {FunctionCount}", context.FunctionCount);

        // Call the next filter in the pipeline or the function itself.
        // If this call is skipped, the remaining filters and the function won't be invoked, and the function calling loop will proceed to the next function.
        await next(context);

        // Example: get function result
        var result = context.Result;

        // Example: override function result value
        context.Result = new FunctionResult(context.Result, "Result from auto function invocation filter");

        // Example: Terminate function invocation
        context.Terminate = true;
    }
}

Summary

The examples provided show how to use function invocation, prompt render and auto function invocation filters. With the new design, it is possible to get more observability and have more control over function execution.

We're always interested in hearing from you. If you have feedback or questions, or want to discuss further, feel free to reach out to us and the community on the discussion boards on GitHub! We would also love your support: if you've enjoyed using Semantic Kernel, give us a star on GitHub.

The post Filters in Semantic Kernel appeared first on Semantic Kernel.
