Swift's result builders are a powerful language feature that lets us create domain-specific languages right inside our Swift code. With a little thinking, this means we can actually create whole websites in Swift, with our code automatically being converted to valid, accessible HTML, and we can even sprinkle in a little SwiftUI magic to complete the effect.
Let's get to it…
Starting from scratch
For over 30 years, HTML has been a great language for describing the structure of web pages.
For example, we can write HTML like this:
<h1>Wise words</h1>
<p>"If you don't take risks, you can't create a future" - <em>Monkey D. Luffy</em> in <a href="https://en-wp.org/wiki/One_Piece">One Piece</a></p>
That has a heading, a paragraph of text with some emphasis, and a link to another page. But, what happens if we forget the closing </em> tag? Without it, web browsers will assume everything that follows should also be emphasized.
That's not what I intended, but it's easy to do because HTML is just a bunch of text.
But even if you write perfect HTML, there are other, bigger problems:
How can you make sure your pages look the same on all browsers?
How can you make your page adapt to different screen sizes, such as iPhone and iPad?
How can you use more advanced UI elements such as dropdown menus, carousels, and accordions?
Most importantly, how can you be sure your site is accessible to everyone?
Ultimately, all these boil down to one huge problem: most people don't have enough time to become experts in Swift and also experts in HTML.
And so I want to suggest that the answer is to not use HTML, or at least not directly. Instead, I would like to propose that we use Swift to build our websites.
Introducing result builders
Back in 2019, when Apple announced SwiftUI, there was the usual What's New in Swift presentation. During that talk they showed the following HTML:
Why use Continuous integration and deployment with WordPress?
CI/CD is a system that automates steps in software delivery. For WordPress developers, it means less manual work: once you push updates, the system automatically tests and deploys them. It's like having an assistant that not only speeds up your work but also checks for errors with each update, ensuring your WordPress site runs smoothly. This constant testing and feedback loop means you can fix bugs quickly and improve your site continuously without disrupting the live version. In short, CI/CD makes your development process faster, more efficient, and less error-prone.
How to get started with Continuous integration and deployment in WordPress on App Service?
Part 1: Before integrating code with Git, it is important to decide which files to include. It is recommended that you keep track of a smaller number of files. For example, keeping track of files in wp-content/uploads is inefficient, as it might contain large static files; files like these should instead be stored on blob storage. Another example is the wp-config.php file, which should also be excluded since it contains separate settings for development and production environments.
You should also choose to ignore WordPress core files if you are not making any changes to them.
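These recommendations translate naturally into a version-control ignore file. A sketch of such a .gitignore, assuming a standard WordPress layout (the core-file paths listed are illustrative, not prescribed by the article):

```
# Large static files: serve from blob storage instead of Git
wp-content/uploads/
# Environment-specific configuration
wp-config.php
# WordPress core files, if you are not modifying them
/wp-admin/
/wp-includes/
```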
Build a chatbot service to ensure safe conversations: Using Azure OpenAI & Azure Content Safety
Why should we care about the safety of our chat service?
When you deploy a chat service on a website, users may enter inappropriate or harmful messages that can lead to unwanted responses from the chatbot. This can pose significant risks, including the potential for revealing sensitive company information or allowing other users' messages to influence the chatbot's responses. That's why it's critical to implement a content filtering mechanism that can inspect messages from users. Azure Content Safety provides a robust solution for inspecting and filtering inappropriate content, ensuring safe and secure interactions. This tutorial is ideal for anyone who wants to build a chat service with strong content moderation capabilities. In this tutorial, you will learn how to build a chatbot service that interacts with users using Azure Cosmos DB, Azure Content Safety, and Azure OpenAI. This service provides the following features:
Analyze user messages for safety: Analyze messages entered by users using Azure Content Safety to evaluate them for hate, self-harm, sexual content, and violence.
Conversations with chatbot: Conduct conversations about safe messages using Azure OpenAI.
Manage conversation history: Store a user's conversation history in Azure Cosmos DB and load or clear the history as needed.
To create a service to detect inappropriate content, you need to create an Azure Content Safety resource.
In this exercise, you will:
Create an Azure Content Safety resource to detect inappropriate content.
Create an Azure Content Safety resource
Type content safety in the search bar at the top of the portal page and select Content safety from the options that appear.
Select + Create from the navigation menu.
Perform the following tasks:
Select your Azure Subscription.
Select the Resource group to use (create a new one if needed).
Select the Region you'd like to use.
Enter the Content safety name. It must be a unique value.
Select the Free F0 pricing tier.
Select Review + Create.
Select Create.
Create an Azure OpenAI resource
To enable your chat service to provide answers based on chat history stored in Azure Cosmos DB, you need to create and deploy an Azure OpenAI resource.
In this exercise, you will:
Create an Azure OpenAI resource.
Deploy Azure OpenAI models.
Note
Access to the Azure OpenAI service is currently available by request. To request access, please fill out the form on the Azure OpenAI request page.
Create an Azure OpenAI resource
Type azure openai in the search bar at the top of the portal page and select Azure OpenAI from the options that appear.
Select + Create from the navigation menu.
Perform the following tasks:
Select your Azure Subscription.
Select the Resource group to use (create a new one if needed).
Select the Region you'd like to use.
Enter the Azure OpenAI Name. It must be a unique value.
Select the Standard S0 pricing tier.
Note
To minimize costs, try to create all the resources in the same region.
Select Next to move to the Network page.
Select a network security Type.
Select Next to move to the Tags page.
Select Next to move to the Review + submit page.
Select Create.
Deploy Azure OpenAI models
Navigate to the Azure OpenAI resource that you created.
Select Go to Azure OpenAI Studio from the navigation menu.
Inside Azure OpenAI Studio, select Deployments from the left side tab.
Select + Create new deployment from the navigation menu to create a new gpt-35-turbo deployment.
Perform the following tasks:
For the model, select gpt-35-turbo.
For the Model version, select Default.
For the Deployment name, add a name that's unique to this cloud instance. For example, gpt-35-turbo.
Select Create.
Now you've learned how to set up Azure resources to implement features that allow the Azure Content Safety resource to analyze conversations and Azure OpenAI to generate responses. In the next exercise, you will develop a Python program that interacts with users to ensure safe conversations.
Set up the project and install the libraries
Now, you will create a folder to work in and set up a virtual environment to develop a program.
In this exercise, you will:
Create a folder to work in.
Create a virtual environment.
Install the required packages.
Create a folder to work in
Open a terminal window and type the following command to create a folder named safety-chatbot in the default path.
mkdir safety-chatbot
Type the following command inside your terminal to navigate to the safety-chatbot folder you created.
cd safety-chatbot
Create a virtual environment
Type the following command inside your terminal to create a virtual environment named .venv.
python -m venv .venv
Type the following command inside your terminal to activate the virtual environment.
.venv\Scripts\activate.bat
Note
If it worked, you should see (.venv) before the command prompt. (On macOS or Linux, run source .venv/bin/activate instead.)
Install the required packages
Type the following commands inside your terminal to install the required packages (based on the imports used later, these are the azure-ai-contentsafety, azure-cosmos, and semantic-kernel packages).
To develop a program that uses the Azure resources that you created, you need a config.py file to enter Azure information.
In this exercise, you will:
Create an example.py file.
Import the required packages.
Create a config.py file to enter Azure information.
Set up the example.py file
Open Visual Studio Code.
Select File from the menu bar.
Select Open Folder.
Select the safety-chatbot folder that you created, which is located at C:\Users\yourUserName\safety-chatbot.
In the left pane of Visual Studio Code, right-click and select New File to create a new file named example.py.
Add the following code to the example.py file to import the required libraries.
# Library imports
import asyncio
from datetime import datetime, timezone
from concurrent.futures import ThreadPoolExecutor
# Azure imports
from azure.ai.contentsafety.aio import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions, TextCategory
from azure.core.credentials import AzureKeyCredential
from azure.core.exceptions import HttpResponseError
from azure.cosmos import CosmosClient
# Semantic Kernel imports
import semantic_kernel as sk
import semantic_kernel.connectors.ai.open_ai as sk_oai
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.prompt_template import PromptTemplateConfig
from semantic_kernel.prompt_template.input_variable import InputVariable
from semantic_kernel.functions.kernel_arguments import KernelArguments
Set up the config.py file
In the left pane of Visual Studio Code, right-click and select New File to create a new file named config.py.
In this tutorial, you use the 2024-02-15-preview version of the Azure OpenAI API.
Copy and paste your Azure OpenAI API version and resource information into the config.py file.
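The contents of config.py are not shown at this point, but the variable names it must define can be read off the from config import (...) block in the complete listing at the end of the tutorial. Here is a sketch with placeholder values; replace each placeholder with your own resource values from the Azure portal (the deployment name shown assumes the gpt-35-turbo deployment created earlier):

```python
# config.py -- placeholder values only; copy the real endpoints, keys, and
# names for your resources from the Azure portal
AZURE_COSMOSDB_DATABASE_NAME = "<your-cosmosdb-database-name>"
AZURE_COSMOSDB_CONTAINER_NAME = "<your-cosmosdb-container-name>"
AZURE_COSMOSDB_ENDPOINT = "https://<your-cosmosdb-account>.documents.azure.com:443/"
AZURE_COSMOSDB_KEY = "<your-cosmosdb-key>"
AZURE_CONTENT_SAFETY_ENDPOINT = "https://<your-content-safety-resource>.cognitiveservices.azure.com/"
AZURE_CONTENT_SAFETY_KEY = "<your-content-safety-key>"
AZURE_OPENAI_ENDPOINT = "https://<your-openai-resource>.openai.azure.com/"
AZURE_OPENAI_KEY = "<your-openai-key>"
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME = "gpt-35-turbo"
```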
Set up and initialize the ChatbotService class with Azure Services
To implement the ChatbotService for handling chat interactions and Azure services, you need to import the Azure information from the config.py file into the example.py file and implement a class that efficiently orchestrates these functionalities.
In this exercise, you will:
Import Azure information from the config.py file into the example.py file.
Create the ChatbotService class to efficiently manage chat interactions and integrate Azure service capabilities.
Note
The complete code for this tutorial is provided at the end to make it easy to connect the pieces and understand the overall implementation.
Import Azure information from the config.py file
Add the following code to the example.py file to import the values from the config.py file.
Add the following code to the example.py file to set up the ChatbotService class. This class initializes Azure Cosmos DB for NoSQL, which is a NoSQL database service that handles database operations synchronously.
class ChatbotService:
    def __init__(self, loop, user_id):
        """
        Initialize the ChatbotService with event loop, user ID, and necessary Azure Cosmos DB clients.
        """
        # Set up the event loop and thread pool executor for managing asynchronous tasks,
        # allowing Azure Cosmos DB operations to be handled in an asynchronous environment
        self.loop = loop
        self.executor = ThreadPoolExecutor()
        # Store the user ID, which is used as the partition key (/userId) in Azure Cosmos DB
        self.user_id = user_id
        # Initialize the Azure Cosmos DB client
        self.cosmos_client = CosmosClient(AZURE_COSMOSDB_ENDPOINT, credential=AZURE_COSMOSDB_KEY)
        # Initialize the Azure Cosmos DB database client
        self.database = self.cosmos_client.get_database_client(AZURE_COSMOSDB_DATABASE_NAME)
        # Initialize the Azure Cosmos DB container client
        self.container = self.database.get_container_client(AZURE_COSMOSDB_CONTAINER_NAME)
Add the following code to the example.py file to initialize the Azure Content Safety client and the Semantic Kernel within the ChatbotService class. This step sets up the asynchronous operations required for Content Safety analysis and chatbot interactions.
    async def init(self):
        """
        Initialize the Content Safety client and Semantic Kernel.
        """
        # Initialize the Azure Content Safety client
        self.content_safety_client = ContentSafetyClient(AZURE_CONTENT_SAFETY_ENDPOINT, AzureKeyCredential(AZURE_CONTENT_SAFETY_KEY))
        # Initialize the Semantic Kernel
        self.kernel = sk.Kernel()
        # Initialize the chat service for Azure OpenAI
        self.chat_service = AzureChatCompletion(
            service_id='chat_service',
            deployment_name=AZURE_OPENAI_CHAT_DEPLOYMENT_NAME,
            endpoint=AZURE_OPENAI_ENDPOINT,
            api_key=AZURE_OPENAI_KEY
        )
        # Add the chat service to the Semantic Kernel
        self.kernel.add_service(self.chat_service)
        # Define the prompt template configuration for the chatbot
        self.prompt_template_config = PromptTemplateConfig(
            template="""ChatBot can have a conversation with you about any topic.
It can give explicit instructions or say 'I don't know' if it does not have an answer.
{{$history}}
User: {{$user_message}}
ChatBot: """,
            name='chat_prompt_template',
            template_format='semantic-kernel',
            input_variables=[
                InputVariable(name='user_message', description='The user message.', is_required=True),
                InputVariable(name='history', description='The conversation history', is_required=True),
            ],
            execution_settings=sk_oai.OpenAIChatPromptExecutionSettings(
                service_id='chat_service',
                ai_model_id='gpt-3.5-turbo',
                max_tokens=500,
                temperature=0.7
            )
        )
        # Add the chat function to the Semantic Kernel
        self.chat_function = self.kernel.add_function(
            function_name="chat_function",
            plugin_name="chat_plugin",
            prompt_template_config=self.prompt_template_config,
        )
        return self
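To see what prompt this template produces, here is a plain-Python illustration of the variable substitution. Note that the render helper below is a stand-in written for this example, not Semantic Kernel's actual template engine; it only emulates how {{$history}} and {{$user_message}} are filled in.

```python
# Stand-in for the 'semantic-kernel' template rendering: variables written
# as {{$name}} are replaced with the supplied values
template = """ChatBot can have a conversation with you about any topic.
It can give explicit instructions or say 'I don't know' if it does not have an answer.
{{$history}}
User: {{$user_message}}
ChatBot: """

def render(template, **variables):
    out = template
    for name, value in variables.items():
        out = out.replace("{{$" + name + "}}", value)
    return out

prompt = render(
    template,
    history="User: Hi\nChatBot: Hello!",
    user_message="How are you?",
)
print(prompt)
```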
Note
Why is the constructor split?
When setting up the ChatbotService, the constructor is split into two parts: __init__ and async init. This separation is critical for effectively managing both the synchronous and asynchronous operations required by different Azure services. Azure Cosmos DB for NoSQL operates in a synchronous environment, while Azure Content Safety and the Semantic Kernel benefit from asynchronous operations. This requires a clear distinction between synchronous and asynchronous initialization.
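The two-phase pattern can be seen in miniature with a toy class (not the tutorial's ChatbotService): synchronous setup happens in __init__, asynchronous setup in an awaitable init that returns self so construction can be chained in one expression.

```python
import asyncio

class TwoPhaseService:
    def __init__(self):
        # Synchronous setup, e.g. creating synchronous SDK clients
        self.sync_ready = True

    async def init(self):
        # Asynchronous setup, e.g. creating async SDK clients
        await asyncio.sleep(0)  # stands in for real async work
        self.async_ready = True
        return self  # returning self allows `await TwoPhaseService().init()`

async def demo():
    service = await TwoPhaseService().init()
    return service.sync_ready and service.async_ready

ok = asyncio.run(demo())
print(ok)
```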
Implement core functions in ChatbotService
To implement the core functions of the ChatbotService for handling chat interactions and integrating with Azure services, you need to add several methods to the class. These methods enable text analysis, chatbot interactions, and conversation history management.
In this exercise, you will:
Implement a method to analyze text for safety using Azure Content Safety.
Implement a method to interact with the chatbot using the Semantic Kernel.
Implement methods to store, load, and clear the conversation history in Azure Cosmos DB.
Add the following code to the ChatbotService class to create the analyze_text function, which analyzes the input text for safety using Azure Content Safety.
    async def analyze_text(self, text):
        """
        Analyze the input text for safety using Azure Content Safety.
        """
        # Create a request with the input text to be analyzed
        request = AnalyzeTextOptions(text=text)
        try:
            # Send the request to the Azure Content Safety client and await the response
            response = await self.content_safety_client.analyze_text(request)
            # Get the analysis results for different categories
            results = {
                'hate': next((item for item in response.categories_analysis if item.category == TextCategory.HATE), None),
                'self_harm': next((item for item in response.categories_analysis if item.category == TextCategory.SELF_HARM), None),
                'sexual': next((item for item in response.categories_analysis if item.category == TextCategory.SEXUAL), None),
                'violence': next((item for item in response.categories_analysis if item.category == TextCategory.VIOLENCE), None),
            }
            # Print content safety analysis results
            print("\n<-- Content Safety Analysis Results -->")
            for category, result in results.items():
                if result:
                    print(f"{category.capitalize()} severity: {result.severity}")
            print("<-- End of Content Safety Analysis Results -->\n")
            # Define a threshold for the text to be considered unsafe
            threshold = 2
            # Based on the threshold, determine if the text is safe
            is_safe = not any(result and result.severity >= threshold for result in results.values())
            return is_safe, results
        except HttpResponseError as e:
            # Handle any HTTP response errors that occur during the request
            print(f"Failed to analyze text. Error: {e}")
            # Treat the message as unsafe when analysis fails, so callers that
            # unpack the returned tuple do not fail on an implicit None
            return False, None
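The safety decision at the end of analyze_text can be tried in isolation with stand-in results. Here SimpleNamespace objects replace the real Azure response items (which likewise expose a severity attribute); a missing category appears as None and is skipped by the check.

```python
# Illustration of the threshold logic in analyze_text, with stand-in data
from types import SimpleNamespace

results = {
    'hate': SimpleNamespace(severity=0),
    'self_harm': None,  # category absent from the response
    'sexual': SimpleNamespace(severity=0),
    'violence': SimpleNamespace(severity=4),
}
threshold = 2
# Unsafe if any present category meets or exceeds the threshold
is_safe = not any(r and r.severity >= threshold for r in results.values())
print(is_safe)  # False: violence severity 4 meets the threshold
```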
Add the following code to the ChatbotService class to create the chat_with_kernel function, which interacts with the chatbot using the Semantic Kernel and the provided history.
    async def chat_with_kernel(self, user_message, history):
        """
        Interact with the chatbot using the Semantic Kernel and provided history.
        """
        # Create arguments for the chat function using the user message and conversation history stored in Azure Cosmos DB
        arguments = KernelArguments(user_message=user_message, history=history)
        # Invoke the chat function in the Semantic Kernel with the provided arguments and await the response
        response = await self.kernel.invoke(self.chat_function, arguments)
        # Return the chatbot's response
        return response
Add the following code to the ChatbotService class to create the store_interaction and _store_interaction_sync functions, which store user interactions with the chatbot in Azure Cosmos DB.
    async def store_interaction(self, user_message, chat_response):
        """
        Store the user interaction with the chatbot in Azure Cosmos DB.
        """
        # Run the _store_interaction_sync method in an asynchronous execution environment
        await self.loop.run_in_executor(self.executor, self._store_interaction_sync, user_message, chat_response)

    def _store_interaction_sync(self, user_message, chat_response):
        """
        Synchronously store the interaction in Azure Cosmos DB.
        """
        # Get the current time in UTC
        current_time = datetime.now(timezone.utc)
        # Upsert (insert or update) the interaction data into the Cosmos DB container
        self.container.upsert_item({
            'id': str(current_time.timestamp()),  # Use the current timestamp as a unique ID
            'user_message': user_message,  # Store the user message
            'bot_response': chat_response,  # Store the chatbot response
            'timestamp': current_time.isoformat(),  # Store the timestamp in ISO format
            'userId': self.user_id  # Store the user ID for the partition key
        })
Add the following code to the ChatbotService class to create the load_historical_context and _load_historical_context_sync functions, which load the user's chat history from Azure Cosmos DB.
    async def load_historical_context(self):
        """
        Load the user's chat history from Azure Cosmos DB.
        """
        # Run the _load_historical_context_sync method in an asynchronous execution environment
        return await self.loop.run_in_executor(self.executor, self._load_historical_context_sync)

    def _load_historical_context_sync(self):
        """
        Synchronously load the user's chat history from Azure Cosmos DB.
        """
        # Define the query to select items for the current user, ordered by timestamp
        query = "SELECT * FROM c WHERE c.userId = @userId ORDER BY c.timestamp DESC"
        parameters = [{"name": "@userId", "value": self.user_id}]
        # Execute the query and retrieve the items
        items = list(self.container.query_items(query=query, parameters=parameters, enable_cross_partition_query=True))
        # Include only the last 5 conversations
        history_items = items[:5]
        # Format the conversation history
        return "\n".join([f"User: {item['user_message']}\nChatBot: {item['bot_response']}" for item in history_items])
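The formatting step at the end of _load_historical_context_sync can be exercised with stand-in items. Because the query orders by timestamp descending, items[:5] keeps the five most recent interactions, newest first:

```python
# Stand-in for items returned by the Cosmos DB query (newest first)
items = [
    {'user_message': 'Bye', 'bot_response': 'Goodbye!'},
    {'user_message': 'Hi', 'bot_response': 'Hello!'},
]
# Same formatting as _load_historical_context_sync
history = "\n".join(
    f"User: {item['user_message']}\nChatBot: {item['bot_response']}"
    for item in items[:5]
)
print(history)
```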
Add the following code to the ChatbotService class to create the clear_historical_context and _clear_historical_context_sync functions, which clear the user's chat history from Azure Cosmos DB.
    async def clear_historical_context(self):
        """
        Clear the user's chat history from Azure Cosmos DB.
        """
        # Run the _clear_historical_context_sync method in an asynchronous execution environment
        await self.loop.run_in_executor(self.executor, self._clear_historical_context_sync)

    def _clear_historical_context_sync(self):
        """
        Synchronously clear the user's chat history from Azure Cosmos DB.
        """
        # Define the query to select items for the current user
        query = "SELECT * FROM c WHERE c.userId = @userId"
        parameters = [{"name": "@userId", "value": self.user_id}]
        # Execute the query and retrieve the items
        items = list(self.container.query_items(query=query, parameters=parameters, enable_cross_partition_query=True))
        # Clear the chat history by deleting all items for the current user
        for item in items:
            self.container.delete_item(item, partition_key=self.user_id)
Run the main function and test the ChatbotService
In this exercise, you will implement the main function that manages user interactions, analyzes and stores conversation history using Azure Cosmos DB, Azure Content Safety, Azure OpenAI, and the Semantic Kernel.
In this exercise, you will:
Create the main function to manage user interactions and perform safety checks.
Run the program to see if it works well.
Add the following code to the example.py file to create the main function, which handles user inputs, manages chat history, and interacts with the chatbot service.
async def main():
    """
    Main function to run the chatbot service.
    """
    # Enter the User ID to join the chat.
    # A conversation history is stored based on the user ID.
    user_id = input("User ID: ")
    # Get the event loop to allow synchronous operations in Azure Cosmos DB for NoSQL to run asynchronously.
    loop = asyncio.get_running_loop()
    # Initialize the ChatbotService with the event loop and user ID
    chatbot_service = await ChatbotService(loop, user_id).init()
    # Main loop for the interaction with the user
    while True:
        user_message = input("You: ")
        # Exit the chat loop if the user types 'exit'
        if user_message.lower() == 'exit':
            break
        elif user_message.lower() == 'history':
            # Load and print the chat history if the user types 'history'
            history = await chatbot_service.load_historical_context()
            print(f"\n<-- Chat history of user ID: {user_id} -->")
            print(history)
            print(f"<-- End of chat history of user ID: {user_id} -->\n")
        elif user_message.lower() == 'clear':
            # Clear the chat history if the user types 'clear'
            await chatbot_service.clear_historical_context()
            print("Chat history cleared.")
        else:
            # Analyze the text for safety
            is_safe, _ = await chatbot_service.analyze_text(user_message)
            if is_safe:
                # Load chat history and interact with the chatbot if the message is safe
                history = await chatbot_service.load_historical_context()
                chat_response = await chatbot_service.chat_with_kernel(user_message, history)
                print("Chatbot:", chat_response)
                # Store the interaction in Cosmos DB
                await chatbot_service.store_interaction(user_message, str(chat_response))
            else:
                # Inform the user if their message is not safe
                print("Chatbot: Your message is not safe to process. Please rephrase and try again.")

if __name__ == "__main__":
    asyncio.run(main())
Type the following command inside your terminal to run the program and see if it can answer questions.
python example.py
This will start the chatbot service, prompting you to enter a User ID and interact with the chatbot. You can enter exit to end the session, history to view past interactions, and clear to clear the current user's chat history. Here's an example of the results.
Congratulations!
You've completed this tutorial
Congratulations! You've successfully learned how to integrate Azure Content Safety with Azure OpenAI. In this tutorial, you have navigated through a practical journey of integrating Azure Cosmos DB, Azure Content Safety, and Azure OpenAI to create a robust chatbot service. By leveraging these Azure services, you now have a chatbot that can safely and effectively interact with users, analyze and store conversations, and ensure content safety.
Clean Up Azure Resources
Clean up your Azure resources to avoid additional charges to your account. Go to the Azure portal and delete the following resources:
# Library imports
import asyncio
from datetime import datetime, timezone
from concurrent.futures import ThreadPoolExecutor
# Azure imports
from azure.ai.contentsafety.aio import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions, TextCategory
from azure.core.credentials import AzureKeyCredential
from azure.core.exceptions import HttpResponseError
from azure.cosmos import CosmosClient
# Semantic Kernel imports
import semantic_kernel as sk
import semantic_kernel.connectors.ai.open_ai as sk_oai
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.prompt_template import PromptTemplateConfig
from semantic_kernel.prompt_template.input_variable import InputVariable
from semantic_kernel.functions.kernel_arguments import KernelArguments
# Configuration imports
from config import (
AZURE_COSMOSDB_DATABASE_NAME,
AZURE_COSMOSDB_CONTAINER_NAME,
AZURE_COSMOSDB_ENDPOINT,
AZURE_COSMOSDB_KEY,
AZURE_CONTENT_SAFETY_ENDPOINT,
AZURE_CONTENT_SAFETY_KEY,
AZURE_OPENAI_ENDPOINT,
AZURE_OPENAI_KEY,
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME
)
class ChatbotService:
def __init__(self, loop, user_id):
"""
Initialize the ChatbotService with event loop, user ID, and necessary Azure CosmosDB clients.
"""
# Set up the event loop and thread pool executor for managing asynchronous tasks,
# allowing Azure Cosmos DB operations to be handled in an asynchronous environment
self.loop = loop
self.executor = ThreadPoolExecutor()
# Store the user ID, which is used as the partition key (/userId) in Azure CosmosDB
self.user_id = user_id
# Initialize the Azure Cosmos DB client
self.cosmos_client = CosmosClient(AZURE_COSMOSDB_ENDPOINT, credential=AZURE_COSMOSDB_KEY)
# Initialize the Azure Cosmos DB database client
self.database = self.cosmos_client.get_database_client(AZURE_COSMOSDB_DATABASE_NAME)
# Initialize the Azure Cosmos DB container client
self.container = self.database.get_container_client(AZURE_COSMOSDB_CONTAINER_NAME)
async def init(self):
"""
Initialize the Content Safety client and Semantic Kernel.
"""
# Initialize the Azure Content Safety client
self.content_safety_client = ContentSafetyClient(AZURE_CONTENT_SAFETY_ENDPOINT, AzureKeyCredential(AZURE_CONTENT_SAFETY_KEY))
# Initialize the Semantic Kernel
self.kernel = sk.Kernel()
# Initialize the chat service for Azure OpenAI
self.chat_service = AzureChatCompletion(
service_id='chat_service',
deployment_name=AZURE_OPENAI_CHAT_DEPLOYMENT_NAME,
endpoint=AZURE_OPENAI_ENDPOINT,
api_key=AZURE_OPENAI_KEY
)
# Add the chat service to the Semantic Kernel
self.kernel.add_service(self.chat_service)
# Define the prompt template configuration for the chatbot
self.prompt_template_config = PromptTemplateConfig(
template="""ChatBot can have a conversation with you about any topic.
It can give explicit instructions or say 'I don't know' if it does not have an answer.
{{$history}}
User: {{$user_message}}
ChatBot: """,
name='chat_prompt_template',
template_format='semantic-kernel',
input_variables=[
InputVariable(name='user_message', description='The user message.', is_required=True),
InputVariable(name='history', description='The conversation history', is_required=True),
],
execution_settings=sk_oai.OpenAIChatPromptExecutionSettings(
service_id='chat_service',
ai_model_id='gpt-3.5-turbo',
max_tokens=500,
temperature=0.7
)
)
# Add the chat function to the Semantic Kernel
self.chat_function = self.kernel.add_function(
function_name="chat_function",
plugin_name="chat_plugin",
prompt_template_config=self.prompt_template_config,
)
return self
async def analyze_text(self, text):
"""
Analyze the input text for safety using Azure Content Safety.
"""
# Create a request with the input text to be analyzed
request = AnalyzeTextOptions(text=text)
try:
# Send the request to the Azure Content Safety client and await the response
response = await self.content_safety_client.analyze_text(request)
# Get the analysis results for different categories
results = {
'hate': next((item for item in response.categories_analysis if item.category == TextCategory.HATE), None),
'self_harm': next((item for item in response.categories_analysis if item.category == TextCategory.SELF_HARM), None),
'sexual': next((item for item in response.categories_analysis if item.category == TextCategory.SEXUAL), None),
'violence': next((item for item in response.categories_analysis if item.category == TextCategory.VIOLENCE), None),
}
# Print content safety analysis results
print("\n<-- Content Safety Analysis Results -->")
for category, result in results.items():
if result:
print(f"{category.capitalize()} severity: {result.severity}")
print("<-- End of Content Safety Analysis Results -->\n")
# Define a threshold for the text to be considered unsafe
threshold = 2
# Based on the threshold, determine if the text is safe
is_safe = not any(result and result.severity >= threshold for result in results.values())
return is_safe, results
except HttpResponseError as e:
# Handle any HTTP response errors that occur during the request
print(f"Failed to analyze text. Error: {e}")
async def chat_with_kernel(self, user_message, history):
"""
Interact with the chatbot using the Semantic Kernel and provided history.
"""
# Create arguments for the chat function using the user message and conversation history stored in Azure Cosmos DB
arguments = KernelArguments(user_message=user_message, history=history)
# Invoke the chat function in the Semantic Kernel with the provided arguments and await the response
response = await self.kernel.invoke(self.chat_function, arguments)
# Return the chatbot's response
return response
    async def store_interaction(self, user_message, chat_response):
        """
        Store the user interaction with the chatbot in Azure Cosmos DB.
        """
        # Run the _store_interaction_sync method in an asynchronous execution environment
        await self.loop.run_in_executor(self.executor, self._store_interaction_sync, user_message, chat_response)

    def _store_interaction_sync(self, user_message, chat_response):
        """
        Synchronously store the interaction in Azure Cosmos DB.
        """
        # Get the current time in UTC
        current_time = datetime.now(timezone.utc)
        # Upsert (insert or update) the interaction data into the Cosmos DB container
        self.container.upsert_item({
            'id': str(current_time.timestamp()),   # Use the current timestamp as a unique ID
            'user_message': user_message,          # Store the user message
            'bot_response': chat_response,         # Store the chatbot response
            'timestamp': current_time.isoformat(), # Store the timestamp in ISO format
            'userId': self.user_id                 # Store the user ID for the partition key
        })

    async def load_historical_context(self):
        """
        Load the user's chat history from Azure Cosmos DB.
        """
        # Run the _load_historical_context_sync method in an asynchronous execution environment
        return await self.loop.run_in_executor(self.executor, self._load_historical_context_sync)

    def _load_historical_context_sync(self):
        """
        Synchronously load the user's chat history from Azure Cosmos DB.
        """
        # Define the query to select items for the current user, ordered by timestamp
        query = "SELECT * FROM c WHERE c.userId = @userId ORDER BY c.timestamp DESC"
        parameters = [{"name": "@userId", "value": self.user_id}]
        # Execute the query and retrieve the items
        items = list(self.container.query_items(query=query, parameters=parameters, enable_cross_partition_query=True))
        # Include only the last 5 conversations
        history_items = items[:5]
        # Format the conversation history
        return "\n".join([f"User: {item['user_message']}\nChatBot: {item['bot_response']}" for item in history_items])

    async def clear_historical_context(self):
        """
        Clear the user's chat history from Azure Cosmos DB.
        """
        # Run the _clear_historical_context_sync method in an asynchronous execution environment
        await self.loop.run_in_executor(self.executor, self._clear_historical_context_sync)

    def _clear_historical_context_sync(self):
        """
        Synchronously clear the user's chat history from Azure Cosmos DB.
        """
        # Define the query to select items for the current user
        query = "SELECT * FROM c WHERE c.userId = @userId"
        parameters = [{"name": "@userId", "value": self.user_id}]
        # Execute the query and retrieve the items
        items = list(self.container.query_items(query=query, parameters=parameters, enable_cross_partition_query=True))
        # Clear the chat history by deleting all items for the current user
        for item in items:
            self.container.delete_item(item, partition_key=self.user_id)
async def main():
    """
    Main function to run the chatbot service.
    """
    # Enter the user ID to join the chat.
    # A conversation history is stored based on the user ID.
    user_id = input("User ID: ")
    # Get the event loop to allow synchronous operations in Azure Cosmos DB for NoSQL to run asynchronously.
    loop = asyncio.get_running_loop()
    # Initialize the ChatbotService with the event loop and user ID
    chatbot_service = await ChatbotService(loop, user_id).init()
    # Main loop for the interaction with the user
    while True:
        user_message = input("You: ")
        # Exit the chat loop if the user types 'exit'
        if user_message.lower() == 'exit':
            break
        elif user_message.lower() == 'history':
            # Load and print the chat history if the user types 'history'
            history = await chatbot_service.load_historical_context()
            print(f"\n<-- Chat history of user ID: {user_id} -->")
            print(history)
            print(f"<-- End of chat history of user ID: {user_id} -->\n")
        elif user_message.lower() == 'clear':
            # Clear the chat history if the user types 'clear'
            await chatbot_service.clear_historical_context()
            print("Chat history cleared.")
        else:
            # Analyze the text for safety
            is_safe, _ = await chatbot_service.analyze_text(user_message)
            if is_safe:
                # Load chat history and interact with the chatbot if the message is safe
                history = await chatbot_service.load_historical_context()
                chat_response = await chatbot_service.chat_with_kernel(user_message, history)
                print("Chatbot:", chat_response)
                # Store the interaction in Cosmos DB
                await chatbot_service.store_interaction(user_message, str(chat_response))
            else:
                # Inform the user if their message is not safe
                print("Chatbot: Your message is not safe to process. Please rephrase and try again.")

if __name__ == "__main__":
    asyncio.run(main())
Indeed, it wasn’t that hard. Of course, I had to fight with ESModules quirks, but I also expanded the scope a bit, as I also decided to use one of the most underrated Docker Compose features: Profiles.
What are Docker Compose Profiles?
They’re a way to logically group services inside your Docker Compose definition, allowing you to run a subset of services. My original Docker Compose definition contained the EventStoreDB startup, which I use in my Emmett samples as the real event store example.
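That part of the definition looked roughly like this (a sketch, not the exact original file: the service name, environment settings, and port mapping below are assumptions illustrating a typical single-node, insecure dev setup):

```yaml
version: "3"
services:
  eventstoredb:
    image: eventstore/eventstore:23.10.0-bookworm-slim
    environment:
      - EVENTSTORE_CLUSTER_SIZE=1
      - EVENTSTORE_RUN_PROJECTIONS=All
      - EVENTSTORE_START_STANDARD_PROJECTIONS=true
      - EVENTSTORE_INSECURE=true
    ports:
      - "2113:2113" # HTTP and gRPC endpoint
```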
Nothing fancy here so far. You can just run it with:
docker compose up
It will start the database. Then, you can run a sample application with:
npm run start
and play with Emmett.
I wanted to keep the sample experience straightforward and use local development/debugging as the default. Docker image build and run would be optional (we could call it “Erik mode”!).
Now, profiles come in handy here, as they enable exactly that. I just had to add:
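Something along these lines (the exact service definition is a sketch; only the profile name app matches the commands that follow):

```yaml
  app:
    build:
      dockerfile: Dockerfile
      context: .
    profiles: [app]
```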
Thanks to that, we’re saying that this service will only be used if we explicitly specify it on the command line. We can, for instance, build the image by running:
docker compose --profile app build
Or run both EventStoreDB and Emmett WebApi by calling:
docker compose --profile app up
And let’s stop here for a moment! Why both, if I specified only the app profile? In this case, Docker Compose will run the services from the specified profile AND all services that don’t have any profile specified. That’s quite neat, as we can define a set of default services (e.g. databases, messaging systems, etc.) and add others as optional. Ah, and you can specify multiple profiles, e.g.:
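To make that rule concrete, here’s a minimal sketch with hypothetical service names: postgres has no profiles entry, so it starts on every run, while app joins only when its profile is requested.

```yaml
services:
  postgres:   # no profiles entry: starts on every `docker compose up`
    image: postgres:15.1-alpine
  app:        # starts only with `docker compose --profile app up`
    build: .
    profiles: [app]
```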
docker compose --profile backend --profile frontend up
You can also group multiple services into a single profile. Why would you do it? Let’s go to…
Docker Profiles advanced scenario
In my Event Sourcing .NET samples repository, I’m trying to cover multiple aspects, tools, and ways to build Event Sourcing, CQRS and Event-Driven systems. I’m using:
Marten (so Postgres) and EventStoreDB as example event stores,
Postgres and Elasticsearch as read model stores,
Kafka as a messaging system to show the integration between services,
UI tools like PgAdmin and Kafka UI to make it easier to investigate sample data.
Multiple samples are using those services in various configurations.
Initially, I kept multiple Docker Compose files for:
default configuration with all services,
continuous integration pipeline configuration without UI components, as they’re not needed for tests. They’d just eat resources and make pipeline runs longer. It also doesn’t include Kafka, as I’m just testing the inner modules’ functionality,
sample web API Docker Image build (similar to the one explained above),
only containing Postgres-related configurations,
accordingly, only with EventStoreDB,
etc.
I’m sure you can relate that to your own projects. Now, how can Docker Compose Profiles help us with that? They can help us merge multiple configurations into one and make it easier to manage version updates, etc.
Let’s see the config I ended up with and then explain the reasoning. I’ll trim the detailed service configuration; you can check the whole file here.
version: "3"
services:
  #######################################################
  # Postgres
  #######################################################
  postgres:
    profiles: [ postgres, postgres-all, all, all-no-ui, ci ]
    image: postgres:15.1-alpine
    # (...) rest of the config
  pgadmin:
    profiles: [ postgres-ui, postgres-all, all ]
    image: dpage/pgadmin4
    # (...) rest of the config
  jaeger:
    image: jaegertracing/all-in-one:latest
    profiles: [ otel, otel-all, all ]
    # (...) rest of the config
  #######################################################
  # EventStoreDB
  #######################################################
  eventstore.db:
    image: eventstore/eventstore:23.10.0-bookworm-slim
    profiles: [ eventstoredb, eventstoredb-all, all, all-no-ui, ci ]
    # (...) rest of the config
  #######################################################
  # Elastic Search
  #######################################################
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.2
    profiles: [ elastic, elastic-all, all, all-no-ui, ci ]
    # (...) rest of the config
  kibana:
    image: docker.elastic.co/kibana/kibana:8.13.2
    profiles: [ elastic-ui, elastic-all, all ]
    # (...) rest of the config
  #######################################################
  # Kafka
  #######################################################
  kafka:
    image: confluentinc/confluent-local:7.6.1
    profiles: [ kafka, kafka-all, all, all-no-ui ]
    # (...) rest of the config
  init-kafka:
    image: confluentinc/confluent-local:7.6.1
    profiles: [ kafka, kafka-all, all, all-no-ui ]
    command: "#shell script to setup Kafka topics"
    # (...) rest of the config
  #######################################################
  # Schema Registry
  #######################################################
  schema_registry:
    image: confluentinc/cp-schema-registry:7.6.1
    profiles: [ kafka-ui, kafka-all, all ]
    # (...) rest of the config
  kafka_topics_ui:
    image: provectuslabs/kafka-ui:latest
    profiles: [ kafka-ui, kafka-all, all ]
    depends_on:
      - kafka
    # (...) rest of the config
  #######################################################
  # Test Backend Service
  #######################################################
  backend:
    build:
      dockerfile: Dockerfile
      context: .
    profiles: [ build ]
    # (...) rest of the config
# (...) Network and Volumes config
As you can see, we have a few general profiles:
postgres
elastic
kafka
eventstoredb
otel
build
They group the needed tooling containers.
Each of them has additional profiles with suffixes:
{profile}-all (e.g. postgres-all) - will start all the needed tooling containers plus supportive ones like the UI,
{profile}-all-no-ui - will start just the needed tooling without UI components. There’s no {profile}-all-ui, as starting a UI without the actual components doesn’t make sense.
I also defined additional profiles:
all - that’ll run all components,
ci - only components needed for the CI pipeline (so no UI and Kafka).
So by default, if I don’t mind my RAM being eaten by all containers, I’d run:
docker compose --profile all up
If I’d like to run the Marten sample with Elasticsearch read models, I could just run:
docker compose --profile postgres --profile elastic up
In the CI, I can run:
docker compose --profile ci up
It’s important to find a balance and conventions for profile names. If you have too many of them, it’ll be challenging for people to memorise them all. That’s why grouping them and adopting standard conventions can be helpful. We should always consider the intended usage and make it accessible. I could potentially also provide profiles for dedicated samples.
p.s. Ukraine is still under brutal Russian invasion. A lot of Ukrainian people are hurt, without shelter and need help. You can help in various ways, for instance, directly helping refugees, spreading awareness, putting pressure on your local government or companies. You can also support Ukraine by donating e.g. to Red Cross, Ukraine humanitarian organisation or donate Ambulances for Ukraine.