Learn how to create a function triggered when data is added to or changed in Azure Cosmos DB. To learn more about Azure Cosmos DB, see Azure Cosmos DB: Serverless database computing using Azure Functions.
To complete this tutorial:
Note
Azure Cosmos DB bindings are only supported for use with the SQL API. For all other Azure Cosmos DB APIs, you should access the database from your function by using the static client for your API, including Azure Cosmos DB's API for MongoDB, Cassandra API, Gremlin API, and Table API.
You must have an Azure Cosmos DB account that uses the SQL API before you create the trigger.
Sign in to the Azure portal.
Select Create a resource > Databases > Azure Cosmos DB.
On the Create Azure Cosmos DB Account page, enter the basic settings for the new Azure Cosmos account.
Setting | Value | Description |
---|---|---|
Subscription | Subscription name | Select the Azure subscription that you want to use for this Azure Cosmos account. |
Resource Group | Resource group name | Select a resource group, or select Create new, then enter a unique name for the new resource group. |
Account Name | Enter a unique name | Enter a name to identify your Azure Cosmos account. Because documents.azure.com is appended to the ID that you provide to create your URI, use a unique ID. The ID can only contain lowercase letters, numbers, and the hyphen (-) character. It must be between 3-31 characters in length. |
API | Core (SQL) | The API determines the type of account to create. Azure Cosmos DB provides five APIs: Core (SQL) and MongoDB for document data, Gremlin for graph data, Azure Table, and Cassandra. Currently, you must create a separate account for each API. Select Core (SQL) to create a document database and query by using SQL syntax. Learn more about the SQL API. |
Location | Select the region closest to your users | Select a geographic location to host your Azure Cosmos DB account. Use the location that is closest to your users to give them the fastest access to the data. |
Select Review + create. You can skip the Network and Tags sections.
Review the account settings, and then select Create. It takes a few minutes to create the account. Wait for the portal page to display Your deployment is complete.
Select Go to resource to go to the Azure Cosmos DB account page.
Select the Create a resource button found on the upper left-hand corner of the Azure portal, then select Compute > Function App.
Use the function app settings as specified in the following table.
Setting | Suggested value | Description |
---|---|---|
App name | Globally unique name | Name that identifies your new function app. Valid characters are a-z, 0-9, and -. |
Subscription | Your subscription | The subscription under which this new function app is created. |
Resource Group | myResourceGroup | Name for the new resource group in which to create your function app. |
OS | Windows | Serverless hosting on Linux is currently in preview. For more information, see this considerations article. |
Hosting plan | Consumption plan | Hosting plan that defines how resources are allocated to your function app. In the default Consumption Plan, resources are added dynamically as required by your functions. In this serverless hosting, you only pay for the time your functions run. When you run in an App Service plan, you must manage the scaling of your function app. |
Location | West Europe | Choose a region near you or near other services your functions access. |
Runtime stack | Preferred language | Choose a runtime that supports your favorite function programming language. Choose .NET for C# and F# functions. |
Storage | Globally unique name | Create a storage account used by your function app. Storage account names must be between 3 and 24 characters in length and may contain numbers and lowercase letters only. You can also use an existing account, which must meet the storage account requirements. |
Application Insights | Default | Creates an Application Insights resource of the same App name in the nearest supported region. By expanding this setting, you can change the New resource name or choose a different Location in an Azure geography where you want to store your data. |
Select Create to provision and deploy the function app.
Select the Notification icon in the upper-right corner of the portal and watch for the Deployment succeeded message.
Select Go to resource to view your new function app. You can also select Pin to dashboard. Pinning makes it easier to return to this function app resource from your dashboard.
Next, you create a function in the new function app.
Expand your function app and click the + button next to Functions. If this is the first function in your function app, select In-portal then Continue. Otherwise, go to step three.
Choose More templates then Finish and view templates.
In the search field, type cosmos and then choose the Azure Cosmos DB trigger template.
If prompted, select Install to install the Azure Cosmos DB extension in the function app. After installation succeeds, select Continue.
Configure the new trigger with the settings as specified in the following table.
Setting | Suggested value | Description |
---|---|---|
Name | Default | Use the default function name suggested by the template. |
Azure Cosmos DB account connection | New setting | Select New, then choose your Subscription, the Database account you created earlier, and Select. This creates an application setting for your account connection. This setting is used by the binding to connect to the database. |
Collection name | Items | Name of collection to be monitored. |
Create lease collection if it doesn't exist | Checked | The collection doesn't already exist, so create it. |
Database name | Tasks | Name of database with the collection to be monitored. |
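For reference, the portal stores these settings in the function's function.json file. A minimal sketch of what the resulting trigger binding could look like is shown below; the binding name `documents` and the connection setting name are assumptions, not values the portal necessarily generates:

```json
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "connectionStringSetting": "my-cosmos-connection",
      "databaseName": "Tasks",
      "collectionName": "Items",
      "leaseCollectionName": "leases",
      "createLeaseCollectionIfNotExists": true
    }
  ]
}
```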
Click Create to create your Azure Cosmos DB triggered function. After the function is created, the template-based function code is displayed.
This function template writes the number of documents and the first document ID to the logs.
Next, you connect to your Azure Cosmos DB account and create the Items collection in the Tasks database.
Open a second instance of the Azure portal in a new tab in the browser.
On the left side of the portal, expand the icon bar, type cosmos in the search field, and select Azure Cosmos DB.
Choose your Azure Cosmos DB account, then select the Data Explorer.
In Data Explorer, select New Collection.
In Add Collection, use the settings shown in the following table.
Setting | Suggested value | Description |
---|---|---|
Database ID | Tasks | The name for your new database. This must match the name defined in your function binding. |
Collection ID | Items | The name for the new collection. This must match the name defined in your function binding. |
Storage capacity | Fixed (10 GB) | Use the default value. This value is the storage capacity of the database. |
Throughput | 400 RU | Use the default value. If you want to reduce latency, you can scale up the throughput later. |
Partition key | /category | A partition key that distributes data evenly to each partition. Selecting the correct partition key is important in creating a performant collection. |
Click OK to create the Items collection. It may take a short time for the collection to get created.
After the collection specified in the function binding exists, you can test the function by adding documents to this new collection.
Expand the new Items collection in Data Explorer, choose Documents, then select New Document.
Replace the contents of the new document with the following content, then choose Save.
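The original sample content is not reproduced here; a minimal document consistent with this quickstart (the task1 ID that the logs reference, plus a category field matching the /category partition key) could look like:

```json
{
  "id": "task1",
  "category": "general",
  "description": "some task"
}
```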
Switch to the first browser tab that contains your function in the portal. Expand the function logs and verify that the new document has triggered the function. See that the task1 document ID value is written to the logs.
(Optional) Go back to your document, make a change, and click Update. Then, go back to the function logs and verify that the update has also triggered the function.
Other quick starts in this collection build upon this quick start. If you plan to work with subsequent quick starts, tutorials, or with any of the services you have created in this quick start, do not clean up the resources.
Resources in Azure refer to function apps, functions, storage accounts, and so forth. They are grouped into resource groups, and you can delete everything in a group by deleting the group.
You created resources to complete these quickstarts. You may be billed for these resources, depending on your account status and service pricing. If you don't need the resources anymore, here's how to delete them:
In the Azure portal, go to the Resource group page.
To get to that page from the function app page, select the Overview tab and then select the link under Resource group.
To get to that page from the dashboard, select Resource groups, and then select the resource group that you used for this quickstart.
In the Resource group page, review the list of included resources, and verify that they are the ones you want to delete.
Select Delete resource group, and follow the instructions.
Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can also select the bell icon at the top of the page to view the notification.
You have created a function that runs when a document is added or modified in your Azure Cosmos DB collection. For more information about Azure Cosmos DB triggers, see Azure Cosmos DB bindings for Azure Functions.
Now that you have created your first function, let's add an output binding to the function that writes a message to a Storage queue.
The Amazon S3 service is used for file storage, where you can upload or remove files. We can trigger AWS Lambda from S3 when there are file uploads in S3 buckets. AWS Lambda has a handler function which acts as the starting point for the AWS Lambda function. The handler receives the details of the event. In this chapter, let us see how to use AWS S3 to trigger an AWS Lambda function when we upload files to an S3 bucket.
To start using AWS Lambda with Amazon S3, we need the following − an S3 bucket, an IAM role that grants Lambda access to S3 and CloudWatch, and a Lambda function with an S3 trigger configured.
Let us see these steps with the help of an example which shows the basic interaction between Amazon S3 and AWS Lambda.
The user will upload a file to the Amazon S3 bucket.
Once the file is uploaded, it will trigger the AWS Lambda function in the background, which will display an output in the form of a console message saying that the file is uploaded.
The user will be able to see the message in the CloudWatch logs once the file is uploaded.
The block diagram that explains the flow of the example is shown here −
Let us start by creating an S3 bucket in the AWS console using the steps given below −
Go to Amazon services and click S3 in the Storage section as highlighted in the image given below −
Click S3 storage and Create bucket which will store the files uploaded.
Once you click Create bucket button, you can see a screen as follows −
Enter the Bucket name, select the Region, and click the Create button at the bottom left. Thus, we have created a bucket named workingwithlambdaands3.
Now, click the bucket name and it will ask you to upload files as shown below −
Thus, we are done with bucket creation in S3.
To create a role that works with S3 and Lambda, please follow the steps given below −
Go to AWS services and select IAM as shown below −
Now, click IAM -> Roles as shown below −
Now, click Create role and choose the service that will use this role. Select Lambda and click the Permissions button.
Add the required permissions and click Review. Observe that the policies we have selected are AmazonS3FullAccess, AWSLambdaFullAccess and CloudWatchFullAccess.
Now, enter the Role name and Role description, and click the Create role button at the bottom.
Thus, our role named lambdawiths3service is created.
In this section, let us see how to create a Lambda function and add an S3 trigger to it. For this purpose, you will have to follow the steps given below −
Go to AWS Services and select Lambda as shown below −
Click Lambda and follow the process for adding Name. Choose the Runtime, Role etc. and create the function. The Lambda function that we have created is shown in the screenshot below −
Now let us add the S3 trigger.
Choose the trigger from above and add the details as shown below −
Select the bucket you created from the bucket dropdown. The event type has the following details −
Select Object Created (All), as we need AWS Lambda to trigger when a file is uploaded, removed, and so on.
You can add a Prefix and File pattern, which are used to filter the files added. For example, you could trigger Lambda only for .jpg images. Let us keep these blank for now, as we need to trigger Lambda for all uploaded files. Click the Add button to add the trigger.
You can find the trigger displayed for the Lambda function as shown below −
Let us add the details for the AWS Lambda function. Here, we will use the online editor to add our code and use Node.js as the runtime environment.
To have S3 trigger AWS Lambda, we will have to handle the S3 event in the code as shown below −
Note that the event parameter has the details of the S3 event. We log the bucket name and the file name, which appear in the logs when you upload an image to the S3 bucket.
Now, let us save the changes and test the Lambda function with an S3 upload.
Now, let us add the role, memory and timeout.
Now, save the Lambda function. Open S3 from Amazon services and open the bucket we created earlier, namely workingwithlambdaands3.
Upload the image in it as shown below −
Click the Upload button to add files as shown −
Click Add files to add files. You can also drag and drop the files. Then, click the Upload button.
Thus, we have uploaded one image in our S3 bucket.
To see the trigger details, go to AWS services and select CloudWatch. Open the logs for the Lambda function to view the output.
The output you can observe in CloudWatch is as shown −
The AWS Lambda function gets triggered when a file is uploaded to the S3 bucket, and the details are logged in CloudWatch.