In this article, I'll demonstrate how to build an Azure Functions project using either HTTP triggers or timer triggers.
We'll also dive deeper by implementing an Azure Function that retrieves content from Azure Blob Storage.
Let's dive in!
Before we begin, let me walk you through the key tools and technologies we'll be using:
1. Azure Function App ⚡️:
This is a container for one or more serverless functions. Each function is a small piece of code that responds to events (HTTP calls, blob changes, timers, etc.) and can bind to other services.
2. Azure Storage Account ☁️:
Azure Storage is a service for highly available and massively scalable cloud storage.
3. Application Insights 🔍:
Azure Application Insights is a telemetry service that gives deep visibility into your live application.
4. C#/.NET 🛠:
C# is an object-oriented language developed by Microsoft. .NET is a cross-platform runtime and framework for building applications in C# (and other languages).
Project Architecture Diagram
Together, this infrastructure lets us store content, retrieve it, monitor the process, and automate it.
Prerequisites
You'll need the following:
An Azure account (a free trial account is adequate)
Azure Functions Core Tools (optional)
A code editor (e.g. Visual Studio Code)
The Azure extension for your code editor
Let's build.
Note: Screenshots may show different resource names (for instance, the storage account name) because I built this project at different times and took the screenshots as I went. Follow the steps to the letter and you'll get there, trust me 😉
Step One : Create an Azure Storage Account
Create a new resource group or add to an existing resource group
Select LRS (GRS is recommended for production purposes).
Select Review + create
Wait for Deployment
Select the newly created Storage Account > Select Security + networking > Select Access keys > Copy the connection string (from either key) and paste it into your notes.
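If you prefer the command line, you can also fetch the connection string with the Azure CLI. This is a sketch; substitute your own storage account and resource group names for the placeholders:

```shell
# Placeholders: replace with your own resource names before running
az storage account show-connection-string \
  --name <storage-account-name> \
  --resource-group <resource-group-name> \
  --output tsv
```

Either way, keep the connection string handy; we'll use it in local.settings.json shortly.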
Step Two: Create a Function App
- Select Flex Consumption Hosting Plan
Reasons:
Deploying your function to Azure directly from Visual Studio Code (with the publish feature, or from the command line using the Azure CLI) will fail on other plans, because this feature, known as "One deploy", is only supported on the Flex Consumption Plan.
For this project, we'll use a Flex Consumption Plan. In a production environment, however, I'd recommend a Consumption Plan to reduce costs. To deploy on a Consumption Plan, use Visual Studio. For more on deployment, visit Functions Deployment Technologies.
- Select the same region as storage account. In my case, "East US".
Reasons:
Reduced latency: every time your function runtime needs to read from or write to the Storage Account (for triggers, logs, etc.), its calls won't have to cross regions.
Cold-start impact: Flex Consumption Plans have pre-warmed instances, but cross-region calls can still add cold-start latency.
Select .NET as Runtime stack
Select Version 8 (LTS), isolated worker model
Select "Next : Storage >" and make sure the same storage account earlier created is selected.
Step Three: Create Function Project Locally
Open Code Editor (Visual Studio Code)
Select Terminal > New Terminal
Create a new folder and change directory into it, e.g.
mkdir functionproject
cd functionproject
Then create project with code below:
func init --worker-runtime dotnet-isolated --target-framework net8.0 --force
Once the project is created successfully, open the "local.settings.json" file.
Change the value of AzureWebJobsStorage to the connection string you copied earlier.
This connects your local functions project to your remote storage account so that calls between the two succeed.
It should look like this.
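For reference, here is a minimal sketch of what local.settings.json typically contains after the change. The account name and key below are placeholders; paste your own connection string:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<your-account>;AccountKey=<your-key>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}
```

Remember that local.settings.json is for local development only and should never be committed to source control, since it holds your storage account key.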
Step Four: Create an HTTP-triggered function
- Go back to your terminal and type:
func new --template "HTTP Trigger" --name "Echo"
At this point, we have successfully created an HTTP trigger 🥳 That was easy, right? Well, we're not done yet 😂 Let's make some modifications so the code does what we want. If you don't know any C#, that's not a problem; we'll only be making slight changes, nothing serious. However, if you're looking to learn the foundations of C#, I recommend FreeCodeCamp.
Alright, let's continue.
Open the "Echo.cs" file that you created earlier.
Delete all the content.
Paste the code below
using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;

namespace func
{
    public class Echo
    {
        private readonly ILogger<Echo> _logger;

        public Echo(ILogger<Echo> logger)
        {
            _logger = logger;
        }

        [Function("Echo")]
        public async Task<HttpResponseData> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequestData req)
        {
            _logger.LogInformation("C# HTTP trigger function processed a request.");

            var response = req.CreateResponse(HttpStatusCode.OK);
            response.Headers.Add("Content-Type", "text/plain; charset=utf-8");

            // Asynchronous read of the request body
            using var reader = new StreamReader(req.Body);
            string requestBody = await reader.ReadToEndAsync();

            await response.WriteStringAsync(requestBody);
            return response;
        }
    }
}
Let's explain some important parts of the code before you go on.
namespace func groups your code; you might have multiple function classes here.
public class Echo is the container for your function.
The ILogger<Echo> logger is injected by the Functions runtime, letting you write logs via _logger.LogInformation(), etc.
req.CreateResponse(HttpStatusCode.OK) instantiates a 200 OK HttpResponseData
response.Headers.Add(...) adds a "Content-Type: text/plain; charset=utf-8" header so clients treat the body as plain text.
new StreamReader(req.Body) wraps the incoming request stream.
ReadToEndAsync() asynchronously reads the entire body into requestBody.
WriteStringAsync(requestBody) writes that same text back into the response.
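To see the request-body read in isolation, here is a minimal, self-contained sketch of the same StreamReader pattern. EchoDemo and ReadBodyAsync are names made up for illustration; the point is that HttpRequestData.Body is just a Stream, so you can exercise the pattern with a MemoryStream:

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;

public static class EchoDemo
{
    // Mirrors what Echo does with req.Body: wrap the stream in a
    // StreamReader and read the whole body asynchronously.
    public static async Task<string> ReadBodyAsync(Stream body)
    {
        using var reader = new StreamReader(body);
        return await reader.ReadToEndAsync();
    }

    public static async Task Main()
    {
        // Simulate an incoming request body.
        using var body = new MemoryStream(Encoding.UTF8.GetBytes("Hello"));
        Console.WriteLine(await ReadBodyAsync(body)); // prints "Hello"
    }
}
```

Isolating the stream-handling logic like this also makes it easy to unit test without spinning up the Functions host.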
Make sure every using directive (e.g. using Microsoft.Azure.Functions.Worker.Http) has a corresponding package reference in your "func.csproj" file; note that namespace and package names don't always match exactly. If you ever spot one missing, add it from your terminal, for instance:
dotnet add package Microsoft.Azure.Functions.Worker.Extensions.Http
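As a rough sketch, the relevant part of func.csproj usually contains package references like the following. The version numbers are illustrative only; use whatever dotnet add package resolves for you:

```xml
<ItemGroup>
  <!-- Core isolated-worker packages; versions shown are examples only -->
  <PackageReference Include="Microsoft.Azure.Functions.Worker" Version="1.21.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Extensions.Http" Version="3.1.0" />
  <PackageReference Include="Microsoft.Azure.Functions.Worker.Sdk" Version="1.17.0" />
</ItemGroup>
```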
Also, make the changes below to your "Program.cs" file. We'll need it for the Azure Function that handles blob content.
using Microsoft.Azure.Functions.Worker.Configuration;
using Microsoft.Extensions.Hosting;
var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    // Application Insights isn't enabled by default. See https://aka.ms/AAt8mw4.
    // .ConfigureServices(s =>
    // {
    //     s.AddApplicationInsightsTelemetryWorkerService();
    //     s.ConfigureFunctionsApplicationInsights();
    // })
    .Build();

host.Run();
Alright we are good to go!
Let's test the HTTP trigger.
func start --build
- Open a terminal and run the following command to test a POST call against the URL shown in your terminal. It should look like this: http://localhost:7071/api/Echo
curl -X POST -i http://localhost:7071/api/echo -d "Hello"
You should have this response:
HTTP/1.1 200 OK
Content-Type: text/plain; charset=utf-8
Date: Fri, 16 May 2025 17:43:10 GMT
Server: Kestrel
Transfer-Encoding: chunked
Hello%
Amazing! You just successfully created a working HTTP Trigger! 🥂
Step Five: Create a Timer-triggered function
- Go to Terminal
func new --template "Timer trigger" --name "Recurring"
Open the "Recurring.cs" file
Replace "0 */5 * * * *" with "0 */1 * * * *" to set the recurring frequency interval to 1 minute rather than 5 minutes for quick testing.
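For context, the Functions timer trigger uses a six-field NCRONTAB expression of the form {second} {minute} {hour} {day} {month} {day-of-week}; the extra leading field for seconds is what distinguishes it from a standard five-field cron expression. A few examples:

```
0 */1 * * * *    every minute
0 */5 * * * *    every 5 minutes
0 0 * * * *      at the top of every hour
0 30 9 * * 1-5   at 9:30 AM, Monday through Friday
```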
Build and start function as before.
func start --build
- Observe the function's log output in the terminal. You'll notice it triggers every minute.
Now, for the final build. This is where we put everything we've learned to the test and see the practicality of what we've been building.
Step Six: Create an HTTP-triggered function that integrates with Azure Blob Storage
Go back to your Azure Portal
We need to upload content to our Storage Account. Go to Storage Account > Data Storage > Containers
Create new container
Name it and create it (you'll need the name in your code)
Select Upload
Find the "local.settings.json" file or any text file on your PC (for the purpose of this test, we'll use a readable file)
Upload file
Create an HTTP-triggered function
Repeat the steps from the earlier build, but name it "GetSettingInfo"
Add Azure Storage Blob extensions
dotnet add package Microsoft.Azure.Functions.Worker.Extensions.Storage --version 6.2.0
Open the "GetSettingInfo.cs" file
Edit it accordingly
using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;

namespace func
{
    public class GetSettingInfo
    {
        [Function("GetSettingInfo")]
        public HttpResponseData Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequestData req,
            FunctionContext context,
            [BlobInput("content/local.settings.json", Connection = "AzureWebJobsStorage")] string blobContent)
        {
            var logger = context.GetLogger<GetSettingInfo>();
            logger.LogInformation("C# HTTP trigger function processed a request.");
            logger.LogDebug("Blob content: {blob}", blobContent);

            var response = req.CreateResponse(HttpStatusCode.OK);
            response.Headers.Add("Content-Type", "application/json; charset=utf-8");
            response.WriteString(blobContent);
            return response;
        }
    }
}
In the BlobInput attribute, change "content/local.settings.json" to "<your-container-name>/<your-file-name>" so it matches the container and file you created earlier.
Build and start function
Test the function by using curl
Run the command to test the GET REST API call against http://localhost:7071/api/GetSettingInfo
curl -X GET -i http://localhost:7071/api/GetSettingInfo
- Observe the JSON content of the response; it should display the contents of the blob, like this:
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Date: Fri, 16 May 2025 18:24:43 GMT
Server: Kestrel
Transfer-Encoding: chunked
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=projstore234;AccountKey=<redacted>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}%
Deploy local function project to your Azure Function App
Install the Azure extension in your code editor if you haven't already.
Sign in to your Azure account.
Go to the "Workspace" section > hover over "Local Project" > select the Deploy icon.
NOTE: If you installed Azure Functions Core Tools, you can deploy from the command line instead:
az login
func azure functionapp publish <function-app-name> --dotnet-version 8.0
- Once deployment is complete, go to your Azure Portal > Function App
- Make sure your function app is running. Start it, if it isn't already.
- Select "GetSettingInfo"
- Select the "Code + Test" option
- In the HTTP method drop-down list, select GET
- Select RUN to test function.
You should see the 200 OK status code and blob content.
Use logs to monitor function activity.
Application Insights was selected by default when the Function App was created. You can use it for more advanced logging features (Live Metrics, alerts, etc.).
If Application Insights doesn't work, enable it in your "Program.cs" file with the code below (uncomment it if it already exists). You may also need a using Microsoft.Extensions.DependencyInjection; directive for these extension methods.
.ConfigureServices(s =>
{
    s.AddApplicationInsightsTelemetryWorkerService();
    s.ConfigureFunctionsApplicationInsights();
})
NOTE: Don't forget to stop your function app when you're done testing.
Congratulations! 👏🏾 You have successfully built your first function app.
Your Turn!
Have you built similar serverless config services? Drop a comment. You can also check out the sample code on GitHub: Azure Projects Repo
Lessons for Beginners
1. 🙅🏾♂️ No pain, no gain
As straightforward as this project is, it can take hours or even days to complete, depending on your willpower, but you know what? It's worth it! You learn by doing. 🔂 Doing it again and again will make you a subject matter expert in no time. Don't give up!
2. 💻 Having an Azure account / lab environment as a beginner is super important
You need to experiment, build, break and build again to really get it. Countless hours of debugging went into this project, but seeing it work in the Azure Portal was the cherry on top for me. If you don't see it work in a real environment, you might lose the drive to keep going.
3. 🔎 Find out why it worked, too
If you can't replicate it, you don't know it. Ask questions! Let the questions lead you to deeper research. Read up on what you don't know by all means; reading will aid your journey so much 📖. Even if you don't understand it all yet, let your mind become conversant with the terminology. It will eventually catch up.
4. 👩🏾💻 Learn a programming language
Python, JavaScript/TypeScript, Java and C#/.NET are the most-used languages for Azure integration and development. Knowing one gives you a richer experience working with cloud tools.
5. 🛠 One tool at a time
I fall victim to this a lot: I want to learn everything, but I end up knowing a bit of this and that, which eventually makes me "a jack of all trades but master of none". Start with one or two tools/services whose operations are linked together 🔗 (Azure Functions ⚡️, APIM). Master one or two important tools before you start exploring others. Yes, we need to know this and that tool to prove our expertise, but be patient ⏳ with what you're learning now, until you can beat your chest and say you can handle that tool in your sleep.