Building Scalable and Maintainable .NET Microservices with Kubernetes, Kafka, and ELK Stack: Product and Order Microservices

DotNet Full Stack Dev · Aug 22, 2024
In the world of modern software development, microservices have become a prominent architectural style thanks to their scalability, flexibility, and maintainability. By decomposing applications into smaller, independent services, developers can build, deploy, and scale each service separately. In this blog, we’ll walk through the process of creating Product and Order microservices using .NET Core, integrated with Kubernetes for orchestration, Docker for containerization, Kafka for messaging, Datadog and Kibana for monitoring and logging, and MS SQL for database management. We’ll also discuss how to schedule tasks with an AWS cron job (Lambda triggered by CloudWatch Events).

Embark on a journey of continuous learning and exploration with DotNet-FullStack-Dev. Uncover more by visiting https://dotnet-fullstack-dev.blogspot.com, or reach out for further information.

Prerequisites

Before diving into the implementation, ensure you have the following tools installed:

  • .NET Core SDK: To build and run .NET Core applications.
  • Docker: For containerizing the microservices.
  • Kubernetes: To manage and orchestrate the microservices.
  • Kafka: For message streaming between microservices.
  • Datadog: For monitoring application performance and health.
  • Kibana: For visualizing logs and tracing.
  • MS SQL Server: As the database for the microservices.
  • AWS: To set up cron jobs for scheduled tasks.

Step 1: Setting Up the Microservices Architecture

1.1 Define the Microservices

We’ll create two microservices: ProductService and OrderService.

  • ProductService: Manages the product catalog, including CRUD operations.
  • OrderService: Handles order processing, including placing and tracking orders.

Each service will have its own database schema in MS SQL.
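
As a sketch of what that could look like for ProductService (the entity, context, and connection-string names are assumptions, and the snippet assumes the Microsoft.EntityFrameworkCore.SqlServer package):

using Microsoft.EntityFrameworkCore;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; } = string.Empty;
    public decimal Price { get; set; }
}

public class ProductDbContext : DbContext
{
    public ProductDbContext(DbContextOptions<ProductDbContext> options) : base(options) { }

    public DbSet<Product> Products => Set<Product>();
}

// In Program.cs, point the context at MS SQL (connection string name "ProductDb" is assumed):
// builder.Services.AddDbContext<ProductDbContext>(options =>
//     options.UseSqlServer(builder.Configuration.GetConnectionString("ProductDb")));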

1.2 Containerize the Services with Docker

Docker allows us to package the microservices along with their dependencies into containers.

1. Create Dockerfiles for Both Microservices:

ProductService Dockerfile:

FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
WORKDIR /app
EXPOSE 80

FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
WORKDIR /src
COPY ["ProductService/ProductService.csproj", "ProductService/"]
RUN dotnet restore "ProductService/ProductService.csproj"
COPY . .
WORKDIR "/src/ProductService"
RUN dotnet build "ProductService.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "ProductService.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "ProductService.dll"]

OrderService Dockerfile:

FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
WORKDIR /app
EXPOSE 80

FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
WORKDIR /src
COPY ["OrderService/OrderService.csproj", "OrderService/"]
RUN dotnet restore "OrderService/OrderService.csproj"
COPY . .
WORKDIR "/src/OrderService"
RUN dotnet build "OrderService.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "OrderService.csproj" -c Release -o /app/publish

FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "OrderService.dll"]

2. Build Docker Images:

Run the following commands in the terminal:

docker build -t productservice -f ProductService/Dockerfile .
docker build -t orderservice -f OrderService/Dockerfile .
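
The Kubernetes manifests below reference these images by name, so if your cluster cannot see locally built images, tag and push them to a registry it can pull from (the registry name here is a placeholder):

docker tag productservice YOUR_REGISTRY/productservice:latest
docker push YOUR_REGISTRY/productservice:latest
docker tag orderservice YOUR_REGISTRY/orderservice:latest
docker push YOUR_REGISTRY/orderservice:latest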

Step 2: Deploying Microservices with Kubernetes

Kubernetes allows us to orchestrate the deployment, scaling, and management of containerized applications.

  1. Create Kubernetes Deployment YAML Files:

ProductService Deployment:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: productservice-deployment
spec:
  replicas: 2
  selector:
    matchLabels:
      app: productservice
  template:
    metadata:
      labels:
        app: productservice
    spec:
      containers:
        - name: productservice
          image: productservice:latest
          ports:
            - containerPort: 80

OrderService Deployment:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: orderservice-deployment
spec:
  replicas: 2
  selector:
    matchLabels:
      app: orderservice
  template:
    metadata:
      labels:
        app: orderservice
    spec:
      containers:
        - name: orderservice
          image: orderservice:latest
          ports:
            - containerPort: 80

2. Deploy the Services:

Use the following commands to deploy the services to Kubernetes:

kubectl apply -f productservice-deployment.yaml
kubectl apply -f orderservice-deployment.yaml
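
The Deployments alone don’t expose the pods to other services. A minimal ClusterIP Service for ProductService could look like this (a sketch; OrderService would need an equivalent manifest), applied with kubectl apply -f productservice-service.yaml:

apiVersion: v1
kind: Service
metadata:
  name: productservice
spec:
  selector:
    app: productservice
  ports:
    - port: 80
      targetPort: 80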

Step 3: Implementing Messaging with Kafka

Kafka is used for communication between microservices, enabling real-time data streaming.

  1. Set Up Kafka and Zookeeper:

Start Kafka and Zookeeper using Docker:

docker-compose -f kafka-docker-compose.yml up -d
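
The kafka-docker-compose.yml file referenced above isn’t shown in the original post; a minimal single-broker sketch using the Confluent images (image tags, ports, and settings are assumptions suited only for local development) might look like:

version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1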

2. Integrate Kafka with .NET Core Microservices:

Add Kafka dependencies to both services:

dotnet add package Confluent.Kafka

Kafka Producer in ProductService:

using Confluent.Kafka;

var config = new ProducerConfig { BootstrapServers = "localhost:9092" };

using var producer = new ProducerBuilder<Null, string>(config).Build();
// Publish a simple notification to the topic consumed by OrderService.
await producer.ProduceAsync("product-topic", new Message<Null, string> { Value = "New Product Added" });
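
In practice you would usually publish a structured event rather than a plain string. A hedged sketch, assuming a hypothetical product-created payload serialized with System.Text.Json:

using System.Text.Json;
using Confluent.Kafka;

// Hypothetical event payload for illustration.
var productCreated = new { ProductId = 42, Name = "Sample Product" };
var payload = JsonSerializer.Serialize(productCreated);

var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
using var producer = new ProducerBuilder<string, string>(config).Build();

// Keying by product id keeps all events for one product on the same partition.
await producer.ProduceAsync("product-topic",
    new Message<string, string> { Key = productCreated.ProductId.ToString(), Value = payload });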

Kafka Consumer in OrderService:

using Confluent.Kafka;

var config = new ConsumerConfig
{
    GroupId = "order-service",
    BootstrapServers = "localhost:9092",
    AutoOffsetReset = AutoOffsetReset.Earliest
};

using var consumer = new ConsumerBuilder<Null, string>(config).Build();
consumer.Subscribe("product-topic");

while (true)
{
    // Consume blocks until the next message is available.
    var consumeResult = consumer.Consume();
    Console.WriteLine($"Consumed message: {consumeResult.Message.Value}");
}
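
Inside an ASP.NET Core service, a loop like this is typically hosted in a BackgroundService so it starts and stops with the application. A minimal sketch (the class name and registration are assumptions, not part of the original OrderService):

using System;
using System.Threading;
using System.Threading.Tasks;
using Confluent.Kafka;
using Microsoft.Extensions.Hosting;

public class ProductEventsConsumer : BackgroundService
{
    protected override Task ExecuteAsync(CancellationToken stoppingToken) =>
        Task.Run(() =>
        {
            var config = new ConsumerConfig
            {
                GroupId = "order-service",
                BootstrapServers = "localhost:9092",
                AutoOffsetReset = AutoOffsetReset.Earliest
            };

            using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
            consumer.Subscribe("product-topic");

            try
            {
                while (!stoppingToken.IsCancellationRequested)
                {
                    // Consume blocks until a message arrives or the token is cancelled.
                    var result = consumer.Consume(stoppingToken);
                    Console.WriteLine($"Consumed message: {result.Message.Value}");
                }
            }
            catch (OperationCanceledException)
            {
                // Application is shutting down.
            }
            finally
            {
                // Leave the consumer group cleanly so partitions are rebalanced promptly.
                consumer.Close();
            }
        }, stoppingToken);
}

// Registered in Program.cs with: builder.Services.AddHostedService<ProductEventsConsumer>();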

Step 4: Monitoring and Logging with Datadog and Kibana

Monitoring and logging are crucial for ensuring the health and performance of microservices.

4.1 Monitoring with Datadog

Datadog provides real-time monitoring of your applications.

Install Datadog Agent:

Run the Datadog Agent in your Kubernetes cluster:

helm repo add datadog https://helm.datadoghq.com
helm repo update
helm install datadog-agent -f datadog-values.yaml datadog/datadog

Integrate Datadog in .NET Core:

Add Datadog tracing to your services:

dotnet add package Datadog.Trace

Configure Datadog:

Datadog’s .NET automatic instrumentation is configured primarily through environment variables rather than code. Tag each service with DD_SERVICE, DD_ENV, and DD_VERSION, and set DD_LOGS_INJECTION=true so trace IDs are injected into your logs and traces can be correlated with them.
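
As a sketch, the relevant tags can be set on the container in the Kubernetes Deployment (the values below are placeholders):

          env:
            - name: DD_SERVICE
              value: productservice
            - name: DD_ENV
              value: production
            - name: DD_LOGS_INJECTION
              value: "true"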

4.2 Logging with Kibana

Kibana provides powerful visualization capabilities for logs.

Set Up ELK Stack:

Deploy the ELK Stack (Elasticsearch, Logstash, Kibana) to your Kubernetes cluster using Elastic’s official Helm charts (the old stable/ charts are deprecated):

helm repo add elastic https://helm.elastic.co
helm install elasticsearch elastic/elasticsearch
helm install logstash elastic/logstash
helm install kibana elastic/kibana

Configure Log Shipping:

Use Serilog with the Elasticsearch sink to ship logs to Elasticsearch (requires the Serilog and Serilog.Sinks.Elasticsearch packages):

using Serilog;
using Serilog.Sinks.Elasticsearch;

var logger = new LoggerConfiguration()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        AutoRegisterTemplate = true,
    })
    .CreateLogger();
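
To have ASP.NET Core actually write through this pipeline, hook the logger into the host in Program.cs (a sketch that assumes the Serilog.AspNetCore package):

using Serilog;

var builder = WebApplication.CreateBuilder(args);

// Route framework and application logs through the Serilog pipeline configured above.
builder.Host.UseSerilog(logger);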

Step 5: Scheduling Tasks with AWS Cron Jobs

In a microservices architecture, certain tasks may need to run on a schedule. AWS provides two common ways to do this: AWS Lambda triggered by CloudWatch Events, or Amazon ECS Scheduled Tasks. Here, we’ll focus on using AWS Lambda in combination with CloudWatch Events to create a cron job for scheduling tasks.

1. Create a Lambda Function in .NET Core

First, you’ll need to create a new AWS Lambda function in .NET Core. This function will contain the logic you want to execute on a schedule.

using Amazon.Lambda.Core;
using Amazon.Lambda.Serialization.SystemTextJson;
using System.Threading.Tasks;

[assembly: LambdaSerializer(typeof(DefaultLambdaJsonSerializer))]

namespace ScheduledTaskLambda
{
    public class Function
    {
        public async Task FunctionHandler()
        {
            // Your scheduled task logic here
            await Task.Run(() =>
            {
                // For example, clean up old records from the database
                CleanupOldRecords();
            });
        }

        private void CleanupOldRecords()
        {
            // Logic to clean up old records from the database
        }
    }
}

2. Deploy the Lambda Function

Next, deploy this Lambda function to AWS. You can use the AWS CLI, AWS Toolkit for Visual Studio, or AWS Management Console for deployment. Here is an example using the AWS CLI:

dotnet publish -c Release -o ./publish
cd publish
zip -r ../function.zip .
cd ..
aws lambda create-function --function-name ScheduledTaskLambda --zip-file fileb://function.zip --handler ScheduledTaskLambda::ScheduledTaskLambda.Function::FunctionHandler --runtime dotnet6 --role arn:aws:iam::YOUR_ACCOUNT_ID:role/YOUR_LAMBDA_ROLE

3. Schedule the Lambda Function using CloudWatch Events

To schedule the Lambda function using CloudWatch Events, follow these steps:

  • Step 1: Navigate to the CloudWatch service in the AWS Management Console.
  • Step 2: Under Events, choose Rules, and then click Create Rule.
  • Step 3: In the Event Source section, choose Schedule and define your cron expression (e.g., cron(0 12 * * ? *) to run the job every day at noon UTC).
  • Step 4: In the Targets section, choose Lambda Function and select the Lambda function you deployed earlier.

Example cron expression:

cron(0 12 * * ? *) // Every day at 12:00 PM UTC
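
The same schedule can also be created from the CLI instead of the console (the rule name, region, and ARNs below are placeholders):

aws events put-rule --name daily-noon-cleanup --schedule-expression "cron(0 12 * * ? *)"
aws lambda add-permission --function-name ScheduledTaskLambda --statement-id cloudwatch-events \
  --action lambda:InvokeFunction --principal events.amazonaws.com \
  --source-arn arn:aws:events:YOUR_REGION:YOUR_ACCOUNT_ID:rule/daily-noon-cleanup
aws events put-targets --rule daily-noon-cleanup \
  --targets "Id"="1","Arn"="arn:aws:lambda:YOUR_REGION:YOUR_ACCOUNT_ID:function:ScheduledTaskLambda"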

4. Testing the Scheduled Task

Once you’ve deployed and scheduled the Lambda function, you can monitor its execution via AWS CloudWatch Logs. The logs will help you verify that your scheduled task is running correctly.

Example of how you might write logs inside the Lambda function:

using Amazon.Lambda.Core;
using Amazon.Lambda.Serialization.SystemTextJson;
using System.Threading.Tasks;

[assembly: LambdaSerializer(typeof(DefaultLambdaJsonSerializer))]

namespace ScheduledTaskLambda
{
    public class Function
    {
        public async Task FunctionHandler()
        {
            LambdaLogger.Log("Scheduled Task Started.");

            await Task.Run(() =>
            {
                CleanupOldRecords();
            });

            LambdaLogger.Log("Scheduled Task Completed.");
        }

        private void CleanupOldRecords()
        {
            // Logic to clean up old records from the database
        }
    }
}

Conclusion

In this blog, we’ve walked through the process of creating and deploying Product and Order microservices using .NET Core. We’ve leveraged Docker for containerization, Kubernetes for orchestration, Kafka for messaging, Datadog and Kibana for monitoring and logging, MS SQL for database management, and AWS Lambda with CloudWatch Events for scheduled tasks. By combining these technologies, we’ve built a robust and scalable microservices architecture capable of handling real-world production workloads. This approach ensures that our microservices are not only scalable and maintainable but also easy to monitor, log, and manage.
