Updating Kubernetes deployment steps (Azure#180)
* Updating Kubernetes deployment step

Signed-off-by: MD Ashique <[email protected]>

* updating dapr configuration

Signed-off-by: MD Ashique <[email protected]>

---------

Signed-off-by: MD Ashique <[email protected]>
ASHIQUEMD authored Oct 13, 2023
1 parent c54767e commit e33adc9
Showing 8 changed files with 258 additions and 14 deletions.
6 changes: 3 additions & 3 deletions deploy/kubernetes/kubernetes-deployment.md
@@ -2,7 +2,7 @@

You can annotate your function app's Kubernetes deployments to include the Dapr sidecar.

> IMPORTANT: Port 3001 will only be exposed and listened on if a Dapr trigger is defined in the function app. When using Dapr, the sidecar will wait to receive a response from the defined port before completing instantiation. This means it is important to NOT define the `dapr.io/port` annotation or `--app-port` unless you have a trigger. Doing so may lock your application from the Dapr sidecar. Port 3001 does not need to be exposed or defined if only using input and output bindings.
> IMPORTANT: Port 3001 will only be exposed and listened on if a Dapr trigger is defined in the function app. When using Dapr, the sidecar will wait to receive a response from the defined port before completing instantiation. This means it is important to NOT define the `dapr.io/app-port` annotation or `--app-port` unless you have a trigger. Doing so may lock your application from the Dapr sidecar. Port 3001 does not need to be exposed or defined if only using input and output bindings.
To generate a Dockerfile for your app if you don't already have one, you can run the following command in your function project:
`func init --docker-only`.
@@ -55,9 +55,9 @@ spec:
        app: my-function
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "functionapp"
        dapr.io/app-id: "functionapp"
        # Only define port if Dapr triggers are included
        dapr.io/port: "3001"
        dapr.io/app-port: "3001"
    spec:
      containers:
      - name: my-function
172 changes: 172 additions & 0 deletions quickstarts/dotnet-isolated/README.md
@@ -0,0 +1,172 @@
# .NET Azure Function quickstart in .NET isolated mode

This quickstart demonstrates how to use the Azure Functions programming model to integrate with multiple Dapr components in .NET isolated mode. Please first go through the [Dapr quickstarts](https://github.com/dapr/quickstarts) to get some context on the various Dapr building blocks, and go through the Azure Functions [hello-world sample](https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-vs-code?pivots=programming-language-csharp) to familiarize yourself with the function programming model.
We'll be running a Dapr'd function app locally:

This quickstart contains the OrderService function app, which includes the following three Azure Functions:
- **OrderService** - An HTTP trigger function that internally performs [Dapr Service Invocation](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/service-invocation-overview/) using the Dapr invoke output binding.
- **CreateNewOrder** - A Dapr service invocation-enabled Azure Function that is invoked by OrderService and saves state to the state store using the [Dapr State Output Binding](https://docs.dapr.io/reference/api/state_api/#save-state).
- **RetrieveOrder** - An HTTP trigger function that uses the [Dapr State Input Binding](https://docs.dapr.io/reference/api/state_api/#get-state) to read from the state store.


## Prerequisites
This sample requires you to have the following installed on your machine:
- Setup Dapr: Follow instructions to [download and install the Dapr CLI](https://docs.dapr.io/getting-started/install-dapr-cli/) and [initialize Dapr](https://docs.dapr.io/getting-started/install-dapr-selfhost/).
- [Install the Azure Functions Core Tools](https://github.com/Azure/azure-functions-core-tools/blob/master/README.md#windows)

# Step 1 - Understand the Settings

Now that we've set up Dapr locally, clone the repo, then navigate to the dotnet-isolated quickstart:

```bash
git clone https://github.com/Azure/azure-functions-dapr-extension.git
cd azure-functions-dapr-extension
dotnet build --configfile nuget.config
cd quickstarts/dotnet-isolated
```

In this folder, you will find `local.settings.json`, which lists a few app settings used by the trigger/binding attributes.

```json
"StateStoreName": "statestore"
```
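
For context, a complete `local.settings.json` for this quickstart might look like the following (a sketch; `AzureWebJobsStorage` and other values depend on your environment):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "StateStoreName": "statestore"
  }
}
```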

Dapr components: This quickstart uses the default Dapr components (Redis state store) that are installed locally when you run `dapr init`.

You can find the default Dapr components at the following locations:

**Windows:**
```
C:\Users\<username>\.dapr
```
**Mac:**
```
/Users/<username>/.dapr
```
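
Inside the `components` folder at those locations, the default Redis state store component generated by `dapr init` typically looks like this (a sketch; exact contents can vary by Dapr version):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: statestore
spec:
  type: state.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
  - name: actorStateStore
    value: "true"
```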

# Step 2 - Run Function App with Dapr

Run the function host with Dapr. `dapr run -f .` uses the `dapr.yaml` multi-app run template in this folder:

Windows (requires Dapr 1.12+) or Linux/macOS (requires Dapr 1.11+):
```
dapr run -f .
```

The command should output Dapr logs that look like the following:

```
Starting Dapr with id functionapp. HTTP Port: 3501. gRPC Port: 55377
Updating metadata for app command: func host start
You're up and running! Both Dapr and your app logs will appear here.
...
```

> **Note**: there are three ports in this service. The `--app-port` (3001) is where the function host listens for Dapr triggers. The `--dapr-http-port` (3501) is where the Dapr HTTP API runs; a gRPC port is assigned as well. The function port (default 7071) is where the function host listens for any HTTP-triggered function using the `api/{functionName}` URL path. All of these ports are configurable.
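
A quick way to see the ports in action while the app is running (a sketch; the metadata endpoint is a standard Dapr API, and `RetrieveOrder` returns data only after an order has been created in Step 3):

```bash
# Function host port (7071): HTTP-triggered functions
curl http://localhost:7071/api/RetrieveOrder

# Dapr HTTP port (3501): Dapr's own APIs, e.g. the metadata endpoint
curl http://localhost:3501/v1.0/metadata
```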

# Step 3 - Understand the Sample

## 1. Service Invocation and State Management: Create New Order and Retrieve Order

Below is the HTTP trigger function that internally performs [Dapr Service Invocation](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/service-invocation-overview/) using the Dapr invoke output binding.

```csharp
[Function("OrderService")]
[DaprInvokeOutput(AppId = "{appId}", MethodName = "{methodName}", HttpVerb = "post")]
public static async Task<InvokeMethodParameters> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "invoke/{appId}/{methodName}")] HttpRequestData req,
    FunctionContext functionContext)
{
    var log = functionContext.GetLogger("OrderService");
    log.LogInformation("C# HTTP trigger function processed a request.");

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();

    // Print the received payload
    log.LogInformation($"Received Payload OrderService: {requestBody}");

    var outputContent = new InvokeMethodParameters
    {
        Body = requestBody
    };

    return outputContent;
}
```

Below, the `DaprServiceInvocationTrigger` is used to receive and handle the `CreateNewOrder` request. The function first logs that it was successfully triggered, then binds the content to a `JsonElement` object. The `DaprState` *output binding* persists the order into the state store by serializing the `JsonElement` into a state array format and posting it to `http://localhost:${daprPort}/v1.0/state/${stateStoreName}`.

```csharp
[Function("CreateNewOrder")]
[DaprStateOutput("%StateStoreName%", Key = "order")]
public static JsonElement Run(
    [DaprServiceInvocationTrigger] JsonElement payload,
    FunctionContext functionContext)
{
    var log = functionContext.GetLogger("CreateNewOrder");
    log.LogInformation("C# function processed a CreateNewOrder request from the Dapr Runtime.");

    // Print the received payload
    log.LogInformation($"Received Payload CreateNewOrder: {JsonSerializer.Serialize(payload)}");

    // Payload must be of the format { "data": { "value": "some value" } }
    payload.TryGetProperty("data", out JsonElement data);

    return data;
}
```
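
For reference, the body that the output binding posts to the Dapr state API looks roughly like the following (a sketch of the state array format; the exact envelope is constructed by the extension):

```json
[
  {
    "key": "order",
    "value": { "orderId": "42" }
  }
]
```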

Now you can invoke this function by using either the [test.http](test.http) file with your favorite REST client, or the Dapr CLI in a new command line terminal.


Windows PowerShell
```powershell
dapr invoke --app-id functionapp --method CreateNewOrder --data '{ \"data\": {\"value\": { \"orderId\": \"42\" } } }'
```
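
On Linux/macOS, the equivalent command without PowerShell quote escaping would be:

```bash
dapr invoke --app-id functionapp --method CreateNewOrder --data '{ "data": { "value": { "orderId": "42" } } }'
```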


In your terminal window, you should see logs indicating that the message was received and state was updated:

```
== APP == [TIMESTAMP] Executing 'Functions.CreateNewOrder' (Reason='(null)', Id=<ExecutionId>)
== APP == [TIMESTAMP] C# function processed a CreateNewOrder request from the Dapr Runtime.
== APP == [TIMESTAMP] Executed 'Functions.CreateNewOrder' (Succeeded, Id=<ExecutionId>, Duration=39ms)
```
----------------
To confirm that the state is now persisted, you can move on to the next function:

```csharp
[Function("RetrieveOrder")]
public static JsonElement Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "RetrieveOrder")] HttpRequestData req,
    [DaprStateInput("%StateStoreName%", Key = "order")] JsonElement data,
    FunctionContext functionContext)
{
    var log = functionContext.GetLogger("RetrieveOrder");
    log.LogInformation("C# function processed a RetrieveOrder request from the Dapr Runtime.");

    // Print the fetched state value
    log.LogInformation($"Retrieved order: {JsonSerializer.Serialize(data)}");

    return data;
}
```

You can use the GET request in [test.http](test.http) to read the data from the state store. The logs should look like the following:

```
== APP == [TIMESTAMP] Executing 'Functions.RetrieveOrder' (Reason='(null)', Id=<ExecutionId>)
== APP == [TIMESTAMP] {"orderId":"42"}
== APP == [TIMESTAMP] C# function processed a RetrieveOrder request from the Dapr Runtime.
== APP == [TIMESTAMP] Executed 'Functions.RetrieveOrder' (Succeeded, Id=<ExecutionId>, Duration=186ms)
```
## Follow the links below to deploy the function app to ACA and Kubernetes
### [Deploy Dapr enabled Function App to Azure Container Apps (ACA)](./deploy/aca/deploy-quickstart.bicep)

### [Deploy Dapr enabled Function App to Kubernetes](../../deploy/kubernetes/kubernetes-deployment.md)
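
For the ACA option above, the Bicep template can be deployed with the Azure CLI along these lines (a sketch; the resource group name and location are placeholders, and the template may require additional parameters):

```bash
az group create --name my-func-dapr-rg --location eastus
az deployment group create \
  --resource-group my-func-dapr-rg \
  --template-file ./deploy/aca/deploy-quickstart.bicep
```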
7 changes: 7 additions & 0 deletions quickstarts/dotnet-isolated/dapr.yaml
@@ -0,0 +1,7 @@
version: 1
apps:
  - appDirPath: .
    appID: functionapp
    appPort: 3001
    daprHTTPPort: 3501
    command: ["func", "start"]
13 changes: 13 additions & 0 deletions quickstarts/dotnet-isolated/test.http
@@ -0,0 +1,13 @@
POST http://localhost:7071/api/invoke/functionapp/CreateNewOrder
Content-Type: application/json

{
  "data": {
    "value": {
      "orderId": "42"
    }
  }
}
###

GET http://localhost:7071/api/RetrieveOrder
33 changes: 23 additions & 10 deletions samples/dotnet-azurefunction/README.md
Expand Up @@ -404,22 +404,24 @@ If you need a non-default namespace or in production environment, Helm has to be
```
helm repo add bitnami https://charts.bitnami.com/bitnami
helm repo update
kubectl create ns kafka
helm install dapr-kafka bitnami/kafka --wait --namespace kafka -f ./kafka-non-persistence.yaml
helm install dapr-kafka bitnami/kafka --wait -f ./kafka-non-persistence.yaml
```

- Run `kubectl -n kafka get pods -w` to see that the Kafka pods are running. This might take a few minutes, but you should see:
- Run `kubectl get pods -w` to see that the Kafka pods are running. This might take a few minutes, but you should see:
```
NAME READY STATUS RESTARTS AGE
dapr-kafka-0 1/1 Running 0 2m7s
dapr-kafka-zookeeper-0 1/1 Running 0 2m57s
dapr-kafka-zookeeper-1 1/1 Running 0 2m13s
dapr-kafka-zookeeper-2 1/1 Running 0 109s
dapr-kafka-controller-0 1/1 Running 0 53m
dapr-kafka-controller-1 1/1 Running 0 53m
dapr-kafka-controller-2 1/1 Running 0 53m
```
- Run `kubectl apply -f .\deploy\kafka.yaml` and observe that your Kafka component was successfully configured!
- Run `kubectl apply -f .\deploy\kafka-bindings.yaml` and observe that your Kafka bindings component was successfully configured!
```
component.dapr.io/sample-topic created
```
- Run `kubectl apply -f .\deploy\kafka-pubsub.yaml` and observe that your Kafka pub-sub component was successfully configured!
```
component.dapr.io/pubsub created
```
- Follow [secret management](https://docs.dapr.io/developing-applications/building-blocks/secrets/) instructions to securely manage your secrets in a production-grade application.
#### [Optional] Setting up the Pub/Sub in Kubernetes
@@ -477,8 +479,9 @@ spec:
        app: functionapp
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "functionapp"
        dapr.io/port: "<app-port>"
        dapr.io/app-id: "functionapp"
        # Only define port if Dapr triggers are included
        dapr.io/app-port: "<app-port>"
    spec:
      containers:
      - name: functionapp
@@ -512,6 +515,16 @@ dapr-sidecar-injector-675df889d5-22wxr 1/1 Running 0 10m
functionapp-6d4cc6b7f7-2p9n9 2/2 Running 0 8s
```

Run `kubectl get services functionapp` to see the public IP address; you can use this IP address to access HTTP-triggered functions.
```
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
functionapp LoadBalancer <external-ip> 80:32180/TCP 89m
```
You can use the external IP to invoke the Azure Function, as shown below:
```
curl --location 'http://<external-ip>/api/StateInputBinding'
```

## Test your Dapr Function App
Now let's try invoking our function. You can use `kubectl logs` to view the function app logs; use `--tail` to limit output to the last `n` lines.
14 changes: 14 additions & 0 deletions samples/dotnet-azurefunction/deploy/functionapp.yaml
@@ -16,6 +16,7 @@ spec:
      annotations:
        dapr.io/enabled: "true"
        dapr.io/app-id: "functionapp"
        # Only define port if Dapr triggers are included
        dapr.io/app-port: "3001"
    spec:
      containers:
@@ -31,3 +32,16 @@ spec:
          value: sample-topic
        - name: PubSubName
          value: pubsub
---
apiVersion: v1
kind: Service
metadata:
  name: functionapp
spec:
  selector:
    app: functionapp
  ports:
  - protocol: TCP
    port: 80
    targetPort: 80
  type: LoadBalancer
2 changes: 1 addition & 1 deletion samples/dotnet-azurefunction/deploy/kafka-bindings.yaml
@@ -8,7 +8,7 @@ spec:
  metadata:
  # Kafka broker connection setting
  - name: brokers
    value: dapr-kafka.kafka:9092
    value: "dapr-kafka.default.svc.cluster.local:9092" # localhost:9092 for local Kafka
  # consumer configuration: topic and consumer group
  - name: topics
    value: sample
25 changes: 25 additions & 0 deletions samples/dotnet-azurefunction/deploy/kafka-pubsub.yaml
@@ -0,0 +1,25 @@
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: pubsub
  namespace: default
spec:
  type: pubsub.kafka
  version: v1
  metadata:
  - name: brokers # Required. Kafka broker connection setting
    value: "dapr-kafka.default.svc.cluster.local:9092" # localhost:9092 for local Kafka
  - name: clientID # Optional. Used as client tracing ID by Kafka brokers.
    value: "my-dapr-app-id"
  - name: authType # Required.
    value: "password"
  - name: disableTls # Optional. Disables TLS. This is not safe for production! See the `Mutual TLS` section for how to use TLS.
    value: "true"
  - name: saslUsername
    value: "user1"
  - name: saslPassword
    # Required if authType is `password`. Make sure Kafka is installed in the default namespace; otherwise secretKeyRef will not work.
    # You can also provide the password value directly instead of using secretKeyRef (not recommended for production).
    # Read the Kafka password with: `kubectl get secret dapr-kafka-user-passwords -o jsonpath='{.data.system-user-password}' | base64 --decode`
    secretKeyRef:
      name: dapr-kafka-user-passwords
      key: system-user-password
