Adding Python v2 samples (Azure#126)
* Adding python v2 sample for state output binding

Signed-off-by: MD Ashique <[email protected]>

* Add samples for python v2 programming model

Signed-off-by: MD Ashique <[email protected]>

* removing route param

Signed-off-by: MD Ashique <[email protected]>

* Adding readme file for python v2 model samples

Signed-off-by: MD Ashique <[email protected]>

* Adding readme file for python v2 model samples

Signed-off-by: MD Ashique <[email protected]>

* Adding blueprint dapr function app

Signed-off-by: MD Ashique <[email protected]>

* Adding DaprInvokeOutput binding sample

Signed-off-by: MD Ashique <[email protected]>

* Add Java bindings for Dapr extension (Azure#131) (#11)

* Adding java library

* Adding java library

* Adding java library

* Adding java library

* Adding java library

* Adding java library build and release steps

* Modifying github workflow for java-library

* Update .github/workflows/build.yml

* Update .github/workflows/build.yml

* Update samples/dotnet-azurefunction/README.md

* Update samples/dotnet-azurefunction/README.md

* Update samples/dotnet-azurefunction/README.md

* Update samples/java-azurefunction/README.md

* Update java-library/src/main/java/com/microsoft/azure/functions/dapr/annotation/DaprBindingOutput.java

* Adding java library

* Adding java library

* Adding java library

* Adding java library

* Adding java library

* Adding java library

* Adding java library

* Adding java library

* Update .github/workflows/build.yml

* Update java-library/src/main/java/com/microsoft/azure/functions/dapr/annotation/DaprInvokeOutput.java

* Update java-library/src/main/java/com/microsoft/azure/functions/dapr/annotation/DaprInvokeOutput.java

* Update samples/java-azurefunction/README.md

* Update samples/java-azurefunction/README.md

* Update java-library/src/main/java/com/microsoft/azure/functions/dapr/annotation/DaprTopicTrigger.java

* Update java-library/src/main/java/com/microsoft/azure/functions/dapr/annotation/DaprBindingOutput.java

* Update java-library/src/main/java/com/microsoft/azure/functions/dapr/annotation/DaprTopicTrigger.java

* Adding java library

* Update java-library/src/main/java/com/microsoft/azure/functions/dapr/annotation/DaprBindingOutput.java

* Update docs/development/release-process.md

* Adding java library

---------

Signed-off-by: MD Ashique <[email protected]>
Co-authored-by: Shubham Sharma <[email protected]>

* move function app to individual blueprint

Signed-off-by: MD Ashique <[email protected]>

---------

Signed-off-by: MD Ashique <[email protected]>
Co-authored-by: Shubham Sharma <[email protected]>
ASHIQUEMD and shubham1172 authored Sep 20, 2023
1 parent 93fc0fc commit 422382f
Showing 17 changed files with 596 additions and 7 deletions.
16 changes: 9 additions & 7 deletions samples/python-azurefunction/README.md
@@ -1,6 +1,7 @@
# Python Azure Function Sample

-This tutorial will demonstrate how to use Azure Functions Python programming model to integrate with multiple Dapr components. Please first go through the [samples](https://github.com/dapr/samples) to get some contexts on various Dapr building blocks as well as go through Azure Functions [hello-world sample](https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-vs-code?pivots=programming-language-python) to familiarize with function programming model.
+This tutorial will demonstrate how to use the Azure Functions Python programming model to integrate with multiple Dapr components. Please first go through the [samples](https://github.com/dapr/samples) to get some context on the various Dapr building blocks, and go through the Azure Functions [hello-world sample](https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-function-vs-code?pivots=programming-language-python) to familiarize yourself with the function programming model. [Please explore Functions Dapr extension with Python V2 programming model samples for simplified development](https://github.com/Azure/azure-functions-dapr-extension/tree/master/samples/python-v2-azurefunctions).

We'll be running a Dapr'd function app locally:
1) Invoked by [Dapr Service Invocation](https://docs.dapr.io/developing-applications/building-blocks/service-invocation/service-invocation-overview/) and persist/retrieve state using [Dapr State Management](https://github.com/dapr/components-contrib/tree/master/state)
2) Publish/consume message on a specific topic powered by [Dapr pub/sub](https://github.com/dapr/components-contrib/tree/master/pubsub) and `DaprPublish`/`DaprTopicTrigger`
@@ -135,13 +136,14 @@ import logging
 import json
 import azure.functions as func


 def main(payload,
-         order: func.Out[bytes]) -> None:
+         order: func.Out[str]) -> None:
     logging.info(
-        'Python function processed a TransferEventBetweenTopics request from the Dapr Runtime.')
-    subEvent_json = json.loads(subEvent)
-    payload = "Transfer from Topic A: " + str(subEvent_json["data"])
-    pubEvent.set(json.dumps({"payload": payload}).encode('utf-8'))
+        'Python function processed a CreateNewOrder request from the Dapr Runtime.')
+    payload_json = json.loads(payload)
+    logging.info(payload_json["data"])
+    order.set(json.dumps({"value": payload_json["data"]}))
```

```json
@@ -220,7 +222,7 @@ In your terminal window, you should see logs indicating that the message was rec
== APP == [TIMESTAMP] Executed 'CreateNewOrder' (Succeeded, Id=<ExecutionId>)
```
----------------
-In order to confirm the state is now persisted.], you can move to the next function:
+In order to confirm the state is now persisted, you can move to the next function:

```python
def main(payload, data: str) -> None:
```
42 changes: 42 additions & 0 deletions samples/python-v2-azurefunctions/.gitignore
@@ -0,0 +1,42 @@
bin
obj
csx
.vs
edge
Publish

*.user
*.suo
*.cscfg
*.Cache
project.lock.json

/packages
/TestResults

/tools/NuGet.exe
/App_Data
/secrets
/data
.secrets
appsettings.json

node_modules
dist

# Local python packages
.python_packages/

# Python Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
11 changes: 11 additions & 0 deletions samples/python-v2-azurefunctions/Dockerfile
@@ -0,0 +1,11 @@
# To enable ssh & remote debugging on app service change the base image to the one below
# FROM mcr.microsoft.com/azure-functions/python:3.0-python3.7-appservice
FROM mcr.microsoft.com/azure-functions/python:3.0-python3.7

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY requirements.txt /
RUN pip install -r /requirements.txt

COPY . /home/site/wwwroot
364 changes: 364 additions & 0 deletions samples/python-v2-azurefunctions/README.md

Large diffs are not rendered by default.

11 changes: 11 additions & 0 deletions samples/python-v2-azurefunctions/consume_message_from_kafka.py
@@ -0,0 +1,11 @@
import logging
import azure.functions as func

consumeMessageFromKafka = func.DaprBlueprint()

# Dapr binding trigger
@consumeMessageFromKafka.function_name(name="ConsumeMessageFromKafka")
@consumeMessageFromKafka.dapr_binding_trigger(arg_name="triggerData", binding_name="%KafkaBindingName%")
def main(triggerData: str) -> None:
    logging.info('Python function processed a ConsumeMessageFromKafka request from the Dapr Runtime.')
    logging.info('Trigger data: ' + triggerData)
17 changes: 17 additions & 0 deletions samples/python-v2-azurefunctions/create_new_order.py
@@ -0,0 +1,17 @@
import logging
import azure.functions as func

createNewOrder = func.DaprBlueprint()

# Dapr state output binding with http dapr_service_invocation_trigger
@createNewOrder.function_name(name="CreateNewOrder")
@createNewOrder.dapr_service_invocation_trigger(arg_name="payload", method_name="CreateNewOrder")
@createNewOrder.dapr_state_output(arg_name="state", state_store="%StateStoreName%", key="order")
def main(payload: str, state: func.Out[str]):
    # request body must be passed this way: '{"value": {"key": "some value"}}'
    logging.info('Python function processed a CreateNewOrder request from the Dapr Runtime.')
    logging.info(payload)
    if payload is not None:
        state.set(payload)
    else:
        logging.info('payload is none')
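The comment above spells out the request-body shape the state output binding expects. As a rough sketch (the `build_order_body` helper below is a hypothetical illustration, not part of the sample), the payload can be built like this:

```python
import json

# Hypothetical helper: builds the request body CreateNewOrder expects.
# The state output binding persists whatever sits under the "value" property.
def build_order_body(data: dict) -> str:
    return json.dumps({"value": data})

body = build_order_body({"key": "some value"})
print(body)  # {"value": {"key": "some value"}}
```

Passing this string as `--data` to `dapr invoke` would then reach the function as `payload`.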
13 changes: 13 additions & 0 deletions samples/python-v2-azurefunctions/extensions.csproj
@@ -0,0 +1,13 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <WarningsAsErrors></WarningsAsErrors>
    <DefaultItemExcludes>**</DefaultItemExcludes>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Azure.WebJobs.Script.ExtensionsMetadataGenerator" Version="1.1.3" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\..\src\Microsoft.Azure.WebJobs.Extensions.Dapr\Microsoft.Azure.WebJobs.Extensions.Dapr.csproj" />
  </ItemGroup>
</Project>
20 changes: 20 additions & 0 deletions samples/python-v2-azurefunctions/function_app.py
@@ -0,0 +1,20 @@
import azure.functions as func

from create_new_order import createNewOrder
from consume_message_from_kafka import consumeMessageFromKafka
from invoke_output_binding import invokeOutputBinding
from print_topic_message import printTopicMessage
from retrieve_order import retrieveOrder
from retrieve_secret import retrieveSecret
from send_message_to_kafka import sendMessageToKafka
from transfer_event_between_topics import transferEventBetweenTopics

dapp = func.DaprFunctionApp()
dapp.register_blueprint(createNewOrder)
dapp.register_blueprint(consumeMessageFromKafka)
dapp.register_blueprint(invokeOutputBinding)
dapp.register_blueprint(printTopicMessage)
dapp.register_blueprint(retrieveOrder)
dapp.register_blueprint(retrieveSecret)
dapp.register_blueprint(sendMessageToKafka)
dapp.register_blueprint(transferEventBetweenTopics)
3 changes: 3 additions & 0 deletions samples/python-v2-azurefunctions/host.json
@@ -0,0 +1,3 @@
{
    "version": "2.0"
}
20 changes: 20 additions & 0 deletions samples/python-v2-azurefunctions/invoke_output_binding.py
@@ -0,0 +1,20 @@
import logging
import azure.functions as func

invokeOutputBinding = func.DaprBlueprint()

# Dapr invoke output binding with http trigger
@invokeOutputBinding.function_name(name="InvokeOutputBinding")
@invokeOutputBinding.route(route="invoke/{appId}/{methodName}", auth_level=func.AuthLevel.ANONYMOUS)
@invokeOutputBinding.dapr_invoke_output(arg_name="payload", app_id="{appId}", method_name="{methodName}", http_verb="post")
def main(req: func.HttpRequest, payload: func.Out[str]) -> str:
    # To use InvokeOutputBinding, the request body must be passed this way:
    # '{"body": {"value": {"key": "some value"}}}' -- all data must be enclosed in the "body" property.
    logging.info('Python function processed an InvokeOutputBinding request from the Dapr Runtime.')

    body = req.get_body()
    logging.info(body)
    if body is not None:
        payload.set(body)
    else:
        logging.info('req body is none')
    return 'ok'
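Per the comment above, everything the invoke output binding forwards must sit under a `body` property. A minimal sketch of that envelope (the `make_invoke_body` helper name is an illustration, not part of the sample):

```python
import json

# Hypothetical helper: wraps arbitrary data in the "body" envelope
# that the Dapr invoke output binding expects to receive.
def make_invoke_body(value: dict) -> str:
    return json.dumps({"body": {"value": value}})

print(make_invoke_body({"key": "some value"}))
# {"body": {"value": {"key": "some value"}}}
```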
11 changes: 11 additions & 0 deletions samples/python-v2-azurefunctions/local.settings.json
@@ -0,0 +1,11 @@
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "PubSubName": "messagebus", // should be same as metadata.name in components/messagebus.yaml
    "StateStoreName": "statestore", // should be same as metadata.name in components/statestore.yaml
    "KafkaBindingName": "sample-topic" // should be same as metadata.name in components/kafka_binding.yaml
  }
}
13 changes: 13 additions & 0 deletions samples/python-v2-azurefunctions/print_topic_message.py
@@ -0,0 +1,13 @@
import json
import logging
import azure.functions as func

printTopicMessage = func.DaprBlueprint()

# Dapr topic trigger
@printTopicMessage.function_name(name="PrintTopicMessage")
@printTopicMessage.dapr_topic_trigger(arg_name="subEvent", pub_sub_name="%PubSubName%", topic="B", route="B")
def main(subEvent) -> None:
    logging.info('Python function processed a PrintTopicMessage request from the Dapr Runtime.')
    subEvent_json = json.loads(subEvent)
    logging.info("Topic B received a message: " + subEvent_json["data"])
1 change: 1 addition & 0 deletions samples/python-v2-azurefunctions/requirements.txt
@@ -0,0 +1 @@
azure-functions
13 changes: 13 additions & 0 deletions samples/python-v2-azurefunctions/retrieve_order.py
@@ -0,0 +1,13 @@
import logging
import azure.functions as func

retrieveOrder = func.DaprBlueprint()

# Dapr state input binding with http dapr_service_invocation_trigger
@retrieveOrder.function_name(name="RetrieveOrder")
@retrieveOrder.dapr_service_invocation_trigger(arg_name="payload", method_name="RetrieveOrder")
@retrieveOrder.dapr_state_input(arg_name="data", state_store="%StateStoreName%", key="order")
def main(payload, data: str):
    # Function should be invoked with this command: dapr invoke --app-id functionapp --method RetrieveOrder --data '{}'
    logging.info('Python function processed a RetrieveOrder request from the Dapr Runtime.')
    logging.info(data)
18 changes: 18 additions & 0 deletions samples/python-v2-azurefunctions/retrieve_secret.py
@@ -0,0 +1,18 @@
import json
import logging
import azure.functions as func

retrieveSecret = func.DaprBlueprint()

# Dapr secret input binding with http dapr_service_invocation_trigger
@retrieveSecret.function_name(name="RetrieveSecret")
@retrieveSecret.dapr_service_invocation_trigger(arg_name="payload", method_name="RetrieveSecret")
@retrieveSecret.dapr_secret_input(arg_name="secret", secret_store_name="localsecretstore", key="my-secret", metadata="metadata.namespace=default")
def main(payload, secret: str):
    # Function should be invoked with this command: dapr invoke --app-id functionapp --method RetrieveSecret --data '{}'
    logging.info('Python function processed a RetrieveSecret request from the Dapr Runtime.')
    secret_dict = json.loads(secret)

    for key in secret_dict:
        logging.info("Stored secret: Key = " + key + ", Value = " + secret_dict[key])
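The secret arrives as a JSON string mapping keys to values, and the loop above logs each pair. The same parsing can be isolated as a standalone, stdlib-only sketch (the `format_secrets` helper is hypothetical, not part of the sample):

```python
import json

# Hypothetical helper: formats a Dapr secret payload (a JSON object of
# key/value pairs) the same way the RetrieveSecret function logs it.
def format_secrets(secret_json: str) -> list:
    secret_dict = json.loads(secret_json)
    return ["Stored secret: Key = " + k + ", Value = " + v
            for k, v in secret_dict.items()]

print(format_secrets('{"my-secret": "abc123"}'))
# ['Stored secret: Key = my-secret, Value = abc123']
```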
14 changes: 14 additions & 0 deletions samples/python-v2-azurefunctions/send_message_to_kafka.py
@@ -0,0 +1,14 @@
import json
import logging
import azure.functions as func

sendMessageToKafka = func.DaprBlueprint()

# Dapr binding output with http dapr_service_invocation_trigger
@sendMessageToKafka.function_name(name="SendMessageToKafka")
@sendMessageToKafka.dapr_service_invocation_trigger(arg_name="payload", method_name="SendMessageToKafka")
@sendMessageToKafka.dapr_binding_output(arg_name="messages", binding_name="%KafkaBindingName%", operation="create")
def main(payload: str, messages: func.Out[bytes]) -> None:
    logging.info('Python processed a SendMessageToKafka request from the Dapr Runtime.')
    messages.set(json.dumps({"data": payload}).encode('utf-8'))
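The function wraps the invocation payload in a `{"data": ...}` envelope and UTF-8 encodes it before handing it to the Kafka binding. Isolated as a sketch (the helper name is hypothetical):

```python
import json

# Mirrors the envelope built in SendMessageToKafka: the payload is
# wrapped under "data" and UTF-8 encoded for the binding output.
def wrap_kafka_message(payload: str) -> bytes:
    return json.dumps({"data": payload}).encode('utf-8')

print(wrap_kafka_message("hello"))  # b'{"data": "hello"}'
```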
16 changes: 16 additions & 0 deletions samples/python-v2-azurefunctions/transfer_event_between_topics.py
@@ -0,0 +1,16 @@
import json
import logging
import azure.functions as func

transferEventBetweenTopics = func.DaprBlueprint()

# Dapr topic trigger with dapr_publish_output
@transferEventBetweenTopics.function_name(name="TransferEventBetweenTopics")
@transferEventBetweenTopics.dapr_topic_trigger(arg_name="subEvent", pub_sub_name="%PubSubName%", topic="A", route="A")
@transferEventBetweenTopics.dapr_publish_output(arg_name="pubEvent", pub_sub_name="%PubSubName%", topic="B")
def main(subEvent, pubEvent: func.Out[bytes]) -> None:
    logging.info('Python function processed a TransferEventBetweenTopics request from the Dapr Runtime.')
    subEvent_json = json.loads(subEvent)
    payload = "Transfer from Topic A: " + str(subEvent_json["data"])
    pubEvent.set(json.dumps({"payload": payload}).encode('utf-8'))
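The Topic A to Topic B transformation — parse the incoming event, prefix its data, re-encode for publishing — can be pulled out as a pure function (a hypothetical refactor for illustration, not part of the sample):

```python
import json

# Pure version of the transformation in TransferEventBetweenTopics:
# parse the subscription event, prefix its "data", re-encode for publish.
def transform_event(sub_event: str) -> bytes:
    sub_event_json = json.loads(sub_event)
    payload = "Transfer from Topic A: " + str(sub_event_json["data"])
    return json.dumps({"payload": payload}).encode('utf-8')

print(transform_event('{"data": "order-1"}'))
# b'{"payload": "Transfer from Topic A: order-1"}'
```

Keeping the transformation pure like this makes it unit-testable without a Dapr sidecar or the Functions host.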
