
Event-Driven-Sales-DataPipeline


Project Overview

This project demonstrates an event-driven sales data pipeline that processes order records based on their status and routes them to DynamoDB or SQS according to business validation rules. The system is built with AWS services: S3, EventBridge, Step Functions, Lambda, DynamoDB, and SQS.


Architectural Diagram

(Architecture diagram: Event-Driven-SalesDataPipeline)


Key Steps

1. Create a State Machine

  • We will create a state machine "Sales-Data-StateMachine" using the Step Functions service; it orchestrates the complete data-processing workflow. (screenshot: stateMachine)
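  A minimal boto3 sketch of creating the state machine is shown below. The IAM role name and the placeholder Pass-state definition are assumptions; the real workflow is assembled in the later steps.

  ```python
  import json
  import boto3

  sfn = boto3.client("stepfunctions")

  # Placeholder definition; replaced once the full workflow is built.
  definition = {"StartAt": "Start", "States": {"Start": {"Type": "Pass", "End": True}}}

  response = sfn.create_state_machine(
      name="Sales-Data-StateMachine",
      definition=json.dumps(definition),
      roleArn="arn:aws:iam::<account-id>:role/StepFunctionsSalesDataRole",  # assumed role name
      type="STANDARD",
  )
  print(response["stateMachineArn"])
  ```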

2. Create an S3 Bucket

  • We will create an S3 bucket "orders-data-input-yb" to receive the orders-data JSON file uploads. (screenshot: S3)
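  Creating the bucket with boto3 could look like the sketch below; the region is an assumption (buckets outside us-east-1 need a LocationConstraint).

  ```python
  import boto3

  s3 = boto3.client("s3", region_name="ap-south-1")  # assumed region

  # Input bucket for the orders JSON files.
  s3.create_bucket(
      Bucket="orders-data-input-yb",
      CreateBucketConfiguration={"LocationConstraint": "ap-south-1"},
  )
  ```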

3. Create an EventBridge Rule

  • We will create an EventBridge rule "s3-orders-json-data" that triggers the state machine whenever a new file is uploaded to the S3 bucket. (screenshot: EventRule)

  • Note: We must enable notifications from the S3 bucket to EventBridge, otherwise the upload events never reach the rule. (screenshot: S3EventNoti)
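  A sketch of enabling the S3-to-EventBridge notifications and wiring the rule to the state machine follows; the state machine ARN placeholders and the EventBridge invocation role name are assumptions.

  ```python
  import json
  import boto3

  s3 = boto3.client("s3")
  events = boto3.client("events")

  # 1. Tell the bucket to send object-level events to EventBridge.
  s3.put_bucket_notification_configuration(
      Bucket="orders-data-input-yb",
      NotificationConfiguration={"EventBridgeConfiguration": {}},
  )

  # 2. Rule that matches "Object Created" events from the input bucket.
  events.put_rule(
      Name="s3-orders-json-data",
      State="ENABLED",
      EventPattern=json.dumps({
          "source": ["aws.s3"],
          "detail-type": ["Object Created"],
          "detail": {"bucket": {"name": ["orders-data-input-yb"]}},
      }),
  )

  # 3. Target the state machine; the role must allow states:StartExecution.
  events.put_targets(
      Rule="s3-orders-json-data",
      Targets=[{
          "Id": "sales-data-state-machine",
          "Arn": "arn:aws:states:<region>:<account-id>:stateMachine:Sales-Data-StateMachine",
          "RoleArn": "arn:aws:iam::<account-id>:role/EventBridgeInvokeStepFunctionsRole",  # assumed role name
      }],
  )
  ```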

4. Create a Lambda Function

  • We will create a simple Lambda function "order-json-data-process" to validate the interim data in the state machine. It returns a successful output if the input record contains 'contact-info'; otherwise it raises an exception (see the sketch after this step). (screenshot: validationLambda)

  • Current state machine workflow: (screenshot: stateMachine2)
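  A minimal sketch of the validation handler, assuming the state machine passes a single order record as the Lambda payload and that the field is literally named 'contact-info':

  ```python
  def lambda_handler(event, context):
      """Validate one order record coming from the state machine."""
      # Assumed payload shape: the order record itself is the event.
      if event.get("contact-info"):
          # Valid record: pass it through for the DynamoDB step.
          return {"status": "VALID", "order": event}

      # Missing contact info: raising an error makes the state machine's
      # Catch branch route the input to the SQS queue.
      raise ValueError("contact-info missing from order record")
  ```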

5. Create a DynamoDB Table

  • We will create a DynamoDB table "valid-orders-data" to store the valid orders processed by the Lambda function. (screenshot: DynamoTable)
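  Creating the table with boto3 might look like this; the partition key name "orderId" and on-demand billing are assumptions:

  ```python
  import boto3

  dynamodb = boto3.client("dynamodb")

  dynamodb.create_table(
      TableName="valid-orders-data",
      AttributeDefinitions=[{"AttributeName": "orderId", "AttributeType": "S"}],  # assumed key
      KeySchema=[{"AttributeName": "orderId", "KeyType": "HASH"}],
      BillingMode="PAY_PER_REQUEST",
  )
  ```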

6. Create an SQS Queue

  • We will create an SQS queue "order-data-dlq" to hold failed data messages so they can be reprocessed later if required. (screenshot: dlq)
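  A one-call sketch for creating the queue:

  ```python
  import boto3

  sqs = boto3.client("sqs")

  # Queue that receives order records which failed validation.
  response = sqs.create_queue(QueueName="order-data-dlq")
  print(response["QueueUrl"])
  ```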

7. Complete the State Machine Flow

  • We will complete the state machine workflow by adding the DynamoDB and SQS steps. (screenshot: stateMachine_final)
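  A sketch of what the completed Amazon States Language definition could look like, expressed as a Python dict. The state names, the "orderId" item attribute, and the queue URL placeholders are assumptions, and the sketch does not show how the uploaded file's contents reach the Lambda payload.

  ```python
  import json
  import boto3

  definition = {
      "StartAt": "ValidateOrder",
      "States": {
          "ValidateOrder": {
              "Type": "Task",
              "Resource": "arn:aws:states:::lambda:invoke",
              "Parameters": {"FunctionName": "order-json-data-process", "Payload.$": "$"},
              "ResultPath": "$.validation",
              # On any error, route the failed input to the SQS queue.
              "Catch": [{
                  "ErrorEquals": ["States.ALL"],
                  "ResultPath": "$.error",
                  "Next": "SendToSQS",
              }],
              "Next": "StoreInDynamoDB",
          },
          "StoreInDynamoDB": {
              "Type": "Task",
              "Resource": "arn:aws:states:::dynamodb:putItem",
              "Parameters": {
                  "TableName": "valid-orders-data",
                  # Assumed item shape: only the order id is shown here.
                  "Item": {"orderId": {"S.$": "$.orderId"}},
              },
              "End": True,
          },
          "SendToSQS": {
              "Type": "Task",
              "Resource": "arn:aws:states:::sqs:sendMessage",
              "Parameters": {
                  "QueueUrl": "https://sqs.<region>.amazonaws.com/<account-id>/order-data-dlq",
                  "MessageBody.$": "$",
              },
              "End": True,
          },
      },
  }

  boto3.client("stepfunctions").update_state_machine(
      stateMachineArn="arn:aws:states:<region>:<account-id>:stateMachine:Sales-Data-StateMachine",
      definition=json.dumps(definition),
  )
  ```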

8. Final Execution

  • We will upload the 'order-data.json' file to the S3 bucket, which triggers the state machine workflow.

  • The file contains 2 records so that both cases, with and without contact info, are exercised (a sample upload sketch follows this step).

  • Success

    • The record that contains 'contact-info' is ingested into the DynamoDB table as a valid record. (screenshot: dynamoTrigger)

    • We can see the record inserted into the DynamoDB table. (screenshot: dynamoData)

  • Failure

    • The record without 'contact-info' is caught as an exception and its output is sent to SQS for review. (screenshot: sqsTrigger)

    • We can see the failed record in the SQS queue. (screenshot: sqsData)
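  A sketch of how such a test upload could be produced; the record contents are hypothetical and only illustrate the two validation outcomes.

  ```python
  import json
  import boto3

  s3 = boto3.client("s3")

  # Hypothetical test records: the first has contact info (valid path),
  # the second does not (failure path, routed to SQS).
  orders = [
      {"orderId": "ORD-1001", "status": "NEW", "contact-info": {"email": "buyer@example.com"}},
      {"orderId": "ORD-1002", "status": "NEW"},
  ]

  s3.put_object(
      Bucket="orders-data-input-yb",
      Key="order-data.json",
      Body=json.dumps(orders).encode("utf-8"),
  )
  ```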
