{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":584691077,"defaultBranch":"main","name":"obsrv-core","ownerLogin":"Sunbird-Obsrv","currentUserCanPush":false,"isFork":false,"isEmpty":false,"createdAt":"2023-01-03T09:10:23.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/98769945?v=4","public":true,"private":false,"isOrgOwned":true},"refInfo":{"name":"","listCacheKey":"v0:1725347297.0","currentOid":""},"activityList":{"items":[{"before":"c8b7d48ba98a715c8b3e8ae28129d9af71433885","after":null,"ref":"refs/heads/refactoring-v2","pushedAt":"2024-09-03T06:59:16.000Z","pushType":"branch_deletion","commitsCount":0,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"}},{"before":null,"after":"c8b7d48ba98a715c8b3e8ae28129d9af71433885","ref":"refs/heads/main","pushedAt":"2024-09-03T06:59:15.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"Obsrv V2 Release (#19)\n\n* #OBS-I182: Fix the issue with cache indexer\r\n\r\n* #OBS-I182: Cache Indexer fix |Removed the kafka-client and casting the number to long value\r\n\r\n* #OBS-I182: Cache Indexer fix | Removing the obsrv_meta information before indexing into cache\r\n\r\n---------\r\n\r\nCo-authored-by: Santhosh Vasabhaktula ","shortMessageHtmlLink":"Obsrv V2 Release (#19)"}},{"before":"8d9afa41caade828328e1e01d3a16df443d16d36","after":null,"ref":"refs/heads/main","pushedAt":"2024-09-03T06:59:01.000Z","pushType":"branch_deletion","commitsCount":0,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"}},{"before":null,"after":"8d9afa41caade828328e1e01d3a16df443d16d36","ref":"refs/heads/main-old","pushedAt":"2024-09-03T06:59:00.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"#000: Git conflict resolve - Refactor V2 Changes","shortMessageHtmlLink":"#000: Git conflict resolve - Refactor V2 Changes"}},{"before":"67fd31027e392808f1b5b1879239e2b557333ca5","after":"8d9afa41caade828328e1e01d3a16df443d16d36","ref":"refs/heads/main","pushedAt":"2024-09-03T05:50:34.000Z","pushType":"push","commitsCount":48,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"#000: Git conflict resolve - Refactor V2 Changes","shortMessageHtmlLink":"#000: Git conflict resolve - Refactor V2 Changes"}},{"before":"49043d71769e1f761b0d905d9132b0a36aec16ac","after":"c8b7d48ba98a715c8b3e8ae28129d9af71433885","ref":"refs/heads/refactoring-v2","pushedAt":"2024-09-03T05:29:27.000Z","pushType":"pr_merge","commitsCount":1,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"Obsrv V2 Release (#19)\n\n* #OBS-I182: Fix the issue with cache indexer\r\n\r\n* #OBS-I182: Cache Indexer fix |Removed the kafka-client and casting the number to long value\r\n\r\n* #OBS-I182: Cache Indexer fix | Removing the obsrv_meta information before indexing into cache\r\n\r\n---------\r\n\r\nCo-authored-by: Santhosh Vasabhaktula ","shortMessageHtmlLink":"Obsrv V2 Release (#19)"}},{"before":"fae940ed41fb0094b5d5307606d7bb5d0e38e342","after":"67fd31027e392808f1b5b1879239e2b557333ca5","ref":"refs/heads/main","pushedAt":"2024-08-20T07:19:40.000Z","pushType":"pr_merge","commitsCount":1,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":" 1.0.6.1-GA (#18)\n\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* feat: added descriptions for default configurations\r\n\r\n* feat: added descriptions for default configurations\r\n\r\n* feat: modified kafka connector input topic\r\n\r\n* feat: obsrv setup instructions\r\n\r\n* feat: revisiting open source features\r\n\r\n* feat: masterdata processor job config\r\n\r\n* Build deploy v2 (#19)\r\n\r\n* #0 - Refactor Dockerfile and Github actions workflow\r\n---------\r\n\r\nCo-authored-by: Santhosh Vasabhaktula \r\nCo-authored-by: ManojCKrishna \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #2 feat: Remove kafka connector code\r\n\r\n* feat: add function to get all datasets\r\n\r\n* Release 1.3.1 into Main (#43)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Release 1.3.1 Changes (#42)\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n---------\r\n\r\nCo-authored-by: Santhosh \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* Main conflicts fixes (#44)\r\n\r\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* Update DatasetModels.scala\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #2 feat: Remove kafka connector code\r\n\r\n* feat: add function to get all datasets\r\n\r\n* #000:feat: Resolve conflicts\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n* update workflow file to skip tests (#45)\r\n\r\n* Release 1.3.1 into Main (#49)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Release 1.3.1 Changes (#42)\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n---------\r\n\r\nCo-authored-by: Santhosh \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* Main conflicts fixes (#44)\r\n\r\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* Update DatasetModels.scala\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #2 feat: Remove kafka connector code\r\n\r\n* feat: add function to get all datasets\r\n\r\n* #000:feat: Resolve conflicts\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n* #0000 - fix: Fix null dataset_type in DruidRouterFunction (#48)\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n* Develop to Release-1.0.0-GA (#52) (#53)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #67 feat: query system configurations from meta store\r\n\r\n* #67 fix: Refactor system configuration retrieval and update dynamic router function\r\n\r\n* #67 fix: update system config according to review\r\n\r\n* #67 fix: update test cases for system config\r\n\r\n* #67 fix: update default values in test cases\r\n\r\n* #67 fix: add get all system settings method and update test cases\r\n\r\n* #67 fix: add test case for covering exception case\r\n\r\n* #67 fix: fix data types in test cases\r\n\r\n* #67 fix: Refactor event indexing in DynamicRouterFunction\r\n\r\n* Issue #67 refactor: SystemConfig read from DB implementation\r\n\r\n* #226 fix: update test cases according to the refactor\r\n\r\n---------\r\n\r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* Develop to 1.0.1-GA (#59) (#60)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #67 feat: query system configurations from meta store\r\n\r\n* #67 fix: Refactor system configuration retrieval and update dynamic router function\r\n\r\n* #67 fix: update system config according to review\r\n\r\n* #67 fix: update test cases for system config\r\n\r\n* #67 fix: update default values in test cases\r\n\r\n* #67 fix: add get all system settings method and update test cases\r\n\r\n* #67 fix: add test case for covering exception case\r\n\r\n* #67 fix: fix data types in test cases\r\n\r\n* #67 fix: Refactor event indexing in DynamicRouterFunction\r\n\r\n* Issue #67 refactor: SystemConfig read from DB implementation\r\n\r\n* #226 fix: update test cases according to the refactor\r\n\r\n* Dataset Registry Update (#57)\r\n\r\n* Issue #0000: feat: updateConnectorStats method includes last run timestamp\r\n\r\n* Issue #0000: fix: updateConnectorStats sql query updated\r\n\r\n* Issue #0000: fix: updateConnectorStats sql query updated\r\n\r\n---------\r\n\r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Shreyas Bhaktharam <121869503+shreyasb22@users.noreply.github.com>\r\n\r\n* Develop to 1.0.2-GA (#65) (#66)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #67 feat: query system configurations from meta store\r\n\r\n* #67 fix: Refactor system configuration retrieval and update dynamic router function\r\n\r\n* #67 fix: update system config according to review\r\n\r\n* #67 fix: update test cases for system config\r\n\r\n* #67 fix: update default values in test cases\r\n\r\n* #67 fix: add get all system settings method and update test cases\r\n\r\n* #67 fix: add test case for covering exception case\r\n\r\n* #67 fix: fix data types in test cases\r\n\r\n* #67 fix: Refactor event indexing in DynamicRouterFunction\r\n\r\n* Issue #67 refactor: SystemConfig read from DB implementation\r\n\r\n* #226 fix: update test cases according to the refactor\r\n\r\n* Dataset Registry Update (#57)\r\n\r\n* Issue #0000: feat: updateConnectorStats method includes last run timestamp\r\n\r\n* Issue #0000: fix: updateConnectorStats sql query updated\r\n\r\n* Issue #0000: fix: updateConnectorStats sql query updated\r\n\r\n* #0000 - fix: Fix Postgres connection issue with defaultDatasetID (#64)\r\n\r\n* Metrics implementation for MasterDataIndexerJob (#55)\r\n\r\n* Issue #50 fix: Kafka Metrics implementation for MasterDataIndexerJob\r\n\r\n* Issue #50 fix: Changed 'ets' to UTC\r\n\r\n* Issue #50 feat: added log statements\r\n\r\n* Issue #50 fix: FIxed issue related to update query\r\n\r\n* Issue #50 fix: Code refactoring\r\n\r\n* Issue #50 fix: updated implementation of 'createDataFile' method\r\n\r\n* Issue #50 fix: code refactorig\r\n\r\n* Issue #50 test: Test cases for MasterDataIndexer\r\n\r\n* Issue #50 test: test cases implementation\r\n\r\n* Issue #50 test: Test case implementation for data-products\r\n\r\n* Issue #50 test: Test cases\r\n\r\n* Issue #50 test: test cases\r\n\r\n* Issue #50 test: test cases for data-products\r\n\r\n* Issue #50-fix: fixed jackson-databind issue\r\n\r\n* Isuue-#50-fix: code structure modifications\r\n\r\n* Issue #50-fix: code refactoring\r\n\r\n* Issue #50-fix: code refactoing\r\n\r\n* Issue-#50-Fix: test case fixes\r\n\r\n* Issue #50-fix: code formatting and code fixes\r\n\r\n* feat #50 - refactor the implementation\r\n\r\n* Issue-#50-fix: test cases fix\r\n\r\n\r\n\r\n* modified README file\r\n\r\n\r\n\r\n* revert readme file changes\r\n\r\n\r\n\r\n* revert dataset-registry\r\n\r\n\r\n\r\n* Issue-#50-fix: test cases fix\r\n\r\n\r\n\r\n* Issue-#50-fix: adding missing tests\r\n\r\n\r\n\r\n* Issue-#50-fix: refatoring code\r\n\r\n\r\n\r\n* Issue-#50-fix: code fixes and code formatting\r\n\r\n\r\n\r\n* fix #50: modified class declaration\r\n\r\n\r\n\r\n* fix #50: code refactor\r\n\r\n\r\n\r\n* fix #50: code refactor\r\n\r\n\r\n\r\n* fix #50: test cases fixes\r\n\r\n\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n* Remove kafka connector as it is moved to a independent repository\r\n\r\n---------\r\n\r\nSigned-off-by: SurabhiAngadi \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Shreyas Bhaktharam <121869503+shreyasb22@users.noreply.github.com>\r\nCo-authored-by: SurabhiAngadi <138881390+SurabhiAngadi@users.noreply.github.com>\r\n\r\n* Release 1.0.3-GA (#72)\r\n\r\n* Pipeline Bug fixes (#74)\r\n\r\n* Sanketika-Obsrv/issue-tracker#106:fix: Fix postgres connection issue with dataset read and handling an errors while parsing the message\r\n\r\n* Sanketika-Obsrv/issue-tracker#107:fix: Denorm job fix to handle error when denorm field node is contains empty value\r\n\r\n* Sanketika-Obsrv/issue-tracker#106:fix: Review comments fix - Changed the generic exception to actual exception (NullPointer)\r\n\r\n* Pipeline Bug fixes (#74) (#77)\r\n\r\n* fix: #0000: update datasourceRef only if dataset has records\r\n\r\n* Sanketika-Obsrv/issue-tracker#180 fix: Datasource DB schema changes to include type. (#79)\r\n\r\nCo-authored-by: sowmya-dixit \r\n\r\n* Hudi connector flink job implementation (#80)\r\n\r\n* feat: Hudi Flink Implementation.\r\n* feat: local working with metastore and localstack.\r\n* #0000 - feat: Hudi Sink implementation\r\n* #0000 - feat: Hudi Sink implementation\r\n* #0000 - feat: Initialize dataset RowType during job startup\r\n* refactor: Integrate hudi connector with dataset registry.\r\n* refactor: Integrate hudi connector with dataset registry.\r\n* Sanketika-Obsrv/issue-tracker#141 refactor: Enable timestamp based partition\r\n* Sanketika-Obsrv/issue-tracker#141 refactor: Fix Hudi connector job to handle empty datasets list for lakehouse.\r\n* Sanketika-Obsrv/issue-tracker#141 fix: Set Timestamp based partition configurations only if partition key is of timestamp type.\r\n* Sanketika-Obsrv/issue-tracker#170 fix: Resolve timestamp based partition without using TimestampBasedAvroKeyGenerator.\r\n* Sanketika-Obsrv/issue-tracker#177 fix: Lakehouse connector flink job fixes.\r\n* Sanketika-Obsrv/issue-tracker#177 fix: Dockerfile changes for hudi-connector\r\n* Sanketika-Obsrv/issue-tracker#177 fix: Lakehouse connector flink job fixes.\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove unused code\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove unused code\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove unused code\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove commented code\r\n\r\n* Release 1.0.6-GA (#81)\r\n\r\n* Pipeline Bug fixes (#74)\r\n\r\n* Sanketika-Obsrv/issue-tracker#106:fix: Fix postgres connection issue with dataset read and handling an errors while parsing the message\r\n\r\n* Sanketika-Obsrv/issue-tracker#107:fix: Denorm job fix to handle error when denorm field node is contains empty value\r\n\r\n* Sanketika-Obsrv/issue-tracker#106:fix: Review comments fix - Changed the generic exception to actual exception (NullPointer)\r\n\r\n* fix: #0000: update datasourceRef only if dataset has records\r\n\r\n* Sanketika-Obsrv/issue-tracker#180 fix: Datasource DB schema changes to include type. (#79)\r\n\r\nCo-authored-by: sowmya-dixit \r\n\r\n* Hudi connector flink job implementation (#80)\r\n\r\n* feat: Hudi Flink Implementation.\r\n* feat: local working with metastore and localstack.\r\n* #0000 - feat: Hudi Sink implementation\r\n* #0000 - feat: Hudi Sink implementation\r\n* #0000 - feat: Initialize dataset RowType during job startup\r\n* refactor: Integrate hudi connector with dataset registry.\r\n* refactor: Integrate hudi connector with dataset registry.\r\n* Sanketika-Obsrv/issue-tracker#141 refactor: Enable timestamp based partition\r\n* Sanketika-Obsrv/issue-tracker#141 refactor: Fix Hudi connector job to handle empty datasets list for lakehouse.\r\n* Sanketika-Obsrv/issue-tracker#141 fix: Set Timestamp based partition configurations only if partition key is of timestamp type.\r\n* Sanketika-Obsrv/issue-tracker#170 fix: Resolve timestamp based partition without using TimestampBasedAvroKeyGenerator.\r\n* Sanketika-Obsrv/issue-tracker#177 fix: Lakehouse connector flink job fixes.\r\n* Sanketika-Obsrv/issue-tracker#177 fix: Dockerfile changes for hudi-connector\r\n* Sanketika-Obsrv/issue-tracker#177 fix: Lakehouse connector flink job fixes.\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove unused code\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove unused code\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove unused code\r\n* Sanketika-Obsrv/issue-tracker#177 fix: remove commented code\r\n\r\n---------\r\n\r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: SurabhiAngadi \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: sowmya-dixit \r\n\r\n* Sanketika-obsrv/issue-tracker#228 feat: updated github actions for lakehouse job\r\n\r\n* # Issue:52655194 Feat - Add processingStartTime if missing during extractor job (#76)\r\n\r\nCo-authored-by: Santhosh Vasabhaktula \r\n\r\n* Sanketika-obsrv/issue-tracker#240 feat: lakehouse job changes to support retire workflow\r\n\r\n* Sanketika-obsrv/issue-tracker#240 feat: master data enhancements for lakehouse\r\n\r\n* Migrating Raw SQL Statements to Prepared Statements (#87) (#88)\r\n\r\n* #OBS-I148: Migration of SQL queries to prepared statement to avoid the SQL injection\r\n\r\n* #OBS-I148: Migration of SQL queries to prepared statement to avoid the SQL injection\r\n\r\n* #OBS-I148: Removed the unwanted imports\r\n\r\n* #OBS-I148: System Config Changes - Converted from raw query to prepared statements\r\n\r\nCo-authored-by: Ravi Mula \r\n\r\n---------\r\n\r\nSigned-off-by: SurabhiAngadi \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: GayathriSrividya \r\nCo-authored-by: Manoj Krishna <92361832+ManojKrishnaChintauri@users.noreply.github.com>\r\nCo-authored-by: Santhosh Vasabhaktula \r\nCo-authored-by: ManojCKrishna \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\nCo-authored-by: Manoj Krishna <92361832+ManojKrishnaChintaluri@users.noreply.github.com>\r\nCo-authored-by: Shreyas Bhaktharam <121869503+shreyasb22@users.noreply.github.com>\r\nCo-authored-by: SurabhiAngadi <138881390+SurabhiAngadi@users.noreply.github.com>\r\nCo-authored-by: SurabhiAngadi \r\nCo-authored-by: sowmya-dixit \r\nCo-authored-by: GayathriSrividya \r\nCo-authored-by: GayathriSrividya ","shortMessageHtmlLink":" 1.0.6.1-GA (#18)"}},{"before":null,"after":"49043d71769e1f761b0d905d9132b0a36aec16ac","ref":"refs/heads/refactoring-v2","pushedAt":"2024-08-09T07:58:41.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"anandp504","name":"Anand Parthasarathy","path":"/anandp504","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/5467020?s=80&v=4"},"commit":{"message":"feat: v2 Refactoring. Following are the changes done:\n1. removed the separation logic of master-dataset and dataset\n2. Merged both pipelines into one.\n3. Created cache indexer to index data into redis for master datasets similar to Hudi\n4. Upgraded the dataset config to the newer version\n5. Move the entry_topic as a separate field. This is to enable creation of multiple pipelines in the future","shortMessageHtmlLink":"feat: v2 Refactoring. Following are the changes done:"}},{"before":"fae940ed41fb0094b5d5307606d7bb5d0e38e342","after":null,"ref":"refs/heads/refactoring-v2","pushedAt":"2024-08-09T07:54:49.000Z","pushType":"branch_deletion","commitsCount":0,"pusher":{"login":"ravismula","name":"Ravi Mula","path":"/ravismula","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/7008102?s=80&v=4"}},{"before":null,"after":"fae940ed41fb0094b5d5307606d7bb5d0e38e342","ref":"refs/heads/refactoring-v2","pushedAt":"2024-08-09T07:53:19.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"ravismula","name":"Ravi Mula","path":"/ravismula","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/7008102?s=80&v=4"},"commit":{"message":"Sunbird opensource release 2.0.1 GA (#15)","shortMessageHtmlLink":"Sunbird opensource release 2.0.1 GA (#15)"}},{"before":"fae940ed41fb0094b5d5307606d7bb5d0e38e342","after":null,"ref":"refs/heads/v2","pushedAt":"2024-08-09T07:45:08.000Z","pushType":"branch_deletion","commitsCount":0,"pusher":{"login":"ravismula","name":"Ravi Mula","path":"/ravismula","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/7008102?s=80&v=4"}},{"before":null,"after":"fae940ed41fb0094b5d5307606d7bb5d0e38e342","ref":"refs/heads/v2","pushedAt":"2024-08-09T07:30:34.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"ravismula","name":"Ravi Mula","path":"/ravismula","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/7008102?s=80&v=4"},"commit":{"message":"Sunbird opensource release 2.0.1 GA (#15)","shortMessageHtmlLink":"Sunbird opensource release 2.0.1 GA (#15)"}},{"before":"620a3fc0f84537b83b10b094051f3acdabfb462b","after":"fae940ed41fb0094b5d5307606d7bb5d0e38e342","ref":"refs/heads/main","pushedAt":"2024-03-19T06:57:32.000Z","pushType":"pr_merge","commitsCount":1,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"Sunbird opensource release 2.0.1 GA (#15)","shortMessageHtmlLink":"Sunbird opensource release 2.0.1 GA (#15)"}},{"before":"454b0e112e8f81cc3a71230ebfa1798fec6809d1","after":"620a3fc0f84537b83b10b094051f3acdabfb462b","ref":"refs/heads/main","pushedAt":"2023-12-27T06:06:33.000Z","pushType":"pr_merge","commitsCount":1,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"Sunbird Obsrv opensource release 2.0.0-GA (#13)\n\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* feat: added descriptions for default configurations\r\n\r\n* feat: added descriptions for default configurations\r\n\r\n* feat: modified kafka connector input topic\r\n\r\n* feat: obsrv setup instructions\r\n\r\n* feat: revisiting open source features\r\n\r\n* feat: masterdata processor job config\r\n\r\n* Build deploy v2 (#19)\r\n\r\n* #0 - Refactor Dockerfile and Github actions workflow\r\n---------\r\n\r\nCo-authored-by: Santhosh Vasabhaktula \r\nCo-authored-by: ManojCKrishna \r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #2 feat: Remove kafka connector code\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* feat: add function to get all datasets\r\n\r\n* Release 1.3.1 Changes (#42)\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n---------\r\n\r\nCo-authored-by: Santhosh \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* Main conflicts fixes (#44)\r\n\r\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* Update DatasetModels.scala\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #2 feat: Remove kafka connector code\r\n\r\n* feat: add function to get all datasets\r\n\r\n* #000:feat: Resolve conflicts\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n* Release 1.3.1 into Main (#43)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Release 1.3.1 Changes (#42)\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n---------\r\n\r\nCo-authored-by: Santhosh \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* Main conflicts fixes (#44)\r\n\r\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* Update DatasetModels.scala\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #2 feat: Remove kafka connector code\r\n\r\n* feat: add function to get all datasets\r\n\r\n* #000:feat: Resolve conflicts\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n* update workflow file to skip tests (#45)\r\n\r\n* Release 1.3.1 into Main (#49)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Release 1.3.1 Changes (#42)\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n---------\r\n\r\nCo-authored-by: Santhosh \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* Main conflicts fixes (#44)\r\n\r\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* Update DatasetModels.scala\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Issue #2 feat: Remove kafka connector code\r\n\r\n* feat: add function to get all datasets\r\n\r\n* #000:feat: Resolve conflicts\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n* #0000 - fix: Fix null dataset_type in DruidRouterFunction (#48)\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\n\r\n* Develop to Release-1.0.0-GA (#52) (#53)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* feat: update all failed, invalid and duplicate topic names\r\n\r\n* feat: update kafka topic names in test cases\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* feat: update failed event\r\n\r\n* Update ErrorConstants.scala\r\n\r\n* feat: update failed event\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* feat: add exception handling for json deserialization\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* Update BaseProcessFunction.scala\r\n\r\n* feat: update batch failed event generation\r\n\r\n* Update ExtractionFunction.scala\r\n\r\n* feat: update invalid json exception handling\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* Issue #46 fix: remove cloning object\r\n\r\n* Issue #46 feat: update batch failed event\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n* Issue #46 feat: add error reasons\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Issue #46 feat: add exception stack trace\r\n\r\n* Dataset enhancements (#38)\r\n\r\n* feat: add connector config and connector stats update functions\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n* Update DatasetModels.scala\r\n* #0 fix: upgrade packages\r\n* #0 feat: add flink dockerfiles\r\n* #0 fix: add individual extraction\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n* #0000 [SV] - Fallback to local redis instance if embedded redis is not starting\r\n\r\n* Update DatasetModels.scala\r\n\r\n* #0000 - refactor the denormalization logic\r\n1. Do not fail the denormalization if the denorm key is missing\r\n2. Add clear message whether the denorm is sucessful or failed or partially successful\r\n3. Handle denorm for both text and number fields\r\n\r\n* #0000 - refactor:\r\n1. Created a enum for dataset status and ignore events if the dataset is not in Live status\r\n2. Created a outputtag for denorm failed stats\r\n3. Parse event validation failed messages into a case class\r\n\r\n* #0000 - refactor:\r\n1. Updated the DruidRouter job to publish data to router topics dynamically\r\n2. Updated framework to created dynamicKafkaSink object\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Made calls to getAllDatasets and getAllDatasetSources to always query postgres\r\n2. Created BaseDatasetProcessFunction for all flink functions to extend that would dynamically resolve dataset config, initialize metrics and handle common failures\r\n3. Refactored serde - merged map and string serialization into one function and parameterized the function\r\n4. Moved failed events sinking into a common base class\r\n5. Master dataset processor can now do denormalization with another master dataset as well\r\n\r\n* #0000 - mega refactoring:\r\n1. Added validation to check if the event has a timestamp key and it is not blank nor invalid\r\n2. Added timezone handling to store the data in druid in the TZ specified by the dataset\r\n\r\n\r\n* #0000 - minor refactoring: Updated DatasetRegistry.getDatasetSourceConfig to getAllDatasetSourceConfig\r\n\r\n* #0000 - mega refactoring: Refactored logs, error messages and metrics\r\n\r\n* #0000 - mega refactoring: Fix unit tests\r\n\r\n* #0000 - refactoring:\r\n1. Introduced transformation mode to enable lenient transformations\r\n2. Proper exception handling for transformer job\r\n\r\n* #0000 - refactoring: Fix test cases and code\r\n\r\n* #0000 - refactoring: upgrade embedded redis to work with macos sonoma m2\r\n\r\n* #0000 - refactoring: Denormalizer test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Router test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Validator test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: Framework test cases and bug fixes\r\n\r\n* #0000 - refactoring: kafka connector test cases and bug fixes. Code coverage is 100% now\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs\r\n\r\n* #0000 - refactoring: improve code coverage and fix bugs --- Now the code coverage is 100%\r\n\r\n* #0000 - refactoring: organize imports\r\n\r\n* #0000 - refactoring:\r\n1. transformer test cases and bug fixes - code coverage is 100%\r\n\r\n* #0000 - refactoring: test cases and bug fixes\r\n\r\n---------\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n* #000:feat: Removed the provided scope of the kafka-client in the framework (#40)\r\n\r\n* #0000 - feat: Add dataset-type to system events (#41)\r\n\r\n* #0000 - feat: Add dataset-type to system events\r\n\r\n* #0000 - feat: Modify tests for dataset-type in system events\r\n\r\n* #0000 - feat: Remove unused getDatasetType function\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #0000 - feat: Remove unused pom test dependencies\r\n\r\n* #67 feat: query system configurations from meta store\r\n\r\n* #67 fix: Refactor system configuration retrieval and update dynamic router function\r\n\r\n* #67 fix: update system config according to review\r\n\r\n* #67 fix: update test cases for system config\r\n\r\n* #67 fix: update default values in test cases\r\n\r\n* #67 fix: add get all system settings method and update test cases\r\n\r\n* #67 fix: add test case for covering exception case\r\n\r\n* #67 fix: fix data types in test cases\r\n\r\n* #67 fix: Refactor event indexing in DynamicRouterFunction\r\n\r\n* Issue #67 refactor: SystemConfig read from DB implementation\r\n\r\n* #226 fix: update test cases according to the refactor\r\n\r\n---------\r\n\r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Sowmya N Dixit \r\nCo-authored-by: Santhosh \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: Anand Parthasarathy \r\n\r\n* #0 fix: Flink base image updates\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: GayathriSrividya \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: Manoj Krishna <92361832+ManojKrishnaChintauri@users.noreply.github.com>\r\nCo-authored-by: Santhosh Vasabhaktula \r\nCo-authored-by: ManojCKrishna \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Anand Parthasarathy \r\nCo-authored-by: Ravi Mula \r\nCo-authored-by: Manoj Krishna <92361832+ManojKrishnaChintaluri@users.noreply.github.com>","shortMessageHtmlLink":"Sunbird Obsrv opensource release 2.0.0-GA (#13)"}},{"before":"1e5b677f810098ee410b78f62e74a4d9612ef029","after":"454b0e112e8f81cc3a71230ebfa1798fec6809d1","ref":"refs/heads/main","pushedAt":"2023-11-17T09:21:18.000Z","pushType":"pr_merge","commitsCount":1,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"Obsrv 1.3.0 Release (#12)\n\n* feat: add connector config and connector stats update functions\r\n\r\n* Issue #33 feat: add documentation for Dataset, Datasources, Data In and Query APIs\r\n\r\n* feat: added descriptions for default configurations\r\n\r\n* feat: added descriptions for default configurations\r\n\r\n* feat: modified kafka connector input topic\r\n\r\n* feat: obsrv setup instructions\r\n\r\n* feat: revisiting open source features\r\n\r\n* feat: masterdata processor job config\r\n\r\n* Build deploy v2 (#19)\r\n\r\n* #0 - Refactor Dockerfile and Github actions workflow\r\n---------\r\n\r\nCo-authored-by: Santhosh Vasabhaktula \r\nCo-authored-by: ManojCKrishna \r\n\r\n* Update DatasetModels.scala\r\n\r\n* Release 1.3.0 into Main branch (#34)\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* testing new images\r\n\r\n* build new image with bug fixes\r\n\r\n* update dockerfile\r\n\r\n* update dockerfile\r\n\r\n* #0 fix: upgrade packages\r\n\r\n* #0 feat: add flink dockerfiles\r\n\r\n* #0 fix: add individual extraction\r\n\r\n* Issue #0 fix: upgrade ubuntu packages for vulnerabilities\r\n\r\n* #0 fix: update github actions release condition\r\n\r\n---------\r\n\r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>\r\nCo-authored-by: Sowmya N Dixit \r\n\r\n* Update DatasetModels.scala\r\n\r\n* fix: update flink base image\r\n\r\n* fix: update flink base image\r\n\r\n---------\r\n\r\nCo-authored-by: shiva-rakshith \r\nCo-authored-by: Aniket Sakinala \r\nCo-authored-by: GayathriSrividya \r\nCo-authored-by: Manjunath Davanam \r\nCo-authored-by: Manoj Krishna <92361832+ManojKrishnaChintauri@users.noreply.github.com>\r\nCo-authored-by: Santhosh Vasabhaktula \r\nCo-authored-by: ManojCKrishna \r\nCo-authored-by: ManojKrishnaChintaluri \r\nCo-authored-by: Praveen <66662436+pveleneni@users.noreply.github.com>","shortMessageHtmlLink":"Obsrv 1.3.0 Release (#12)"}},{"before":"12ce0808979fff76835b36b333b11d2715555104","after":"1e5b677f810098ee410b78f62e74a4d9612ef029","ref":"refs/heads/main","pushedAt":"2023-06-07T06:32:27.041Z","pushType":"pr_merge","commitsCount":3,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"Merge pull request #8 from Sanketika-Obsrv/main\n\nFix the NPE and test cases for Dataset Registry","shortMessageHtmlLink":"Merge pull request #8 from Sanketika-Obsrv/main"}},{"before":"b15f9de138a217f6ef154e26b2dd1ed1fa0592f4","after":"12ce0808979fff76835b36b333b11d2715555104","ref":"refs/heads/main","pushedAt":"2023-06-01T14:19:16.315Z","pushType":"pr_merge","commitsCount":101,"pusher":{"login":"manjudr","name":"Manjunath Davanam","path":"/manjudr","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/16288575?s=80&v=4"},"commit":{"message":"Merge pull request #2 from Sanketika-Obsrv/main\n\n[v 0.5] - Obsrv Core","shortMessageHtmlLink":"Merge pull request #2 from Sanketika-Obsrv/main"}}],"hasNextPage":false,"hasPreviousPage":false,"activityType":"all","actor":null,"timePeriod":"all","sort":"DESC","perPage":30,"cursor":"djE6ks8AAAAEqzv9dgA","startCursor":null,"endCursor":null}},"title":"Activity ยท Sunbird-Obsrv/obsrv-core"}