diff --git a/README.md b/README.md
index 1c2e0721..ba3b3807 100644
--- a/README.md
+++ b/README.md
@@ -1,28 +1,21 @@
-# dCNiOS
+# DCNiOS
 
-[dCache](http://dcache.org) is a system for storing and retrieving huge amounts of data, distributed among a large number of heterogeneous server nodes, under a single virtual filesystem tree with a variety of standard access methods.
+
+DCNiOS is an open-source command-line tool to easily manage the creation of event-driven data processing flows. DCNiOS, Data Connector through Apache NiFi for OSCAR, facilitates the creation of event-driven processes connecting a Storage System like [dCache](http://dcache.org) or [S3](https://aws.amazon.com/s3) to a scalable OSCAR cluster by employing predefined dataflows that are processed by Apache NiFi.
 
 [Apache NiFi](http://nifi.apache.org) is a reliable system to process and distribute data through powerful and scalable directed graphs of data routing, transformation, and system mediation logic.
 
 [OSCAR](https://oscar.grycap.net) is an open-source platform for serverless event-driven data processing of containerized applications across the computing continuum.
 
-Together with [dCNiOS](http://github.com/grycap/dcnios) (dCache + NiFi + OSCAR), you can manage the creation of event-driven data processing flows. As shown in the figure, when files are uploaded to dCache, events are ingested in Apache NiFi, which can queue them up depending on the (modifiable at runtime) ingestion rate, to be then delegated for processing into a scalable OSCAR cluster, where a user-defined application based on a Docker image can process the data file.
-
-dCNiOS Workflow
-
-Therefore, dCNiOS has been made to interact with NiFi and deploy a complete dataflow. It uses HTTP calls to communicate with a Nifi cluster, which can be automatically deployed by the [Infrastructure Manager (IM)](https://im.egi.eu). Apache NiFi is deployed on a dynamically provisioned Kubernetes cluster running with a custom Docker image named `ghcr.io/grycap/nifi-sse:latest`. This new image includes a client for the [dCache SSE Event Interface](https://www.dcache.org/manuals/UserGuide-8.2/frontend.shtml#storage-events), kindly provided by Paul Millar in [GitHub](https://github.com/paulmillar/dcache-sse). It does not require a Nifi registry.
-
-All the dataflow information is described in a YAML file, and by executing the dCNiOS command-line interface, this dataflow is deployed on Nifi.
-
-From predefined recipes (ProcessGroup in Nifi, .json files) created before,
+Together with [DCNiOS](http://github.com/grycap/dcnios) (Data Connector + NiFi + OSCAR), you can manage the creation of event-driven data processing flows. As shown in the figure, when an event occurs in the external component (dCache in this case), it is ingested in Apache NiFi, which can queue events up depending on the (modifiable at runtime) ingestion rate, to be then delegated for processing into a scalable OSCAR cluster, where a user-defined application based on a Docker image can process the data file.
 
-dCNiOS inserts a general flow and changes the variables to create a concrete workflow.
+DCNiOS Workflow
 
-By default, two process group recipes have been created:
+Therefore, DCNiOS has been made to interact with NiFi and deploy a complete dataflow. It uses HTTP calls to communicate with a NiFi cluster, which can be automatically deployed by the [Infrastructure Manager (IM)](https://im.egi.eu). Apache NiFi is deployed on a dynamically provisioned Kubernetes cluster. It does not require a NiFi registry.
+
+All the dataflow information is described in a YAML file, and by executing the DCNiOS command-line interface, this dataflow is deployed on NiFi.
 
-1. dcache, which is an active listener for a dCache instance. The [Server-sent Events SSE](https://en.wikipedia.org/wiki/Server-sent_events) client actively listens for these events in a user-defined folder in dCache. When a file is uploaded to that folder in dCache, NiFi will introduce the event in the dataflow.
-2. InvokeOSCAR, an HTTP call to invoke an OSCAR service asynchronously. OSCAR supports this events specification to let the user decide whether the file should be pre-staged into the execution sandbox to locally process the data within an OSCAR job or to delegate the processing of the event into an external tool, such as a workflow orchestration platform, thus reducing data movements across the systems.
+From predefined recipes (ProcessGroup in NiFi, .json files) created beforehand, DCNiOS inserts a general flow and changes the variables to create a concrete workflow.
 
 ## Getting Started
@@ -48,7 +41,7 @@ Install all the requirements defined in `requirements.txt`
 
 ``` bash
 pip install -r requeriments.txt
 ```
 
-Or only install the minimal requirements that dCNiOS needs.
+Or only install the minimal requirements that DCNiOS needs.
 
 ``` bash
@@ -73,14 +66,23 @@ There is only one version in maintenance:
 
 ## Licensing
 
-dCNiOS is licensed under the Apache License, Version 2.0. See LICENSE for the full license text.
+DCNiOS is licensed under the Apache License, Version 2.0. See LICENSE for the full license text.
 
 ## Acknowledgements
 
 This work was supported by the project “An interdisciplinary Digital Twin Engine for science’’ (interTwin), which has received funding from the European Union’s Horizon Europe Programme under Grant 101058386.
 
-dCNiOS Workflow
+DCNiOS Workflow
 
 ## More information
 
 You can find more [information](https://oscar.grycap.net/blog/data-driven-processing-with-dcache-nifi-oscar/ ) in the [OSCAR's blog.](https://oscar.grycap.net/blog/)
+
+Silver Badge
+
+This software has received a silver badge according to the [Software Quality Baseline criteria](https://www.eosc-synergy.eu/for-developers/) defined by the [EOSC-Synergy](https://www.eosc-synergy.eu) project.
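The README above describes the whole dataflow as a YAML file deployed through the DCNiOS CLI. As a minimal sketch of that idea, the snippet below parses a hypothetical configuration with PyYAML, the same way `dcnios-cli.py` loads its input; every key name here is an illustrative assumption, not the tool's documented schema.

```python
# Illustrative sketch only: the keys below are assumptions for exposition,
# not DCNiOS's documented schema. It shows the general shape of a YAML
# dataflow description parsed with PyYAML, as dcnios-cli.py does.
import yaml

EXAMPLE_CONFIG = """
nifi:
  endpoint: https://my-nifi.example.org   # hypothetical NiFi cluster URL
sources:
  - name: dcache-listener                 # unique identifier of the process
    type: dcache
destinations:
  - name: invoke-oscar
    type: OSCAR
"""

config = yaml.safe_load(EXAMPLE_CONFIG)
for source in config["sources"]:
    # Each entry would select a predefined ProcessGroup recipe (a .json template).
    print(f"would deploy source {source['name']!r} of type {source['type']!r}")
```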
diff --git a/cli/alterations/alteration.py b/cli/alterations/alteration.py
index b9afe876..9d3b57f2 100644
--- a/cli/alterations/alteration.py
+++ b/cli/alterations/alteration.py
@@ -15,31 +15,31 @@
 # !/usr/bin/env python3
 
-from apis import NifiManagment
+from apis import nifiManagment
 from apis import auxiliaryFunctions
 
 
 def createMerge(nifiConfiguration,information, name):
     nameCompose= nameActionReturn(information["action"],name)
-    merge=auxiliaryFunctions.prepareforAll("./template/Alterations/Merge.json",information)
-    merge = auxiliaryFunctions.addSensibleVariable(merge, "MergeContent", "Maximum Number of Entries", information["maxMessages"])
+    merge = auxiliaryFunctions.prepareforAll("./template/alterations/Merge.json", information)
+    merge = auxiliaryFunctions.addSensitiveVariable(merge, "MergeContent", "Maximum Number of Entries", information["maxMessages"])
     nifiConfiguration.create(nameCompose, merge)
     nifiConfiguration.changeSchedule(nameCompose, "MergeContent", information["windowSeconds"])
 
 
 def createDecode(nifiConfiguration,information,name):
     nameCompose= nameActionReturn(information["action"],name)
-    merge=auxiliaryFunctions.prepareforAll("./template/Alterations/Encode_Decode.json",information)
-    merge = auxiliaryFunctions.addSensibleVariable(merge, "EncodeContent", "Mode", "Decode")
+    merge = auxiliaryFunctions.prepareforAll("./template/alterations/Encode_Decode.json", information)
+    merge = auxiliaryFunctions.addSensitiveVariable(merge, "EncodeContent", "Mode", "Decode")
     if "Encoding" in information:
-        merge = auxiliaryFunctions.addSensibleVariable(merge, "EncodeContent", "Encoding", information["Encoding"])
+        merge = auxiliaryFunctions.addSensitiveVariable(merge, "EncodeContent", "Encoding", information["Encoding"])
     nifiConfiguration.create(nameCompose, merge)
 
 
 def createEncode(nifiConfiguration,information,name):
     nameCompose= nameActionReturn(information["action"],name)
-    merge=auxiliaryFunctions.prepareforAll("./template/Alterations/Encode_Decode.json",information)
+    merge = auxiliaryFunctions.prepareforAll("./template/alterations/Encode_Decode.json", information)
     if "Encoding" in information:
-        merge = auxiliaryFunctions.addSensibleVariable(merge, "EncodeContent", "Encoding", information["Encoding"])
+        merge = auxiliaryFunctions.addSensitiveVariable(merge, "EncodeContent", "Encoding", information["Encoding"])
     nifiConfiguration.create(nameCompose, merge)
 
 
@@ -53,10 +53,10 @@ def createAlteration(nifiConfiguration,allInformation):
             createEncode(nifiConfiguration,alter,name)
         elif alter["action"]=="Decode":
             createDecode(nifiConfiguration,alter,name)
-    conectAlteration(nifiConfiguration,allInformation)
+    connectAlteration(nifiConfiguration,allInformation)
 
 
-def conectAlteration(nifiConfiguration,allInformation):
+def connectAlteration(nifiConfiguration,allInformation):
     name=allInformation["name"]
     for index,step in enumerate(allInformation["alterations"]):
         if index == 0:
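For context on the recipes above: `createAlteration` dispatches on the `action` field, and the individual builders read `maxMessages`, `windowSeconds`, and the optional `Encoding`. A small sketch of the alteration entries such code expects, with invented placeholder values:

```python
# Sketch of the per-alteration dictionaries that createAlteration() iterates
# over, based only on the keys read in the diff above (action, maxMessages,
# windowSeconds, Encoding). Any surrounding structure is an assumption.
alterations = [
    {"action": "Merge", "maxMessages": 10, "windowSeconds": 60},
    {"action": "Encode", "Encoding": "base64"},   # Encoding is optional
    {"action": "Decode"},
]

for alter in alterations:
    if alter["action"] == "Merge":
        print("MergeContent: batch", alter["maxMessages"],
              "messages per", alter["windowSeconds"], "s")
    elif alter["action"] in ("Encode", "Decode"):
        print("EncodeContent mode:", alter["action"],
              "encoding:", alter.get("Encoding", "template default"))
```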
+Please acknowledge the use of DCNiOS by citing the following scientific
+publications ([preprints available](https://www.grycap.upv.es/gmolto/publications)):
\ No newline at end of file
diff --git a/cli/apis/auxiliaryFunctions.py b/cli/apis/auxiliaryFunctions.py
index f9470a3d..26cbf452 100644
--- a/cli/apis/auxiliaryFunctions.py
+++ b/cli/apis/auxiliaryFunctions.py
@@ -20,7 +20,7 @@
 from alterations import alteration
 
 
-def addSensibleVariable(file, processorName, key, value):
+def addSensitiveVariable(file, processorName, key, value):
     for processor in file["flowContents"]["processors"]:
         if processor["name"] == processorName:
             processor["properties"][key] = value
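The renamed helper is short enough to demo end to end. Below is a self-contained sketch of the `addSensitiveVariable` logic shown above, applied to a toy Process Group export; only the fields the function touches (`flowContents`, `processors`, `name`, `properties`) come from the diff, the rest is made up for the demo, and the `return` is inferred from the call sites in `aws.py`.

```python
# Self-contained demo of the addSensitiveVariable() logic shown above:
# walk flowContents.processors, match on the processor name, set a property.
def addSensitiveVariable(file, processorName, key, value):
    for processor in file["flowContents"]["processors"]:
        if processor["name"] == processorName:
            processor["properties"][key] = value
    return file  # inferred from call sites such as aws.awsCredentialPreparefile

# Toy Process Group export, invented for the demo.
flow = {"flowContents": {"processors": [
    {"name": "MergeContent", "properties": {}},
    {"name": "EncodeContent", "properties": {}},
]}}

flow = addSensitiveVariable(flow, "MergeContent", "Maximum Number of Entries", 10)
print(flow["flowContents"]["processors"][0]["properties"])
# -> {'Maximum Number of Entries': 10}
```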
processor["properties"][key] = value diff --git a/cli/apis/aws.py b/cli/apis/aws.py index e7b16bd8..9876521f 100644 --- a/cli/apis/aws.py +++ b/cli/apis/aws.py @@ -18,9 +18,9 @@ import json import os import boto3 -from apis.auxiliaryFunctions import addSensibleVariable +from apis.auxiliaryFunctions import addSensitiveVariable from apis import auxiliaryFunctions -from apis import NifiManagment +from apis import nifiManagment def getAWSCredentials(configuration): if "AWS_ACCESS_KEY_ID" in os.environ and os.environ["AWS_ACCESS_KEY_ID"] != "" \ @@ -76,11 +76,11 @@ def createSQSQueue(configuration): def awsCredentialPreparefile(filecontent, configuration,processorName): - filecontent = addSensibleVariable(filecontent, processorName, "Access Key", + filecontent = addSensitiveVariable(filecontent, processorName, "Access Key", configuration["AWS_ACCESS_KEY_ID"]) - filecontent = addSensibleVariable(filecontent, processorName, "Secret Key", + filecontent = addSensitiveVariable(filecontent, processorName, "Secret Key", configuration["AWS_SECRET_ACCESS_KEY"]) - filecontent = addSensibleVariable(filecontent, processorName, "Region", + filecontent = addSensitiveVariable(filecontent, processorName, "Region", configuration["AWS_DEFAULT_REGION"]) return filecontent diff --git a/cli/apis/NifiManagment.py b/cli/apis/nifiManagment.py similarity index 100% rename from cli/apis/NifiManagment.py rename to cli/apis/nifiManagment.py diff --git a/cli/dcnios-cli.py b/cli/dcnios-cli.py index 9497af87..606defca 100644 --- a/cli/dcnios-cli.py +++ b/cli/dcnios-cli.py @@ -16,16 +16,13 @@ import yaml from yaml.loader import SafeLoader import json -#from oscar_python.client import Client from apis.auxiliaryFunctions import * -from apis.NifiManagment import * +from apis.nifiManagment import * from apis.aws import * from sources.dcache import * -from destinations.OSCAR import * -from sources.Kafka import * +from destinations.oscar import * +from sources.kafka import * from sources.generic import * -#import boto3 -#import os import env import argparse diff --git a/cli/destinations/OSCAR.py b/cli/destinations/oscar.py similarity index 98% rename from cli/destinations/OSCAR.py rename to cli/destinations/oscar.py index 1295de65..2b80b476 100644 --- a/cli/destinations/OSCAR.py +++ b/cli/destinations/oscar.py @@ -16,7 +16,7 @@ # !/usr/bin/env python3 from apis import auxiliaryFunctions -from apis import NifiManagment +from apis import nifiManagment from oscar_python.client import Client diff --git a/cli/destinations/s3sqs.py b/cli/destinations/s3sqs.py index 3942298c..8646cc66 100644 --- a/cli/destinations/s3sqs.py +++ b/cli/destinations/s3sqs.py @@ -16,7 +16,7 @@ # !/usr/bin/env python3 from apis import auxiliaryFunctions -from apis import NifiManagment +from apis import nifiManagment from apis import aws @@ -24,11 +24,11 @@ def createPutS3(nifiConfiguration,s3Info,s3content): if "MinIO_Endpoint" in s3Info: - auxiliaryFunctions.addSensibleVariable(s3content, "PutS3Object", "Endpoint Override URL", s3Info["MinIO_Endpoint"]) + auxiliaryFunctions.addSensitiveVariable(s3content, "PutS3Object", "Endpoint Override URL", s3Info["MinIO_Endpoint"]) s3Info["AWS_DEFAULT_REGION"]="us-east-1" else: aws.getAWSCredentials(s3Info) s3content = aws.awsCredentialPreparefile(s3content, s3Info,"PutS3Object") - auxiliaryFunctions.addSensibleVariable(s3content, "PutS3Object", "Bucket", s3Info["AWS_S3_BUCKET"]) + auxiliaryFunctions.addSensitiveVariable(s3content, "PutS3Object", "Bucket", s3Info["AWS_S3_BUCKET"]) nifiConfiguration.create(s3Info["name"], 
diff --git a/cli/env.py b/cli/env.py
index 1c0300ca..309560b1 100644
--- a/cli/env.py
+++ b/cli/env.py
@@ -16,18 +16,18 @@
 from sources.dcache import createDcache
 from sources.s3sqs import createGetS3,createGetSQS
-from sources.Kafka import createKafka
+from sources.kafka import createKafka
 from sources.generic import createGeneric
-from destinations.OSCAR import createOSCAR
+from destinations.oscar import createOSCAR
 from destinations.s3sqs import createPutS3
 
-folderSource = "template/Sources/"
+folderSource = "template/sources/"
 
 kafkafile = folderSource+"Kafka.json"
 dcachefile = folderSource+"dcache.json"
 sqsfile = folderSource+"SQS_recive.json"
 
-folderDestination = "template/Destinations/"
+folderDestination = "template/destinations/"
 
 putS3file= folderDestination+ "PutS3.json"
 oscarfile = folderDestination+"InvokeOSCAR.json"
diff --git a/cli/sources/dcache.py b/cli/sources/dcache.py
index 05dd191b..26798dba 100644
--- a/cli/sources/dcache.py
+++ b/cli/sources/dcache.py
@@ -16,7 +16,7 @@
 # !/usr/bin/env python3
 
 from apis import auxiliaryFunctions
-from apis import NifiManagment
+from apis import nifiManagment
 
 
diff --git a/cli/sources/generic.py b/cli/sources/generic.py
index 446dac4e..79ad8285 100644
--- a/cli/sources/generic.py
+++ b/cli/sources/generic.py
@@ -16,7 +16,7 @@
 # !/usr/bin/env python3
 
 from apis import auxiliaryFunctions
-from apis import NifiManagment
+from apis import nifiManagment
 
 def createGeneric(nifiConfiguration,genricInfo,genericcontent):
     nifiConfiguration.create(genricInfo["name"], genericcontent)
diff --git a/cli/sources/Kafka.py b/cli/sources/kafka.py
similarity index 76%
rename from cli/sources/Kafka.py
rename to cli/sources/kafka.py
index f4b07e11..7701f415 100644
--- a/cli/sources/Kafka.py
+++ b/cli/sources/kafka.py
@@ -16,7 +16,7 @@
 # !/usr/bin/env python3
 
 from apis import auxiliaryFunctions
-from apis import NifiManagment
+from apis import nifiManagment
 
 def createKafka(nifiConfiguration,kafkaInfo,kafkacontent):
     # Prepare config
@@ -45,32 +45,32 @@ def createKafka(nifiConfiguration,kafkaInfo,kafkacontent):
 
 def kafkaPreparefile(filecontent, kafka):
     if "security_protocol" not in kafka:
-        filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+        filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                              "security.protocol", "SASL_SSL")
     else:
-        filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+        filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                              "security.protocol", kafka["security_protocol"])
     if "" not in kafka:
-        filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+        filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                              "sasl.mechanism", "PLAIN")
     else:
-        filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+        filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                              "sasl.mechanism", kafka["sasl_mechanism"])
-    filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+    filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                          "sasl.username", kafka["sasl_username"])
-    filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+    filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                          "sasl.password", kafka["sasl_password"])
     if "separate_by_key" in kafka and kafka["separate_by_key"] == "true":
-        filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+        filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                              "separate-by-key", "true")
-        filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+        filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                              "message-demarcator", kafka["message_demarcator"])
     else:
-        filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6",
+        filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6",
                                                              "separate-by-key", "false")
     return filecontent
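The keys consumed by `kafkaPreparefile` above are `security_protocol`, `sasl_mechanism`, `sasl_username`, `sasl_password`, `separate_by_key`, and `message_demarcator`; absent keys fall back to `SASL_SSL`, `PLAIN`, and `separate-by-key: false`. (Note that the visible guard `if "" not in kafka:` always takes the `PLAIN` branch, so it was presumably meant to test `"sasl_mechanism"`.) A sketch of a Kafka source entry with invented placeholder values:

```python
# The keys below are exactly those kafkaPreparefile() reads in the diff above;
# all values are invented placeholders for illustration.
kafka_info = {
    "sasl_username": "demo-user",           # required by the code above
    "sasl_password": "demo-secret",         # required by the code above
    "security_protocol": "SASL_PLAINTEXT",  # optional override of SASL_SSL
    "sasl_mechanism": "SCRAM-SHA-512",      # optional override of PLAIN
    "separate_by_key": "true",              # optional; needs message_demarcator
    "message_demarcator": "\n",
}
# Print the entry with the secret masked.
print({k: ("***" if "password" in k else v) for k, v in kafka_info.items()})
```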
kafka["separate_by_key"] == "true": - filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6", + filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6", "separate-by-key", "true") - filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6", + filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6", "message-demarcator", kafka["message_demarcator"]) else: - filecontent = auxiliaryFunctions.addSensibleVariable(filecontent, "ConsumeKafka_2_6", + filecontent = auxiliaryFunctions.addSensitiveVariable(filecontent, "ConsumeKafka_2_6", "separate-by-key", "false") return filecontent diff --git a/cli/sources/s3sqs.py b/cli/sources/s3sqs.py index 8d704094..18a150c0 100644 --- a/cli/sources/s3sqs.py +++ b/cli/sources/s3sqs.py @@ -18,9 +18,8 @@ import json import os import boto3 -from apis.auxiliaryFunctions import addSensibleVariable from apis import auxiliaryFunctions -from apis import NifiManagment +from apis import nifiManagment from apis import aws diff --git a/cli/template/Alterations/Encode_Decode.json b/cli/template/alterations/Encode_Decode.json similarity index 100% rename from cli/template/Alterations/Encode_Decode.json rename to cli/template/alterations/Encode_Decode.json diff --git a/cli/template/Alterations/Merge.json b/cli/template/alterations/Merge.json similarity index 100% rename from cli/template/Alterations/Merge.json rename to cli/template/alterations/Merge.json diff --git a/cli/template/Destinations/InvokeOSCAR.json b/cli/template/destinations/InvokeOSCAR.json similarity index 100% rename from cli/template/Destinations/InvokeOSCAR.json rename to cli/template/destinations/InvokeOSCAR.json diff --git a/cli/template/Destinations/PutS3.json b/cli/template/destinations/PutS3.json similarity index 100% rename from cli/template/Destinations/PutS3.json rename to cli/template/destinations/PutS3.json diff --git a/cli/template/Sources/Kafka.json b/cli/template/sources/Kafka.json similarity index 100% rename from cli/template/Sources/Kafka.json rename to cli/template/sources/Kafka.json diff --git a/cli/template/Sources/SQS_recive.json b/cli/template/sources/SQS_recive.json similarity index 100% rename from cli/template/Sources/SQS_recive.json rename to cli/template/sources/SQS_recive.json diff --git a/cli/template/Sources/dcache.json b/cli/template/sources/dcache.json similarity index 100% rename from cli/template/Sources/dcache.json rename to cli/template/sources/dcache.json diff --git a/docpage/docs/03.- Sources/S3.md b/docpage/docs/03.- Sources/S3.md index 6abd81dd..dba3a0ee 100644 --- a/docpage/docs/03.- Sources/S3.md +++ b/docpage/docs/03.- Sources/S3.md @@ -3,7 +3,7 @@ sidebar_position: 3 --- # S3 -The S3 Source captures an ObjectCreated event from an AWS S3 bucket. DCNiOS creates S3 bucket event redirections to SQS queue. Then, Apache NiFi captures the event and introduces it to the dataflow. The whole pipeline is created using DCNiOS.But, SQS queue is deleted with DCNiOS, but the Event Notification in the S3 section needs to be removed manually. +The S3 Source captures an ObjectCreated event from an AWS S3 bucket. DCNiOS creates S3 bucket event redirections to SQS queue. Then, Apache NiFi captures the event and introduces it to the dataflow. The whole pipeline is created using DCNiOS. But, SQS queue is deleted with DCNiOS, but the Event Notification in the S3 section needs to be removed manually. The S3 Source requires: - An identifier name of the process. 
diff --git a/docpage/docs/AWS.md b/docpage/docs/AWS.md
index d3e551a2..909f1a7b 100644
--- a/docpage/docs/AWS.md
+++ b/docpage/docs/AWS.md
@@ -21,6 +21,6 @@ DCNiOS can use some AWS as input. A valid pair of AWS Access Key and AWS Secret
 AWS_DEFAULT_REGION is mandatory in any Source that uses AWS in the configuration file.
 
 These ProcessGroups can employ AWS credentials:
-- [SQS](/docs/0.2.- Sources/SQS)
-- [S3](/docs/0.2.- Sources/S3)
+- [SQS](/docs/Sources/SQS)
+- [S3](/docs/Sources/S3)
diff --git a/docpage/docs/Introduction.md b/docpage/docs/Introduction.md
index e6943e32..e47a3178 100644
--- a/docpage/docs/Introduction.md
+++ b/docpage/docs/Introduction.md
@@ -7,33 +7,25 @@ sidebar_position: 1
 
 DCNiOS is an open-source command-line tool that easily manages the creation of event-driven data processing flows. DCNiOS reads a file with a workflow defined in a YAML structure. Then, DCNiOS creates this workflow in an Apache NiFi cluster. DCNiOS uses transparently the Apache NiFi [Process Groups](https://nifi.apache.org/docs/nifi-docs/html/user-guide.html#Configuring_a_ProcessGroup) to create predefined workflows.
 
+![DCNiOS logo](/../static/img/dcnios-logo-hor.png)
+
 Apache NiFi Process Group is a group of Processors that compose a dataflow. DCNiOS uses predefined Process Groups that make simple actions like interacting with a third-party component (e.g., consuming from Kafka) or changing the data content (e.g.encoding the data in base64) to compose a complete dataflow.
 
 In DCNiOS documentation, the Process Groups are split by purpose into three main groups: 'Sources', 'Destinations', and 'Alterations'.
 - 'Sources' interact with a third-party component as the input data receiver.
--'Destinations' interact with a third-party component as output data sender.
+- 'Destinations' interact with a third-party component as the output data sender.
 - 'Alterations' that do not interact with third-party components and change the format of the data flow.
-
-
-
-
-
-
-
-
-
-
 # Getting Started
 
 ## Prerequisites
-- OSCAR cluster containing the user-defined OSCAR Services. [](https://github.com/grycap/oscar/tree/master/examples)
-- Apache Nifi cluster deployed
+- OSCAR cluster containing the user-defined OSCAR Services. You can see some [examples](https://github.com/grycap/oscar/tree/master/examples) in GitHub.
+- Apache NiFi cluster deployed.
 - A Python distribution such as [Anaconda](https://www.anaconda.com/) or Python version 3.7.6
-- Input source (one of these is enough: dCache, Kafka, S3 AWS, SQS AWS)
+- An input source (one of these is enough: dCache, Kafka, S3 AWS, SQS AWS)
 
 [IM](https://www.grycap.upv.es/im/index.php) can deploy a Kubernetes cluster that includes OSCAR and Apache NiFi.
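The prerequisites above assume a reachable OSCAR cluster, and `destinations/oscar.py` already imports the `oscar_python` client. A minimal connectivity check, assuming the basic-auth options shown in the oscar-python examples; the endpoint and credentials are placeholders to replace with your cluster's values.

```python
# Minimal sketch: verify the OSCAR prerequisite is reachable using the
# oscar_python client imported by destinations/oscar.py. The options dict
# follows the oscar-python basic-auth example; all values are placeholders.
from oscar_python.client import Client

options = {
    "cluster_id": "my-cluster",
    "endpoint": "https://oscar.example.org",  # hypothetical OSCAR endpoint
    "user": "oscar",
    "password": "changeme",
    "ssl": "True",
}
client = Client(options=options)
print(client.get_cluster_info())  # fails fast if the cluster is unreachable
```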
diff --git a/docpage/docs/Users.md b/docpage/docs/Users.md
index 92f3d56d..037106ac 100644
--- a/docpage/docs/Users.md
+++ b/docpage/docs/Users.md
@@ -67,8 +67,9 @@ Destinations:
 - [OSCAR](https://oscar.grycap.net/)
 
 Alterations:
-- Base64
 - Merge
+- Encode
+- Decode
 
 #### Components Subsection
diff --git a/docpage/docs/images/badge_software_silver.png b/docpage/docs/images/badge_software_silver.png
new file mode 100644
index 00000000..95b1fe05
Binary files /dev/null and b/docpage/docs/images/badge_software_silver.png differ
diff --git a/docpage/docs/images/dcnios-logo-hor.png b/docpage/docs/images/dcnios-logo-hor.png
new file mode 100644
index 00000000..4aaa920a
Binary files /dev/null and b/docpage/docs/images/dcnios-logo-hor.png differ
diff --git a/docpage/docs/images/dcnios-logo-texto.png b/docpage/docs/images/dcnios-logo-texto.png
new file mode 100644
index 00000000..06676707
Binary files /dev/null and b/docpage/docs/images/dcnios-logo-texto.png differ
diff --git a/docpage/docs/images/dcnios-logo.png b/docpage/docs/images/dcnios-logo.png
new file mode 100644
index 00000000..88ee8545
Binary files /dev/null and b/docpage/docs/images/dcnios-logo.png differ
diff --git a/docpage/docusaurus.config.js b/docpage/docusaurus.config.js
index 730d98a3..d2f9bd17 100644
--- a/docpage/docusaurus.config.js
+++ b/docpage/docusaurus.config.js
@@ -15,7 +15,7 @@ const config = {
   tagline: 'Data Connector through Apache NiFi for OSCAR',
   favicon: 'img/favicon.ico',
 
-  // Set the production url of your site here
+  // Set the production url of your site here #3f3f9f
   url: 'https://intertwin-eu.github.io/',
   // Set the // pathname under which your site is served
   // For GitHub pages deployment, it is often '//'
@@ -72,7 +72,7 @@ const config = {
       title: toolName,
       logo: {
         alt: toolName+' Logo',
-        src: 'img/logo.svg',
+        src: 'img/dcnios-logo.png',
       },
       items: [
         {
diff --git a/docpage/package-lock.json b/docpage/package-lock.json
index 58d13f63..7da8c857 100644
--- a/docpage/package-lock.json
+++ b/docpage/package-lock.json
@@ -4283,11 +4283,11 @@
       }
     },
     "node_modules/braces": {
-      "version": "3.0.2",
-      "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz",
-      "integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==",
+      "version": "3.0.3",
+      "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.3.tgz",
+      "integrity": "sha512-yQbXgO/OSZVD2IsiLlro+7Hf6Q18EJrKSEsdoMzKePKXct3gvD8oLcOQdIzGupr5Fj+EDe8gO/lxc1BzfMpxvA==",
       "dependencies": {
-        "fill-range": "^7.0.1"
+        "fill-range": "^7.1.1"
       },
       "engines": {
         "node": ">=8"
@@ -6282,9 +6282,9 @@
       }
     },
     "node_modules/fill-range": {
-      "version": "7.0.1",
-      "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz",
-      "integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==",
+      "version": "7.1.1",
+      "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.1.1.tgz",
+      "integrity": "sha512-YsGpe3WHLK8ZYi4tWDg2Jy3ebRz2rXowDxnld4bkQB00cc/1Zw9AWnC0i9ztDJitivtQvaI9KaLyKrc+hBW0yg==",
       "dependencies": {
         "to-regex-range": "^5.0.1"
       },
@@ -14271,9 +14271,9 @@
       }
     },
     "node_modules/webpack-dev-server/node_modules/ws": {
-      "version": "8.16.0",
-      "resolved": "https://registry.npmjs.org/ws/-/ws-8.16.0.tgz",
-      "integrity": "sha512-HS0c//TP7Ina87TfiPUz1rQzMhHrl/SG2guqRcTOIUYD2q8uhUdNHZYJUaQ8aTGPzCh+c6oawMKW35nFl1dxyQ==",
+      "version": "8.17.1",
+      "resolved": "https://registry.npmjs.org/ws/-/ws-8.17.1.tgz",
+      "integrity": "sha512-6XQFvXTkbfUOZOKKILFG1PDK2NDQs4azKQl26T0YS5CxqWLgXajbPZ+h4gZekJyRqFU8pvnbAbbs/3TgRPy+GQ==",
       "engines": {
         "node": ">=10.0.0"
       },
@@ -14515,9 +14515,9 @@
       }
     },
     "node_modules/ws": {
-      "version": "7.5.9",
-      "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.9.tgz",
-      "integrity": "sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q==",
+      "version": "7.5.10",
+      "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.10.tgz",
+      "integrity": "sha512-+dbF1tHwZpXcbOJdVOkzLDxZP1ailvSxM6ZweXTegylPny803bFhA+vqBYw4s31NSAk4S2Qz+AKXK9a4wkdjcQ==",
       "engines": {
         "node": ">=8.3.0"
       },
"sha512-6XQFvXTkbfUOZOKKILFG1PDK2NDQs4azKQl26T0YS5CxqWLgXajbPZ+h4gZekJyRqFU8pvnbAbbs/3TgRPy+GQ==", "engines": { "node": ">=10.0.0" }, @@ -14515,9 +14515,9 @@ } }, "node_modules/ws": { - "version": "7.5.9", - "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.9.tgz", - "integrity": "sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q==", + "version": "7.5.10", + "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.10.tgz", + "integrity": "sha512-+dbF1tHwZpXcbOJdVOkzLDxZP1ailvSxM6ZweXTegylPny803bFhA+vqBYw4s31NSAk4S2Qz+AKXK9a4wkdjcQ==", "engines": { "node": ">=8.3.0" }, diff --git a/docpage/src/pages/index.js b/docpage/src/pages/index.js index 4a42521c..33e4d8b2 100644 --- a/docpage/src/pages/index.js +++ b/docpage/src/pages/index.js @@ -8,6 +8,7 @@ import MainMarkdown from '@site/src/pages/markdown-page.md'; import Heading from '@theme/Heading'; import styles from './index.module.css'; + function HomepageHeader() { const {siteConfig} = useDocusaurusContext(); return ( diff --git a/docpage/src/pages/index.module.css b/docpage/src/pages/index.module.css index 9f71a5da..1ec329ff 100644 --- a/docpage/src/pages/index.module.css +++ b/docpage/src/pages/index.module.css @@ -8,6 +8,7 @@ text-align: center; position: relative; overflow: hidden; + background: linear-gradient(to right,#232391, #a3a3d1); ; } @media screen and (max-width: 996px) { diff --git a/docpage/src/pages/markdown-page.md b/docpage/src/pages/markdown-page.md index 391b08ee..1cad08ea 100644 --- a/docpage/src/pages/markdown-page.md +++ b/docpage/src/pages/markdown-page.md @@ -4,10 +4,9 @@ title: Main Page DCNiOS is an open-source command-line tool to easily manage the creation of event-driven data processing flows. -DCNiOS, Data Connector through [Apache NiFi](https://nifi.apache.org/) for [OSCAR](https://oscar.grycap.net/), facilitates the creation of event-driven processes connecting a Storage System like [dCache](https://www.dcache.org/) to a scalable OSCAR cluster by employing predefined dataflows that are processed by Apache NiFi. +DCNiOS, Data Connector through [Apache NiFi](https://nifi.apache.org/) for [OSCAR](https://oscar.grycap.net/), facilitates the creation of event-driven processes connecting a Storage System like [dCache](https://www.dcache.org/) or [S3](https://aws.amazon.com/s3) to a scalable OSCAR cluster by employing predefined dataflows that are processed by Apache NiFi. - -DCNiOS was developed within the interTwin project. DCNiOS captures events, queues them up in a Nifi dataflow, and ingests them in an OSCAR cluster at a customized rate, where an OSCAR service is run based on a user-defined application (containerized in a Docker image). +DCNiOS was developed within the interTwin project. DCNiOS creates dataflows in Apache NiFi that captures events or messages, and ingests them in an OSCAR cluster at a customized rate, where an OSCAR service is run based on a user-defined application (containerized in a Docker image). The DCNiOS command-line application is available in the Source Code repository. Additionally, the corresponding TOSCA templates and the ansible roles that are required to deploy an Apache Nifi cluster via the Infrastructure Manager (IM) have been provided. Any user can self-deploy such a cluster via the [IM Dashboard](https://im.egi.eu). 
diff --git a/docpage/static/img/dcnios-logo-hor.png b/docpage/static/img/dcnios-logo-hor.png
new file mode 100644
index 00000000..4aaa920a
Binary files /dev/null and b/docpage/static/img/dcnios-logo-hor.png differ
diff --git a/docpage/static/img/dcnios-logo-texto.png b/docpage/static/img/dcnios-logo-texto.png
new file mode 100644
index 00000000..06676707
Binary files /dev/null and b/docpage/static/img/dcnios-logo-texto.png differ
diff --git a/docpage/static/img/dcnios-logo.png b/docpage/static/img/dcnios-logo.png
new file mode 100644
index 00000000..88ee8545
Binary files /dev/null and b/docpage/static/img/dcnios-logo.png differ
diff --git a/docpage/static/img/docusaurus.png b/docpage/static/img/docusaurus.png
deleted file mode 100644
index f458149e..00000000
Binary files a/docpage/static/img/docusaurus.png and /dev/null differ
diff --git a/docpage/static/img/favicon.ico b/docpage/static/img/favicon.ico
index c01d54bc..88ee8545 100644
Binary files a/docpage/static/img/favicon.ico and b/docpage/static/img/favicon.ico differ