Redesign Build Deployment Process (External) #961

Merged 96 commits on Aug 12, 2024
93c2232
Read secret_list from actual file if present, else use sample file.
Mar 21, 2024
a80d7e4
Matched conf/log files with internal repo
Mar 26, 2024
20e1856
Reading push.json values from environment variables
Mar 29, 2024
4a1005e
Choose analysis/debug.conf file based on ENV var
Apr 1, 2024
d4dad4a
Testing method to share tag between repos
nataliejschultz Apr 4, 2024
02c7fc7
Changed webserver.conf.json to Environment variables + Removed sed / …
Apr 10, 2024
4aeb4a6
Corrected logic to set test webserver ENV variables
Apr 11, 2024
2531672
Changed Webserver.conf + Db.conf to Environment variable + Removed se…
Apr 12, 2024
ea57afa
Reverting Natalie testing artifact changes + Adding image-push-merge
Apr 12, 2024
e8344e9
Fixes for failing TestTokenQueries print assertions
Apr 12, 2024
7f0d5f0
Fixes for failing TestTokenQueries print assertions
Apr 12, 2024
05551c8
Try-except block brought to the top
Apr 16, 2024
cabc98a
Merge pull request #2 from MukuFlash03/consolidate-differences
MukuFlash03 Apr 16, 2024
c39b537
Removed extraneous seed_model.json
Apr 16, 2024
385bc97
Merge pull request #3 from MukuFlash03/consolidate-differences
MukuFlash03 Apr 16, 2024
273c2ca
TODO added to change to master branch in YML file
Apr 24, 2024
2fdb469
Adding image-push-merge branch to automated CI/CD tests
Apr 24, 2024
91abdea
Upload artifact test - 1
Apr 25, 2024
bbdaaae
Upload artifact test - 2
Apr 25, 2024
5d0ca02
Added temporary test file
Apr 25, 2024
2e8a911
Changes to actions + echo
nataliejschultz Apr 25, 2024
f5aae47
Upload artifact test - 3
Apr 25, 2024
051ef66
Repository dispatch send - 1
Apr 26, 2024
6c44865
Workflow dispatch send - 1
Apr 26, 2024
4569dfb
Workflow dispatch send - 2
Apr 26, 2024
b562c2c
Workflow dispatch send - 3
Apr 26, 2024
25109b3
Workflow dispatch send - 3
Apr 26, 2024
709f3cf
Workflow dispatch send - 4
Apr 26, 2024
4b42185
Workflow dispatch send - 5
Apr 26, 2024
12d09ae
Workflow dispatch send - 6
Apr 26, 2024
94c129b
Workflow dispatch send - 7
Apr 26, 2024
0dd1245
Matrix build send - 1
Apr 26, 2024
5434914
Matrix build send - 2
Apr 26, 2024
0fd3e9d
Matrix build send - 3
Apr 26, 2024
ab399ea
Matrix build send - 3
Apr 26, 2024
91c0c1f
Matrix build send - 4
Apr 26, 2024
acecf3e
Matrix build send - 5
Apr 26, 2024
e778b3f
Fix for "url" KeyError observed in public-dash redesign testing
Apr 30, 2024
a0190d4
Fix for "url" KeyError observed in public-dash redesign testing
Apr 30, 2024
776f0b9
Artifact + Matrix - 1
May 2, 2024
17ac3cc
Artifact + Matrix - 2
May 2, 2024
706f74c
Artifact + Matrix - 3
May 2, 2024
d724fd1
Revert "Adding image-push-merge branch to automated CI/CD tests"
May 2, 2024
e6a2d79
Revert "TODO added to change to master branch in YML file"
May 2, 2024
4b3dfdf
Merge branch 'image-push-merge' into tags-combo-approach
May 2, 2024
f033f06
Added TODOs in github actions workflow YAML file
May 3, 2024
1207d79
Artifact + Matrix - 4
May 3, 2024
f306e8c
Merge branch 'consolidate-differences' into tags-combo-approach
nataliejschultz May 6, 2024
00d9565
Merge pull request #4 from MukuFlash03/tags-combo-approach
nataliejschultz May 6, 2024
f182790
Cleanup changes
nataliejschultz May 9, 2024
f1869ab
Update image_build_push.yml
nataliejschultz May 9, 2024
8f05955
More cleanup + testing image build?
nataliejschultz May 9, 2024
912bd34
Hardcoded webhost
nataliejschultz May 16, 2024
738b629
secret.py
nataliejschultz May 17, 2024
7102508
Restore intake.conf.sample
nataliejschultz May 17, 2024
3d77439
Reverting webserver.conf.sample
nataliejschultz May 17, 2024
a0f2424
Removing check_unset_env_vars
nataliejschultz May 17, 2024
9fb0e5a
Merge branch 'consolidate-differences' of https://github.com/MukuFlas…
nataliejschultz May 17, 2024
18a8872
Removing check_unset_env_vars functionality
nataliejschultz May 17, 2024
53015c7
Update image_build_push.yml
nataliejschultz May 21, 2024
41a410c
Update docker_start_script.sh
nataliejschultz May 21, 2024
7405ff1
Setting DB_HOST=db
nataliejschultz May 21, 2024
cd62247
Update docker_start_script.sh
nataliejschultz May 22, 2024
ebc8188
Rename debug.conf.internal.json to debug.conf.prod.json
nataliejschultz May 22, 2024
41ae79f
Update config.py
nataliejschultz May 22, 2024
290b0fc
Push to rename
nataliejschultz May 22, 2024
29869cd
Update and rename debug.conf.json.sample to debug.conf.dev.json
nataliejschultz May 22, 2024
38209ef
common.py fix?
nataliejschultz May 22, 2024
da485c2
Removing redundant DB_HOST setting
nataliejschultz May 23, 2024
e6b388b
reverting dockerfile + start script changes
nataliejschultz May 24, 2024
bb03c42
Testing workflows with compose
nataliejschultz May 26, 2024
cf50d17
Triggering workflows.
nataliejschultz May 26, 2024
6a21f5d
Reverting image_build_push.yml
nataliejschultz May 26, 2024
b961417
Not showing changes to branches for some reason in image_build_push.yml
nataliejschultz Jun 10, 2024
a245e7d
Adding comment to see if it resolves github display error
nataliejschultz Jun 17, 2024
2067055
🩹 Don't cat the db.conf file
shankari Aug 7, 2024
3000758
🩹 Use the correct filename in the gitignore
shankari Aug 7, 2024
6a8a13f
🩹 Set the default value for the `DB_HOST` as well
shankari Aug 7, 2024
fc183cb
🩹 Unify the supported user inputs across the debug and prod configs
shankari Aug 7, 2024
51e16de
🩹 Remove the cat of db.conf from the integration tests as well
shankari Aug 7, 2024
7ab37b6
🩹 Cleanup environment variables in the basic start script
shankari Aug 7, 2024
727c00c
♻️ Move the config to a different file name that makes more sense
shankari Aug 10, 2024
7f1be92
♻️ Refactor the backwards config file to be reusable
shankari Aug 10, 2024
7177e71
🔊 log the full backtrace if the config file is formatted incorrectly
shankari Aug 10, 2024
168ef10
♻️ Move the api configuration into the backwards compat as well
shankari Aug 10, 2024
3dea305
♻️ Move the api configuration into the backwards compat as well
shankari Aug 10, 2024
a0f0c6a
♻️ Move the push configuration into the backwards compat as well
shankari Aug 12, 2024
10624a6
🔊 Indicate that we are using the default production config
shankari Aug 12, 2024
0fe9476
Merge branch 'consolidate-differences' of https://github.com/MukuFlas…
shankari Aug 12, 2024
11d2a89
♻️ Pull out the code to reset the environment variable overrides to …
shankari Aug 12, 2024
4cc4c58
✅ Fix the expected text while checking for tokens
shankari Aug 12, 2024
357d4b8
🔥 Remove the copied over config file
shankari Aug 12, 2024
b6f59b0
♻️ Access the environment variables from the config using `.get`
shankari Aug 12, 2024
ec38835
✅ Remove environment variables that are likely to be different across…
shankari Aug 12, 2024
1a0d451
✅ Delete all irrelevant config variables
shankari Aug 12, 2024
2fe0816
✅ Copy/paste the actual tests from the failed CI run
shankari Aug 12, 2024
23 changes: 0 additions & 23 deletions .docker/docker_start_script.sh
@@ -1,27 +1,4 @@
#!/usr/bin/env bash
#Configure web server

# cd /usr/src/app/e-mission-server

#set database URL using environment variable
echo ${DB_HOST}
if [ -z ${DB_HOST} ] ; then
local_host=`hostname -i`
jq --arg db_host "$local_host" '.timeseries.url = $db_host' conf/storage/db.conf.sample > conf/storage/db.conf
else
jq --arg db_host "$DB_HOST" '.timeseries.url = $db_host' conf/storage/db.conf.sample > conf/storage/db.conf
fi
cat conf/storage/db.conf

#set Web Server host using environment variable
echo ${WEB_SERVER_HOST}
if [ -z ${WEB_SERVER_HOST} ] ; then
local_host=`hostname -i`
sed "s_localhost_${local_host}_" conf/net/api/webserver.conf.sample > conf/net/api/webserver.conf
else
sed "s_localhost_${WEB_SERVER_HOST}_" conf/net/api/webserver.conf.sample > conf/net/api/webserver.conf
fi
cat conf/net/api/webserver.conf

if [ -z ${LIVERELOAD_SRC} ] ; then
echo "Live reload disabled, "
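The deleted block above templated `db.conf` and `webserver.conf` with jq/sed at container start; the redesign reads the same values from environment variables in Python instead. A minimal sketch of the resolution order, assuming the fallback to the container's own address (the old `hostname -i`) still applies:

```python
import os
import socket

def resolve_db_host():
    # DB_HOST wins if set and non-empty (the old script's `-z` check);
    # otherwise fall back to this container's own address, the
    # equivalent of the removed `hostname -i` branch.
    return os.environ.get("DB_HOST") or socket.gethostbyname(socket.gethostname())
```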
52 changes: 41 additions & 11 deletions .github/workflows/image_build_push.yml
@@ -1,29 +1,22 @@
# This is a basic workflow to help you get started with Actions

name: docker image

# Controls when the action will run. Triggers the workflow on push or pull request
# events but only for the master branch
on:
push:
branches: [ master, gis-based-mode-detection ]


# Env variable
#Dockerhub credentials are set as environment variables
env:
DOCKER_USER: ${{secrets.DOCKER_USER}}
DOCKER_PASSWORD: ${{secrets.DOCKER_PASSWORD}}

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
build:
# The type of runner that the job will run on
runs-on: ubuntu-latest

# Steps represent a sequence of tasks that will be executed as part of the job
outputs:
date: ${{ steps.date.outputs.date }}

steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v2
- name: docker login
run: | # log into docker hub account
@@ -46,3 +39,40 @@ jobs:
- name: push docker image
run: |
docker push $DOCKER_USER/${GITHUB_REPOSITORY#*/}:${GITHUB_REF##*/}_${{ steps.date.outputs.date }}

- name: Create a text file
run: |
echo ${{ steps.date.outputs.date }} > tag_file.txt
echo "Created tag text file"

- name: Upload Artifact
uses: actions/upload-artifact@v4
with:
name: docker-image-tag
path: tag_file.txt
overwrite: true

Comment on lines +48 to +53
Contributor:
why do we need the tag_file.txt to be uploaded here given that we are passing it directly to the workflows on line 77?

Contributor:
See Mukul's comment in the issue:

Why I chose to add artifact method as well?

The issue I was facing was with fetching the latest timestamp for the image tag in case of a push event trigger. This is because in the workflow dispatch, the server workflow itself would trigger the workflows and hence was in a way connected to these workflows. However, push events would only trigger the specific workflow in that specific dashboard repository to build and push the image and hence would not be able to retrieve the image tag directly.

So, I utilized the artifact upload and download method to:

  • upload the image timestamp as an artifact in the workflow run for future use.
  • download the uploaded artifact from the latest previously successful and completed workflow run in e-mission-server repo for a specific branch (currently set to tags-combo-approach but to be changed to master once changes are final).

Contributor:
I don't see the issue here. You have to deal with dispatch and push separately anyway - one of them will have the image tag and the other will not. And given that we have the .env file checked in now, the push can just use that directly. I don't see why we need Yet Another file being uploaded from the workflow. Having said that, I am not going to hold up this PR for this, but it needs to be addressed as a polishing change.

Contributor Author (MukuFlash03), Aug 14, 2024:
I do see now why we do not need the artifact for the push event. Detailed comments made here in public-dash Redesign PR. The comments talk about the corresponding artifact download in the dashboard repos and the explanation is applicable to the upload artifact on the server side as well.

Contributor:

@MukuFlash03 I utilize the artifacts to get the tags to the internal repo, so please don't remove them yet!!

Contributor Author (MukuFlash03):

@nataliejschultz For now, I've tested removing only the Download artifacts step in the public-dashboard workflow.
I don't think it is related to the Upload artifacts step that you added for internal repo.
Neither am I making changes to the server workflow.

Can you please confirm if in this case it is alright to remove the artifact dealing with Push event from the dashboard workflows?

Contributor:

@MukuFlash03 sorry, I misunderstood what you were changing. It is okay with me to remove the download artifacts step!

dispatch:
needs: build
runs-on: ubuntu-latest

env:
DOCKER_IMAGE_TAG: ${{ needs.build.outputs.date }}

strategy:
matrix:
repo: ['e-mission/op-admin-dashboard', 'e-mission/em-public-dashboard']

steps:
- uses: actions/checkout@v4

- name: Trigger workflow in admin-dash, public-dash
# TODO: Create Fine-grained token with "Actions: write" permissions
run: |
curl -L \
-X POST \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer ${{ secrets.GH_FG_PAT_TAGS }}" \
-H "X-GitHub-Api-Version: 2022-11-28" \
https://api.github.com/repos/${{ matrix.repo }}/actions/workflows/image_build_push.yml/dispatches \
-d '{"ref":"master", "inputs": {"docker_image_tag" : "${{ env.DOCKER_IMAGE_TAG }}"}}'
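The curl step above can be mirrored from Python; the helper below only builds the request (the function name and return shape are illustrative, not part of the PR), so no token or network access is needed to see the payload contract:

```python
import json

def build_dispatch_request(repo, image_tag, token):
    # Same endpoint and headers as the workflow's curl call to
    # trigger image_build_push.yml in the dashboard repos.
    url = (f"https://api.github.com/repos/{repo}"
           "/actions/workflows/image_build_push.yml/dispatches")
    headers = {
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {token}",
        "X-GitHub-Api-Version": "2022-11-28",
    }
    body = json.dumps({"ref": "master",
                       "inputs": {"docker_image_tag": image_tag}})
    return url, headers, body
```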
2 changes: 2 additions & 0 deletions .gitignore
@@ -16,6 +16,8 @@ CFC_DataCollector/moves_collect.log
webapp/www/lib
conf/**/*.json
!conf/**/*.schema.json
!conf/analysis/debug.conf.dev.json
!conf/analysis/debug.conf.prod.json

*.ipynb_checkpoints*

4 changes: 2 additions & 2 deletions Dockerfile
@@ -28,8 +28,8 @@ RUN chmod u+x ./.docker/setup_config.sh
RUN bash -c "./.docker/setup_config.sh"

# #declare environment variables
ENV DB_HOST=''
ENV WEB_SERVER_HOST=''
ENV DB_HOST='db'
ENV WEB_SERVER_HOST=0.0.0.0

ENV LIVERELOAD_SRC=''
ENV STUDY_CONFIG=''
@@ -10,5 +10,5 @@
"section.startStopRadius": 150,
"section.endStopRadius": 150,
"analysis.result.section.key": "analysis/inferred_section",
"userinput.keylist": ["manual/mode_confirm", "manual/purpose_confirm", "manual/trip_user_input", "manual/place_user_input"]
"userinput.keylist": ["manual/mode_confirm", "manual/purpose_confirm", "manual/replaced_mode", "manual/trip_user_input", "manual/place_user_input"]
}
14 changes: 14 additions & 0 deletions conf/analysis/debug.conf.prod.json
@@ -0,0 +1,14 @@
{
"intake.segmentation.section_segmentation.sectionValidityAssertions": true,
"intake.cleaning.clean_and_resample.speedDistanceAssertions": false,
"intake.cleaning.clean_and_resample.sectionValidityAssertions": false,
"intake.cleaning.filter_accuracy.enable": false,
"classification.inference.mode.useAdvancedFeatureIndices": true,
"classification.inference.mode.useBusTrainFeatureIndices": true,
"classification.validityAssertions": true,
"output.conversion.validityAssertions": true,
"section.startStopRadius": 150,
"section.endStopRadius": 150,
"analysis.result.section.key": "analysis/inferred_section",
"userinput.keylist": ["manual/mode_confirm", "manual/purpose_confirm", "manual/replaced_mode", "manual/trip_user_input", "manual/place_user_input"]
}
10 changes: 8 additions & 2 deletions emission/analysis/config.py
@@ -1,11 +1,17 @@
import json
import os

def get_config_data():
try:
print("Trying to open debug.conf.json")
config_file = open('conf/analysis/debug.conf.json')
Comment on lines +6 to 7
Contributor:
I am not fully convinced that this is the right approach. If we keep this backwards compat code around forever, we don't have any motivation for people to change to the new structure in the future. Having said that, I will not hold up the merge for this, but I do want to make sure that we have a plan to remove this in a year or so, and to notify users to change their config files, so that we don't end up with a bunch of hacky backwards compat code strewn all around the codebase.

except:
print("analysis.debug.conf.json not configured, falling back to sample, default configuration")
config_file = open('conf/analysis/debug.conf.json.sample')
if os.getenv("PROD_STAGE") == "TRUE":
print("In production environment, config not overridden, using default production debug.conf")
config_file = open('conf/analysis/debug.conf.prod.json')
else:
print("analysis.debug.conf.json not configured, falling back to sample, default configuration")
config_file = open('conf/analysis/debug.conf.dev.json')
ret_val = json.load(config_file)
config_file.close()
return ret_val
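The new selection order in `get_config_data` is: an explicitly mounted `debug.conf.json`, then the prod config when `PROD_STAGE=TRUE`, else the dev default. A sketch that isolates just that decision (the `conf_exists` flag stands in for the first `open()` succeeding; the helper is illustrative):

```python
import os

def pick_debug_config(conf_exists):
    # An explicitly configured debug.conf.json always wins
    if conf_exists:
        return "conf/analysis/debug.conf.json"
    # In production, fall back to the checked-in prod defaults
    if os.getenv("PROD_STAGE") == "TRUE":
        return "conf/analysis/debug.conf.prod.json"
    # Otherwise use the dev defaults (the renamed .sample file)
    return "conf/analysis/debug.conf.dev.json"
```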
42 changes: 42 additions & 0 deletions emission/core/backwards_compat_config.py
@@ -0,0 +1,42 @@
import json
import logging
import os
import numpy as np
import pandas as pd

# if there is a config file and the environment variable is set, we need to
# decide which one wins. I would argue for the environment variable, to allow
# for a migration to the new model and for us to remove the obsolete code.
# Although arguably, the converse will also work, since we can set the
# variable while the file is present, and then remove the file in a second
# round of changes. Let's keep the order unchanged for now for simplicity, and
# modify as needed later.

def get_config(config_file_name, var_path_mapping):
# Since a `config_data` field would be at the module level, and we want
# the module to be reusable, we are not going to cache the result. It is
# not clear that we need to cache the result anyway, given that we
# typically initialize the config variables at the beginning of the
# modules in which they are used. If we feel like this is an issue, we can
# switch to creating a class instead.
ret_val = {}
try:
config_file = open(config_file_name)
# we only have a single entry in the config json, not an array
# and there is no way for json_normalize to return a series
# so we will just take the first row of the dataframe
loaded_val = pd.json_normalize(json.load(config_file)).iloc[0]
for var, path in var_path_mapping.items():
ret_val[var] = loaded_val[path]
# Ensure that the returned values are regular ints
# https://github.com/e-mission/e-mission-server/pull/961#issuecomment-2282206511
if type(ret_val[var]) is np.int64:
ret_val[var] = int(ret_val[var])
config_file.close()
except Exception as e:
if isinstance(e, KeyError) or isinstance(e, json.decoder.JSONDecodeError):
logging.exception(e)
print("Config file not found, returning a copy of the environment variables instead...")
# https://github.com/e-mission/e-mission-server/pull/961#issuecomment-2282209006
ret_val = dict(os.environ)
return ret_val
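The `var_path_mapping` contract (environment-variable name mapped to a dotted path in the legacy JSON) can be exercised without pandas; this simplified stand-in walks the paths by hand and, like the real helper, returns a copy of the environment when the file is missing or invalid:

```python
import json
import os

def get_config_sketch(config_file_name, var_path_mapping):
    # Simplified stand-in for backwards_compat_config.get_config:
    # resolve each dotted path through the legacy JSON manually
    # instead of using pd.json_normalize.
    try:
        with open(config_file_name) as fp:
            loaded = json.load(fp)
        ret_val = {}
        for var, path in var_path_mapping.items():
            node = loaded
            for key in path.split("."):
                node = node[key]
            ret_val[var] = node
        return ret_val
    except Exception:
        # Config file missing/invalid: fall back to the environment
        return dict(os.environ)
```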
16 changes: 7 additions & 9 deletions emission/core/get_database.py
@@ -10,16 +10,14 @@
import os
import json

try:
config_file = open('conf/storage/db.conf')
except:
print("storage not configured, falling back to sample, default configuration")
config_file = open('conf/storage/db.conf.sample')
import emission.core.backwards_compat_config as ecbc

config = ecbc.get_config('conf/storage/db.conf',
{"DB_HOST": "timeseries.url", "DB_RESULT_LIMIT": "timeseries.result_limit"})

config_data = json.load(config_file)
url = config_data["timeseries"]["url"]
result_limit = config_data["timeseries"]["result_limit"]
config_file.close()
print("Retrieved config %s" % config)
url = config.get("DB_HOST", "localhost")
result_limit = config.get("DB_RESULT_LIMIT", 250000)

try:
parsed=pymongo.uri_parser.parse_uri(url)
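The new read pattern tolerates missing keys by supplying defaults through `.get`, whether the dict came from the legacy conf file or from the environment; a tiny sketch of that contract (the helper name is illustrative):

```python
def read_db_settings(config):
    # Optional keys with defaults via .get, instead of hard
    # KeyErrors on a parsed conf file as in the old code.
    url = config.get("DB_HOST", "localhost")
    result_limit = config.get("DB_RESULT_LIMIT", 250000)
    return url, result_limit
```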
10 changes: 1 addition & 9 deletions emission/integrationTests/start_integration_tests.sh
@@ -2,15 +2,7 @@
# Using an automated install
cd /src/e-mission-server

#set database URL using environment variable
echo ${DB_HOST}
if [ -z ${DB_HOST} ] ; then
local_host=`hostname -i`
sed "s_localhost_${local_host}_" conf/storage/db.conf.sample > conf/storage/db.conf
else
sed "s_localhost_${DB_HOST}_" conf/storage/db.conf.sample > conf/storage/db.conf
fi
cat conf/storage/db.conf

echo "Setting up conda..."
source setup/setup_conda.sh Linux-x86_64
@@ -25,4 +17,4 @@ echo "Adding permissions for the runIntegrationTests.sh script"
chmod +x runIntegrationTests.sh
echo "Permissions added for the runIntegrationTests.sh script"

./runIntegrationTests.sh
./runIntegrationTests.sh
24 changes: 17 additions & 7 deletions emission/integrationTests/storageTests/TestMongodbAuth.py
@@ -47,10 +47,15 @@ def setUp(self):
self.uuid = uuid.uuid4()
self.testUserId = self.uuid
self.db_conf_file = "conf/storage/db.conf"
self.originalDBEnvVars = {}
self.createAdmin()

def tearDown(self):
self.admin_auth.command({"dropAllUsersFromDatabase": 1})
logging.debug("Deleting test db environment variables")
ecc.restoreOriginalEnvVars(self.originalDBEnvVars, self.modifiedEnvVars)
logging.debug("Finished restoring original db environment variables")
logging.debug("Restored original values are = %s" % self.originalDBEnvVars)
try:
os.remove(self.db_conf_file)
except FileNotFoundError as e:
@@ -67,14 +72,19 @@ def createAdmin(self):
self.admin_auth = pymongo.MongoClient(self.getURL(self.test_username, self.test_password)).admin

def configureDB(self, url):
config = {
"timeseries": {
"url": url,
"result_limit": 250000
}
self.testModifiedEnvVars = {
'DB_HOST' : url
}
with open(self.db_conf_file, "w") as fp:
json.dump(config, fp, indent=4)

self.orginalDBEnvVars = dict(os.environ)

for env_var_name, env_var_value in self.testModifiedEnvVars.items():
# Setting db environment variables with test values
os.environ[env_var_name] = env_var_value

logging.debug("Finished setting up test db environment variables")
logging.debug("Current original values are = %s" % self.originalDBEnvVars)
logging.debug("Current modified values are = %s" % self.testModifiedEnvVars)

def getURL(self, username, password, dbname="admin"):
return "mongodb://%s:%s@localhost/%s?authSource=admin&authMechanism=SCRAM-SHA-1" % (username, password, dbname)
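The save/override/restore choreography around `os.environ` that this test performs by hand can also be expressed as a context manager; this is an illustrative equivalent, not the `ecc` helper the PR actually calls:

```python
import os
from contextlib import contextmanager

@contextmanager
def temp_env(overrides):
    # Remember the original value (or absence) of each key we touch
    original = {k: os.environ.get(k) for k in overrides}
    try:
        os.environ.update(overrides)
        yield
    finally:
        # Restore: delete keys that did not exist, reset the rest
        for k, v in original.items():
            if v is None:
                os.environ.pop(k, None)
            else:
                os.environ[k] = v
```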
37 changes: 15 additions & 22 deletions emission/net/api/cfc_webapp.py
@@ -51,27 +51,22 @@
import emission.storage.timeseries.cache_series as esdc
import emission.core.timer as ect
import emission.core.get_database as edb
import emission.core.backwards_compat_config as ecbc

try:
config_file = open('conf/net/api/webserver.conf')
except:
logging.debug("webserver not configured, falling back to sample, default configuration")
config_file = open('conf/net/api/webserver.conf.sample')

OPENPATH_URL="https://www.nrel.gov/transportation/openpath.html"
STUDY_CONFIG = os.getenv('STUDY_CONFIG', "stage-program")

config_data = json.load(config_file)
config_file.close()
static_path = config_data["paths"]["static_path"]
python_path = config_data["paths"]["python_path"]
server_host = config_data["server"]["host"]
server_port = config_data["server"]["port"]
socket_timeout = config_data["server"]["timeout"]
log_base_dir = config_data["paths"]["log_base_dir"]
auth_method = config_data["server"]["auth"]
aggregate_call_auth = config_data["server"]["aggregate_call_auth"]
not_found_redirect = config_data["paths"].get("404_redirect", OPENPATH_URL)
# Constants that we don't read from the configuration
WEBSERVER_STATIC_PATH="webapp/www"
WEBSERVER_HOST="0.0.0.0"

config = ecbc.get_config('conf/net/api/webserver.conf',
{"WEBSERVER_PORT": "server.port", "WEBSERVER_TIMEOUT": "server.timeout",
"WEBSERVER_AUTH": "server.auth", "WEBSERVER_AGGREGATE_CALL_AUTH": "server.aggregate_call_auth"})
server_port = config.get("WEBSERVER_PORT", 8080)
socket_timeout = config.get("WEBSERVER_TIMEOUT", 3600)
auth_method = config.get("WEBSERVER_AUTH", "skip")
aggregate_call_auth = config.get("WEBSERVER_AGGREGATE_CALL_AUTH", "no_auth")
not_found_redirect = config.get("WEBSERVER_NOT_FOUND_REDIRECT", "https://nrel.gov/openpath")

BaseRequest.MEMFILE_MAX = 1024 * 1024 * 1024 # Allow the request size to be 1G
# to accomodate large section sizes
@@ -89,7 +84,7 @@
#Simple path that serves up a static landing page with javascript in it
@route('/')
def index():
return static_file("index.html", static_path)
return static_file("index.html", WEBSERVER_STATIC_PATH)

# Backward compat to handle older clients
# Remove in 2023 after everybody has upgraded
@@ -558,6 +553,4 @@ def resolve_auth(auth_method):
else:
# Non SSL option for testing on localhost
print("Running with HTTPS turned OFF - use a reverse proxy on production")
run(host=server_host, port=server_port, server='cheroot', debug=True)

# run(host="0.0.0.0", port=server_port, server='cherrypy', debug=True)
run(host=WEBSERVER_HOST, port=server_port, server='cheroot', debug=True)
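The same defaulted-lookup pattern applies to the webserver settings; collecting them in one place makes the defaults visible (the helper is a sketch — the dict shape is whatever `get_config` returned, whether file-derived or environment-derived):

```python
def resolve_webserver_settings(config):
    # Defaults mirror the new block in cfc_webapp.py
    return {
        "port": config.get("WEBSERVER_PORT", 8080),
        "timeout": config.get("WEBSERVER_TIMEOUT", 3600),
        "auth": config.get("WEBSERVER_AUTH", "skip"),
        "aggregate_call_auth": config.get("WEBSERVER_AGGREGATE_CALL_AUTH", "no_auth"),
    }
```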
6 changes: 5 additions & 1 deletion emission/net/auth/secret.py
@@ -4,7 +4,11 @@

class SecretMethod(object):
def __init__(self):
key_file = open('conf/net/auth/secret_list.json')
try:
key_file = open('conf/net/auth/secret_list.json')
except:
print("secret_list.json not configured, falling back to sample, default configuration")
key_file = open('conf/net/auth/secret_list.json.sample')
key_data = json.load(key_file)
key_file.close()
self.client_secret_list = key_data["client_secret_list"]
12 changes: 8 additions & 4 deletions emission/net/ext_service/push/notify_interface.py
@@ -11,22 +11,26 @@
import logging
import importlib

import emission.core.backwards_compat_config as ecbc

# Note that the URL is hardcoded because the API endpoints are not standardized.
# If we change a push provider, we will need to modify to match their endpoints.
# Hardcoding will remind us of this :)
# We can revisit this if push providers eventually decide to standardize...

push_config = ecbc.get_config('conf/net/ext_service/push.json',
{"PUSH_PROVIDER": "provider", "PUSH_SERVER_AUTH_TOKEN": "server_auth_token",
"PUSH_APP_PACKAGE_NAME": "app_package_name", "PUSH_IOS_TOKEN_FORMAT": "ios_token_format"})

try:
push_config_file = open('conf/net/ext_service/push.json')
push_config = json.load(push_config_file)
push_config_file.close()
logging.info(f"Push configured for app {push_config.get('PUSH_SERVER_AUTH_TOKEN')} using platform {os.getenv('PUSH_PROVIDER')} with token {os.getenv('PUSH_SERVER_AUTH_TOKEN')[:10]}... of length {len(os.getenv('PUSH_SERVER_AUTH_TOKEN'))}")
except:
logging.warning("push service not configured, push notifications not supported")

class NotifyInterfaceFactory(object):
@staticmethod
def getDefaultNotifyInterface():
return NotifyInterfaceFactory.getNotifyInterface(push_config["provider"])
return NotifyInterfaceFactory.getNotifyInterface(push_config.get("PUSH_PROVIDER"))

@staticmethod
def getNotifyInterface(pushProvider):
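The new logging line prints only a token prefix and the token's length rather than the full secret; a small helper capturing that masking convention (the helper name is illustrative, not from the PR):

```python
def mask_token(token, prefix_len=10):
    # Log-safe rendering of a server auth token: first characters
    # plus total length only, never the full value.
    return f"{token[:prefix_len]}... (length {len(token)})"
```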