
Commit

Merge pull request #38 from laijasmine/main
Patches to the awards bot functions
Jasmine authored Apr 20, 2021
2 parents 9c26645 + c81e459 commit 8c06358
Showing 9 changed files with 117 additions and 23 deletions.
5 changes: 3 additions & 2 deletions DESCRIPTION
@@ -5,9 +5,10 @@ Version: 1.0
Date: 2018-06-18
Authors@R: c(
person("Dominic", "Mullen", email = "[email protected]", role = c("cre", "aut")),
person("Mitchell", "Maier", email = "[email protected]", role = c("ctb"))
person("Mitchell", "Maier", email = "[email protected]", role = c("ctb")),
person("Jasmine", "Lai", email = "[email protected]", role = c("ctb"))
)
Maintainer: Dominic Mullen <[email protected]>
Maintainer: Jasmine Lai <[email protected]>
Description: Creates an NSF awards database for the Arctic Data Center. Sends periodic correspondence to awardees.
License: Apache License (== 2.0)
Depends:
2 changes: 2 additions & 0 deletions NAMESPACE
@@ -2,6 +2,8 @@

export(get_awards)
export(import_awards_db)
export(main)
export(test_main)
export(update_awards)
import(XML)
import(stringr)
31 changes: 30 additions & 1 deletion R/main.R
@@ -1,4 +1,17 @@
## TODO add a slack message that details every email sent per day?
#' Main function
#'
#' @param database_path (character) Path to the database of awards and correspondences
#' @param lastrun_path (character) Determines where the bot stores its state
#' @param current_date (date) Today's date
#' @param annual_report_time (numeric) Time in months after 'start_date' to send the first annual report reminder
#' @param initial_aon_offset (numeric) Number of months after the award startDate to send the first AON data due reminder
#' @param aon_recurring_interval (numeric) Number of months between recurring AON data due reminder emails
#'
#' @return
#' @export
#'
#' @examples
main <- function(database_path = Sys.getenv('DATABASE_PATH'),
lastrun_path = Sys.getenv('LASTRUN_PATH'),
current_date = as.character(Sys.Date()),
@@ -29,7 +42,23 @@ main <- function(database_path = Sys.getenv('DATABASE_PATH'),
return(invisible())
}
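For readers unfamiliar with the bot, here is a minimal sketch of how the documented main() entry point might be invoked; the file paths, environment-variable values, and month offsets below are placeholders, not values taken from this repository:

    # Sketch only: all values here are hypothetical examples, not package defaults.
    Sys.setenv(DATABASE_PATH = "/srv/awards-bot/awards_db.csv",  # assumed location
               LASTRUN_PATH  = "/srv/awards-bot/lastrun")        # assumed location

    main(
      database_path = Sys.getenv("DATABASE_PATH"),
      lastrun_path = Sys.getenv("LASTRUN_PATH"),
      current_date = as.character(Sys.Date()),
      annual_report_time = 12,     # e.g. first annual-report reminder 12 months after start_date
      initial_aon_offset = 6,      # e.g. first AON data-due reminder 6 months after the award startDate
      aon_recurring_interval = 6   # e.g. recurring AON reminders every 6 months
    )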

# Wrapper for main, with additional email testing argument.
#
#' Test main function
#'
#' Wrapper for main, with an additional email testing argument.
#' Uses a dummy database that sends out 2 test tickets to the specified email address.
#'
#' @param database_path (character) Path to the database of awards and correspondences
#' @param lastrun_path (character) Determines where the bot stores its state
#' @param current_date (date) Today's date
#' @param annual_report_time (numeric) Time in months after 'start_date' to send the first annual report reminder
#' @param initial_aon_offset (numeric) Number of months after the award startDate to send the first AON data due reminder
#' @param aon_recurring_interval (numeric) Number of months between recurring AON data due reminder emails
#' @param email (character) The email address to send the test tickets to
#'
#' @return
#' @export
#'
#' @examples
test_main <- function(database_path = Sys.getenv('DATABASE_PATH'),
lastrun_path = Sys.getenv('LASTRUN_PATH'),
current_date = as.character(Sys.Date()),
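A corresponding sketch for test_main(), which per the documentation above routes two test tickets from a dummy database to the address given in email; the address and numeric values are placeholders:

    # Sketch only: the email address and numeric values are hypothetical examples.
    test_main(
      current_date = as.character(Sys.Date()),
      annual_report_time = 12,
      initial_aon_offset = 6,
      aon_recurring_interval = 6,
      email = "support-test@example.org"   # where the two test tickets should be sent
    )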
3 changes: 0 additions & 3 deletions example_db.csv

This file was deleted.

2 changes: 1 addition & 1 deletion inst/emails/contact_initial_ans
@@ -16,6 +16,6 @@ We also wanted to highlight the data portals feature at the Arctic Data Center.

If you have any questions about the Arctic Data Center, NSF requirements, or need assistance submitting data and/or metadata please reply to this email and we will respond as soon as possible.

Good luck on your research!
Best wishes with your research!

The Arctic Data Center Support Team
2 changes: 1 addition & 1 deletion inst/emails/contact_initial_aon
@@ -21,6 +21,6 @@ We also wanted to highlight the data portals feature at the Arctic Data Center.

If you have any questions about the Arctic Data Center, NSF requirements, or need assistance submitting data and/or metadata please reply to this email and we will respond as soon as possible.

Good luck on your research!
Best wishes with your research!

The Arctic Data Center Support Team
20 changes: 5 additions & 15 deletions inst/emails/contact_initial_social_sciences
@@ -1,23 +1,13 @@
Dear %s,

Congratulations on your new NSF Arctic Sciences award %s %s.
Congratulations on your new NSF Arctic Sciences award %s %s. The NSF-funded Arctic Data Center support team is passionate about preserving Arctic research. We are here to support researchers like yourself in storing and sharing valuable data.

The NSF-funded Arctic Data Center support team is passionate about Arctic science. We are here to support researchers like yourself in storing and sharing valuable data.
We are contacting you today because projects funded by the NSF ARC Arctic Social Sciences Program (ASSP) have specific requirements for data and metadata storage. At this time, NSF requires all data and metadata created during your project be submitted to a national data center or another long-lived, publicly accessible archive within two years of collection or before the end of your award, whichever comes first. The Arctic Data Center is a long-lived, publicly accessible archive that was created to help researchers meet the NSF data and metadata storage requirements. Please note that NSF policies include special exceptions for awards that contain sensitive data, including human subjects data and data that are governed by an Institutional Review Board policy. In those cases, data and metadata should be published with only the non-sensitive aspects of the study. More detailed information about what will be expected can be found at: https://arcticdata.io/submit/#who-must-submit.

We are contacting you today because projects funded by the NSF ARC Arctic Social Sciences Program (ASSP) have specific requirements for data and metadata storage. The Arctic Data Center is a long-lived, publicly accessible archive that was created to help researchers meet the NSF data and metadata storage requirements.
With these requirements in mind, we strongly recommend that you plan to keep detailed and well-documented data and metadata records throughout your project. On our website, we have some guidelines on organizing your data: https://arcticdata.io/submit/#organizing-your-data. For those who are interested, we also offer regular training opportunities for hands-on help, as well as the self-guided curriculum, on this page: https://arcticdata.io/training/.

At this time, please note that NSF requires all metadata created during your project be submitted to a national data center or another long-lived, publicly accessible archive within two years of collection or before the end of your award, whichever comes first.
If you have any questions about the Arctic Data Center, NSF requirements, or need assistance submitting data and/or metadata, please reply to this email and we will respond as soon as possible.

Please note that NSF policies include special exceptions for awards that contain sensitive data, including human subjects data and data that are governed by an Institutional Review Board policy. In those cases, only metadata will be created with only the non-sensitive aspects of the study. More detailed information about what will be expected can be found at: https://arcticdata.io/submit/#who-must-submit.

With these requirements in mind, we strongly recommend that you plan to keep detailed and well-documented data and metadata records throughout your project. On our website we have some guidelines on organizing your data: https://arcticdata.io/submit/#organizing-your-data. For those who are interested, we also post regular training opportunities for hands on help on this page: https://arcticdata.io/training/.

If your research will involve large data submissions (a total size of > 100 GB or more than 500 files), please get in touch with us so we can create a plan together for those datasets.

We also wanted to highlight the data portals feature at the Arctic Data Center. Portals are dedicated sites that allow individuals and teams to create a collection of related datasets. They facilitate discovery, have collection-specific metrics, can include custom search features, and can be branded to reflect your organization or collaboration. For more information, examples, and tutorials on how to get started, visit our data portals page at: https://arcticdata.io/data-portals/.

If you have any questions about the Arctic Data Center, NSF requirements, or need assistance submitting data and/or metadata please reply to this email and we will respond as soon as possible.

Good luck on your research!
Best wishes with your research!

The Arctic Data Center Support Team
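The %s tokens in this and the other email templates look like sprintf-style placeholders; a minimal sketch of how one line might be filled in (the award number and title below are invented for illustration, and sprintf is an assumption, not code shown in this commit):

    # Assumption: the templates are populated with sprintf(); the values are made up.
    opening <- "Congratulations on your new NSF Arctic Sciences award %s %s."
    sprintf(opening, "1234567", "(Example Arctic Project Title)")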
34 changes: 34 additions & 0 deletions man/main.Rd

Some generated files are not rendered by default.

41 changes: 41 additions & 0 deletions man/test_main.Rd

Some generated files are not rendered by default.
