Commit

1.4.87
tmaeno committed Sep 6, 2021
1 parent c594c4e commit 9634c74
Showing 3 changed files with 12 additions and 3 deletions.
5 changes: 4 additions & 1 deletion ChangeLog.txt
Original file line number Diff line number Diff line change
@@ -1,8 +1,11 @@
** Release Notes

1.4.87
* improved doc of pbook.retry

1.4.86
* pflow -> pchain
* server-side check for pchain
* moved the check function of pchain to server-side

1.4.85
* making pandatools symlink for backward-compatibility
8 changes: 7 additions & 1 deletion pandaclient/PBookScript.py
@@ -210,7 +210,13 @@ def finish(taskIDs, soft=False):
# retry
def retry(taskIDs, newOpts=None):
"""
Retry failed/cancelled subJobs in taskIDs (ID or a list of ID, can be either jediTaskID or reqID). This means that you need to have the same runtime env (such as Athena version, run dir, source files) as the previous submission. One can use newOpts which is a map of options and new arguments like {'nFilesPerJob':10,'excludedSite':'ABC,XYZ'} to overwrite task parameters. The list of changeable parameters is site,excludedSite,includedSite,nFilesPerJob,nGBPerJob,nFiles,nEvents. If input files were used or are being used by other jobs for the same output dataset container, those file are skipped to avoid job duplication when retrying failed subjobs.
Retry failed/cancelled subJobs in taskIDs (an ID or a list of IDs, which can be either jediTaskID or reqID).
This means that you need to have the same runtime env (such as Athena version, run dir, source files)
as the previous submission. One can use newOpts, which is a map of options and new arguments like
{'nFilesPerJob':10,'excludedSite':'ABC,XYZ'}, to overwrite task parameters. The list of changeable
parameters is site, excludedSite, includedSite, nFilesPerJob, nGBPerJob, nFiles, nEvents, loopingCheck,
nMaxFilesPerJob, ramCount. If input files were used or are being used by other jobs for the same output
dataset container, those files are skipped to avoid job duplication when retrying failed subjobs.
example:
>>> retry(123)
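As a sketch of how the newOpts map interacts with the changeable-parameter list above, the snippet below builds a candidate newOpts dict and checks its keys against that list before passing it to pbook's retry(). The check_new_opts helper and CHANGEABLE_PARAMS constant are hypothetical illustrations, not part of pandaclient; only the parameter names themselves come from the docstring.

```python
# Hypothetical helper (not part of pandaclient): pre-check a newOpts map
# against the task parameters that pbook's retry() documents as changeable.
CHANGEABLE_PARAMS = {
    "site", "excludedSite", "includedSite", "nFilesPerJob", "nGBPerJob",
    "nFiles", "nEvents", "loopingCheck", "nMaxFilesPerJob", "ramCount",
}

def check_new_opts(new_opts):
    """Return the set of keys that are not changeable via retry(newOpts=...)."""
    return set(new_opts) - CHANGEABLE_PARAMS

# Example newOpts taken from the docstring, plus one unsupported key
opts = {"nFilesPerJob": 10, "excludedSite": "ABC,XYZ", "priority": 500}
bad = check_new_opts(opts)
if bad:
    print("unsupported keys:", sorted(bad))  # only valid keys should be passed on
```

In a real pbook session one would then call retry(123, newOpts={k: v for k, v in opts.items() if k not in bad}); the filtering step is purely illustrative, since the server performs its own validation.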
2 changes: 1 addition & 1 deletion pandaclient/PandaToolsPkgInfo.py
@@ -1 +1 @@
release_version = "1.4.86"
release_version = "1.4.87"
