3.6.8.29 #102

Merged: 28 commits, Sep 27, 2023

Commits:
f9d274d
Merge pull request #3 from PalNilsson/next
PalNilsson Sep 23, 2021
23e0281
Merge pull request #7 from PalNilsson/next
PalNilsson Sep 20, 2022
3dd434f
New version
Sep 18, 2023
05c71a0
Merge pull request #9 from PalNilsson/next
PalNilsson Sep 22, 2023
947bdb0
Using psutil to get prmon pid. Added timeout to subprocess.wait(). Ad…
Sep 22, 2023
49a7118
Added extra parameter to send info to job metrics functions if necess…
PalNilsson Sep 22, 2023
2e6bbce
Added readbyterate to job metrics
Sep 24, 2023
2de107b
Added doc
Sep 24, 2023
ffd5026
Flake8
PalNilsson Sep 25, 2023
5a9124b
Added execute2() support for streaming command output to files
PalNilsson Sep 25, 2023
f712a5f
Added proper time-out handling and reading of curl output
Sep 25, 2023
6b05a0b
Added protection against expired job objects in job_monitor
Sep 25, 2023
8ab8296
Added workdir
Sep 25, 2023
dcfecbc
Increased --connect-timeout from 20s to 100s to be in line with panda…
PalNilsson Sep 26, 2023
a5e28b3
Updated python version for flake8 to 3.9.18
PalNilsson Sep 26, 2023
1ea6801
Create flake8-workflow.yml
PalNilsson Sep 26, 2023
e758dc8
Added flake8 workflow
PalNilsson Sep 26, 2023
4797e6b
Updated flake8 version
PalNilsson Sep 26, 2023
0957a78
Merge branch 'master' into next
PalNilsson Sep 26, 2023
432b181
Flake8 corrections
PalNilsson Sep 26, 2023
f010055
Merge remote-tracking branch 'origin/next' into next
PalNilsson Sep 26, 2023
f4d2bbe
Cleaned up unit tests workflow (remove flake8)
PalNilsson Sep 26, 2023
84aa8f5
Cleanup
PalNilsson Sep 26, 2023
3be5a37
Now reporting failed rucio trace curls with job metrics
PalNilsson Sep 26, 2023
487c65c
Interception of SIGSTOP (no action)
Sep 26, 2023
012acea
Removed useless interception of SIGSTOP
Sep 26, 2023
1287a69
Fallback to ps
PalNilsson Sep 27, 2023
492a5ba
Refactoring. Storing trace curl error in file, discovered by job metrics
PalNilsson Sep 27, 2023
44 changes: 44 additions & 0 deletions .github/workflows/flake8-workflow.yml
@@ -0,0 +1,44 @@
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: flake8-workflow

on:
push:
branches: [ "master", "next" ]
pull_request:
branches: [ "master", "next" ]

jobs:
build:

runs-on: ubuntu-latest
continue-on-error: true
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11"]
env:
FLAKE8_VERSION: "==6.1.0"
FLAKE8_CONFIG: ".flake8"
steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v3
with:
python-version: ${{ matrix.python-version }}
architecture: x64
- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install "flake8${{ env.FLAKE8_VERSION }}" 'pep8-naming' 'flake8-blind-except'
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Setup env
run: |
pwd
ls -lah
pip freeze
flake8 --version
if [[ ${{ env.FLAKE8_CONFIG }} != ".flake8" ]]; then rm .flake8; fi
- name: Flake8
run: flake8 --config ${{ env.FLAKE8_CONFIG}} pilot.py pilot/
25 changes: 0 additions & 25 deletions .github/workflows/unit-tests.yml
@@ -12,40 +12,15 @@ jobs:
strategy:
matrix:
python-version: ['3.8', '3.9', '3.10', '3.11']
env:
FLAKE8_VERSION: "==3.8.4"
FLAKE8_CONFIG: ".flake8"
steps:
- name: Checkout Pilot3 repo
uses: actions/checkout@v3

# - name: Hack me some python
# run: |
# Hack to get setup-python to work on act
#if [ ! -f "/etc/lsb-release" ] ; then
# echo "DISTRIB_RELEASE=18.04" > /etc/lsb-release
# fi

- name: Setup Python3
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
architecture: x64

- name: Pip install
run: pip install "flake8${{ env.FLAKE8_VERSION }}" 'pep8-naming' 'flake8-blind-except'

- name: Setup env
run: |
pwd
ls -lah
pip freeze
flake8 --version
if [[ ${{ env.FLAKE8_CONFIG }} != ".flake8" ]]; then rm .flake8; fi

- name: Run flake8
run: flake8 --config ${{ env.FLAKE8_CONFIG}} pilot.py pilot/

- name: Run unit tests
run: python -m unittest

2 changes: 1 addition & 1 deletion PILOTVERSION
@@ -1 +1 @@
-3.6.7.10
+3.6.8.29
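The PILOTVERSION bump (3.6.7.10 to 3.6.8.29) uses a four-part dotted version string. Comparing such strings lexically is wrong once any component reaches two digits; a minimal sketch of a numeric comparison helper (illustrative only, not part of the pilot):

```python
def version_tuple(version):
    """Split a dotted version string like '3.6.8.29' into a tuple of ints."""
    return tuple(int(part) for part in version.split('.'))

# numeric tuples order correctly where raw strings do not
assert version_tuple('3.6.8.29') > version_tuple('3.6.7.10')
assert '3.6.10.1' < '3.6.9.1'  # string comparison gets this wrong
assert version_tuple('3.6.10.1') > version_tuple('3.6.9.1')
```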
6 changes: 4 additions & 2 deletions pilot/api/data.py
@@ -50,6 +50,7 @@ class StagingClient(object):
"""

ipv = "IPv6"
workdir = ''
mode = "" # stage-in/out, set by the inheritor of the class
copytool_modules = {'rucio': {'module_name': 'rucio'},
'gfal': {'module_name': 'gfal'},
@@ -69,7 +70,7 @@ class StagingClient(object):
# list of allowed schemas to be used for transfers from REMOTE sites
remoteinput_allowed_schemas = ['root', 'gsiftp', 'dcap', 'srm', 'storm', 'https']

def __init__(self, infosys_instance=None, acopytools=None, logger=None, default_copytools='rucio', trace_report=None, ipv='IPv6'):
def __init__(self, infosys_instance=None, acopytools=None, logger=None, default_copytools='rucio', trace_report=None, ipv='IPv6', workdir=''):
"""
If `acopytools` is not specified then it will be automatically resolved via infosys. In this case `infosys` requires initialization.
:param acopytools: dict of copytool names per activity to be used for transfers. Accepts also list of names or string value without activity passed.
@@ -87,6 +88,7 @@ def __init__(self, infosys_instance=None, acopytools=None, logger=None, default_
self.logger = logger
self.infosys = infosys_instance or infosys
self.ipv = ipv
self.workdir = workdir

if isinstance(acopytools, str):
acopytools = {'default': [acopytools]} if acopytools else {}
@@ -103,7 +105,7 @@ def __init__(self, infosys_instance=None, acopytools=None, logger=None, default_
self.acopytools['default'] = self.get_default_copytools(default_copytools)

# get an initialized trace report (has to be updated for get/put if not defined before)
self.trace_report = trace_report if trace_report else TraceReport(pq=os.environ.get('PILOT_SITENAME', ''), ipv=self.ipv)
self.trace_report = trace_report if trace_report else TraceReport(pq=os.environ.get('PILOT_SITENAME', ''), ipv=self.ipv, workdir=self.workdir)

if not self.acopytools:
msg = 'failed to initilize StagingClient: no acopytools options found, acopytools=%s' % self.acopytools
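The change above threads a new `workdir` argument from `StagingClient` into the default `TraceReport`. A stripped-down sketch of that plumbing with simplified stand-in classes (the real ones live in pilot/api/data.py and the trace-report module, and carry many more fields):

```python
import os

class TraceReport:
    """Hypothetical, simplified stand-in for the pilot's TraceReport."""
    def __init__(self, pq='', ipv='IPv6', workdir=''):
        self.pq = pq
        self.ipv = ipv
        self.workdir = workdir

class StagingClient:
    """Minimal sketch of the new workdir plumbing."""
    def __init__(self, trace_report=None, ipv='IPv6', workdir=''):
        self.ipv = ipv
        self.workdir = workdir
        # if no report was supplied, create one that already knows the workdir
        self.trace_report = trace_report or TraceReport(
            pq=os.environ.get('PILOT_SITENAME', ''), ipv=self.ipv, workdir=self.workdir)

client = StagingClient(workdir='/tmp/job123')
```

The point of the change is that a trace report created as a fallback inside the client now inherits the job's working directory instead of an empty default, which later commits use to store curl trace errors in a file under that directory.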
6 changes: 3 additions & 3 deletions pilot/control/data.py
@@ -185,7 +185,7 @@ def create_trace_report(job, label='stage-in'):

event_type, localsite, remotesite = get_trace_report_variables(job, label=label)
trace_report = TraceReport(pq=os.environ.get('PILOT_SITENAME', ''), localSite=localsite, remoteSite=remotesite,
dataset="", eventType=event_type)
dataset="", eventType=event_type, workdir=job.workdir)
trace_report.init(job)

return trace_report
@@ -239,7 +239,7 @@ def _stage_in(args, job):
client = StageInESClient(job.infosys, logger=logger, trace_report=trace_report)
activity = 'es_events_read'
else:
client = StageInClient(job.infosys, logger=logger, trace_report=trace_report, ipv=args.internet_protocol_version)
client = StageInClient(job.infosys, logger=logger, trace_report=trace_report, ipv=args.internet_protocol_version, workdir=job.workdir)
activity = 'pr'
use_pcache = job.infosys.queuedata.use_pcache
# get the proper input file destination (normally job.workdir unless stager workflow)
@@ -950,7 +950,7 @@ def _do_stageout(job, xdata, activity, queue, title, output_dir='', rucio_host='
# create the trace report
trace_report = create_trace_report(job, label=label)

client = StageOutClient(job.infosys, logger=logger, trace_report=trace_report, ipv=ipv)
client = StageOutClient(job.infosys, logger=logger, trace_report=trace_report, ipv=ipv, workdir=job.workdir)
kwargs = dict(workdir=job.workdir, cwd=job.workdir, usecontainer=False, job=job, output_dir=output_dir,
catchall=job.infosys.queuedata.catchall, rucio_host=rucio_host) #, mode='stage-out')
# prod analy unification: use destination preferences from PanDA server for unified queues