Merge branch 'master' into feature/multiprovider
kjacque committed Apr 16, 2024
2 parents 65e76b1 + c555ef0 commit 6fe90ac
Showing 420 changed files with 67,867 additions and 2,585 deletions.
3 changes: 2 additions & 1 deletion .github/actions/provision-cluster/action.yml
@@ -40,7 +40,8 @@ runs:
run: |
. ci/gha_functions.sh
inst_repos="${{ env.CP_PR_REPOS }} ${{ github.event.inputs.pr-repos }}"
if [[ $inst_repos != *daos@* ]]; then
if [ -z "${{ env.CP_RPM_TEST_VERSION }}" ] &&
[[ $inst_repos != *daos@* ]]; then
inst_repos+=" daos@PR-${{ github.event.pull_request.number }}"
inst_repos+=":${{ github.run_number }}"
fi
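
A minimal sketch of how the updated step reads once the hunk above is applied. The step name and the `shell:` line are illustrative placeholders, not taken from the actual action.yml; only the `run:` body comes from the diff. The new guard skips adding this PR's own daos build when an explicit RPM test version (`CP_RPM_TEST_VERSION`) is set or a `daos@` repo has already been requested.

```yaml
# Sketch only: step name and 'shell:' are placeholders.
- name: Assemble repo list          # hypothetical step name
  shell: bash
  run: |
    . ci/gha_functions.sh
    inst_repos="${{ env.CP_PR_REPOS }} ${{ github.event.inputs.pr-repos }}"
    # Fall back to this PR's own daos build only when no explicit RPM test
    # version is set and no daos@ repo was already requested.
    if [ -z "${{ env.CP_RPM_TEST_VERSION }}" ] &&
       [[ $inst_repos != *daos@* ]]; then
      inst_repos+=" daos@PR-${{ github.event.pull_request.number }}"
      inst_repos+=":${{ github.run_number }}"
    fi
```
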
2 changes: 2 additions & 0 deletions .github/workflows/bash_unit_testing.yml
@@ -11,6 +11,8 @@ defaults:
run:
shell: bash --noprofile --norc -ueo pipefail {0}

permissions: {}

jobs:
Test-gha-functions:
name: Tests in ci/gha_functions.sh
6 changes: 6 additions & 0 deletions .github/workflows/ci2.yml
@@ -7,13 +7,19 @@ concurrency:
group: ci2-${{ github.head_ref }}
cancel-in-progress: true

permissions: {}

jobs:

# reuse the cache from the landing-builds workflow if available, if not then build the images
# from scratch, but do not save them.
Build-and-test:
name: Run DAOS/NLT tests
runs-on: ubuntu-22.04
permissions:
# https://github.com/EnricoMi/publish-unit-test-result-action#permissions
checks: write
pull-requests: write
strategy:
matrix:
distro: [ubuntu]
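
This `permissions: {}` change recurs across most of the workflow files in this commit: the workflow-level GITHUB_TOKEN grant is reduced to nothing, and each job re-requests only the scopes it actually needs (here, `checks: write` and `pull-requests: write` for the unit-test-result publisher). A minimal illustrative sketch of that pattern; the workflow name, job name, and steps are hypothetical, only the `permissions` lines mirror the diff:

```yaml
# Illustrative least-privilege pattern used throughout this commit.
name: example-workflow

on:
  pull_request:

# Workflow-level default: grant the GITHUB_TOKEN no scopes at all.
permissions: {}

jobs:
  build-and-test:
    runs-on: ubuntu-22.04
    # Re-grant only the scopes this job needs, e.g. for
    # EnricoMi/publish-unit-test-result-action to post check runs
    # and PR comments.
    permissions:
      checks: write
      pull-requests: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Run tests
        run: echo "build and test here"
```
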
2 changes: 2 additions & 0 deletions .github/workflows/clang-format.yml
@@ -18,6 +18,8 @@ name: clang-format
on:
pull_request:

permissions: {}

jobs:
pylint:
name: Clang Format
2 changes: 2 additions & 0 deletions .github/workflows/create_release.yml
@@ -8,6 +8,8 @@ on:
- master
- 'release/**'

permissions: {}

jobs:
make_release:
name: Create Release
2 changes: 2 additions & 0 deletions .github/workflows/doxygen.yml
@@ -4,6 +4,8 @@ name: Doxygen
on:
pull_request:

permissions: {}

jobs:

Doxygen:
2 changes: 2 additions & 0 deletions .github/workflows/flake.yml
@@ -4,6 +4,8 @@ name: Flake
on:
pull_request:

permissions: {}

jobs:
flake8-lint:
runs-on: ubuntu-22.04
6 changes: 6 additions & 0 deletions .github/workflows/landing-builds.yml
@@ -17,6 +17,8 @@ on:
- requirements-build.txt
- requirements-utest.txt

permissions: {}

jobs:

# Build a base Docker image, and save it with a key based on the hash of the dependencies, and a
@@ -86,6 +88,10 @@ jobs:
name: Run DAOS/NLT tests
needs: Prepare
runs-on: ubuntu-22.04
permissions:
# https://github.com/EnricoMi/publish-unit-test-result-action#permissions
checks: write
pull-requests: write
strategy:
matrix:
distro: [ubuntu]
2 changes: 2 additions & 0 deletions .github/workflows/linting.yml
@@ -9,6 +9,8 @@ on:
- 'release/*'
pull_request:

permissions: {}

jobs:
# Run isort on the tree.
# This checks .py files only so misses SConstruct and SConscript files are not checked, rather
76 changes: 76 additions & 0 deletions .github/workflows/ossf-scorecard.yml
@@ -0,0 +1,76 @@
# This workflow uses actions that are not certified by GitHub. They are provided
# by a third-party and are governed by separate terms of service, privacy
# policy, and support documentation.

name: Scorecard supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
branch_protection_rule:
# To guarantee Maintained check is occasionally updated. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
schedule:
- cron: '45 8 * * 0'
push:
branches: ["master"]
pull_request:

# Declare default permissions as nothing.
permissions: {}

jobs:
analysis:
name: Scorecard analysis
runs-on: ubuntu-latest
permissions:
# Needed to upload the results to code-scanning dashboard.
security-events: write
# Needed to publish results and get a badge (see publish_results below).
id-token: write
# Uncomment the permissions below if installing in a private repository.
# contents: read
# actions: read

steps:
- name: "Checkout code"
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
with:
persist-credentials: false

- name: "Run analysis"
uses: ossf/scorecard-action@0864cf19026789058feabb7e87baa5f140aac736 # v2.3.1
with:
results_file: results.sarif
results_format: sarif
# (Optional) "write" PAT token. Uncomment the `repo_token` line below if:
# - you want to enable the Branch-Protection check on a *public* repository, or
# - you are installing Scorecard on a *private* repository
# To create the PAT, follow the steps in
# https://github.com/ossf/scorecard-action?tab=readme-ov-file#authentication-with-fine-grained-pat-optional.
# repo_token: ${{ secrets.SCORECARD_TOKEN }}

# Public repositories:
# - Publish results to OpenSSF REST API for easy access by consumers
# - Allows the repository to include the Scorecard badge.
# - See https://github.com/ossf/scorecard-action#publishing-results.
# For private repositories:
# - `publish_results` will always be set to `false`, regardless
# of the value entered here.
publish_results: true

# Upload the results as artifacts (optional). Commenting out will disable
# uploads of run results in SARIF
# format to the repository Actions tab.
- name: "Upload artifact"
uses: actions/upload-artifact@97a0fba1372883ab732affbe8f94b823f91727db # v3.pre.node20
with:
name: SARIF file
path: results.sarif
retention-days: 5

# Upload the results to GitHub's code scanning dashboard (optional).
# Commenting out will disable upload of results to your repo's Code Scanning dashboard
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@1b1aada464948af03b950897e5eb522f92603cc2 # v3.24.9
with:
sarif_file: results.sarif
3 changes: 3 additions & 0 deletions .github/workflows/pr-metadata.yml
@@ -9,10 +9,13 @@ on:
pull_request_target:
types: [opened, synchronize, reopened, edited]

permissions: {}

jobs:
example_comment_pr:
runs-on: ubuntu-22.04
permissions:
pull-requests: write
name: Report Jira data to PR comment
steps:
- name: Checkout
2 changes: 2 additions & 0 deletions .github/workflows/pylint.yml
@@ -5,6 +5,8 @@ name: Pylint
on:
pull_request:

permissions: {}

jobs:
pylint:
name: Pylint check
8 changes: 4 additions & 4 deletions .github/workflows/rpm-build-and-test-report.yml
@@ -8,14 +8,14 @@ on:
# for testing before landing
workflow_dispatch:

permissions:
contents: read
actions: read
checks: write
permissions: {}

jobs:
report-vm-1:
runs-on: [self-hosted, docker]
# https://github.com/dorny/test-reporter/issues/149
permissions:
checks: write
strategy:
matrix:
# TODO: figure out how to determine this matrix
16 changes: 9 additions & 7 deletions .github/workflows/rpm-build-and-test.yml
@@ -26,13 +26,7 @@ defaults:
run:
shell: bash --noprofile --norc -ueo pipefail {0}

# https://github.com/dorny/test-reporter/issues/149
permissions:
id-token: write
contents: read
checks: write
# https://github.com/EnricoMi/publish-unit-test-result-action#permissions
pull-requests: write
permissions: {}

jobs:
# it's a real shame that this step is even needed. push events have the commit message # in
@@ -363,6 +357,10 @@ jobs:
Functional:
name: Functional Testing
runs-on: [self-hosted, wolf]
permissions:
# https://github.com/EnricoMi/publish-unit-test-result-action#permissions
checks: write
pull-requests: write
timeout-minutes: 7200
needs: [Build-RPM, Import-commit-message, Calc-functional-matrix, Import-commit-pragmas]
strategy:
@@ -594,6 +592,10 @@ jobs:
Functional_Hardware:
name: Functional Testing on Hardware
runs-on: [self-hosted, wolf]
permissions:
# https://github.com/EnricoMi/publish-unit-test-result-action#permissions
checks: write
pull-requests: write
timeout-minutes: 7200
needs: [Import-commit-message, Build-RPM, Calc-functional-hardware-matrix,
Import-commit-pragmas, Functional]
2 changes: 2 additions & 0 deletions .github/workflows/spelling.yml
@@ -3,6 +3,8 @@ name: Codespell
on:
pull_request:

permissions: {}

jobs:

Codespell:
2 changes: 2 additions & 0 deletions .github/workflows/version-checks.yml
@@ -8,6 +8,8 @@ on:
paths:
- 'utils/cq/requirements.txt'

permissions: {}

jobs:
upgrade-check:
name: Check for updates
2 changes: 2 additions & 0 deletions .github/workflows/yaml.yml
@@ -13,6 +13,8 @@ on:
- '**/*.yml'
- utils/cq/requirements.txt

permissions: {}

jobs:
yaml-lint:
runs-on: ubuntu-22.04
6 changes: 4 additions & 2 deletions Jenkinsfile
@@ -108,7 +108,8 @@ void fixup_rpmlintrc() {
'/usr/bin/hello_drpc',
'/usr/bin/daos_firmware',
'/usr/bin/daos_admin',
'/usr/bin/daos_server']
'/usr/bin/daos_server',
'/usr/bin/ddb']

String content = readFile(file: 'utils/rpms/daos.rpmlintrc') + '\n\n' +
'# https://daosio.atlassian.net/browse/DAOS-11534\n'
@@ -1016,7 +1017,8 @@ pipeline {
post {
always {
discoverGitReferenceBuild referenceJob: 'daos-stack/daos/master',
scm: 'daos-stack/daos'
scm: 'daos-stack/daos',
requiredResult: hudson.model.Result.UNSTABLE
recordIssues enabledForFailure: true,
failOnError: false,
ignoreQualityGate: true,
2 changes: 2 additions & 0 deletions ci/codespell.ignores
@@ -34,3 +34,5 @@ expres
signalling
laf
cacl
chk
falloc
6 changes: 6 additions & 0 deletions debian/changelog
@@ -1,3 +1,9 @@
daos (2.5.101-4) unstable; urgency=medium
[ Fan Yong ]
* NOOP change to keep in parity with RPM version

-- Fan Yong <[email protected]> Fri, 05 Apr 2024 09:30:00 +0900

daos (2.5.101-3) unstable; urgency=medium
[ Ashley M. Pittman ]
* Updated pydaos install process
1 change: 1 addition & 0 deletions debian/daos-server-tests.install
@@ -8,6 +8,7 @@ usr/bin/smd_ut
usr/bin/bio_ut
usr/bin/vea_ut
usr/bin/vos_tests
usr/bin/ddb_tests
usr/bin/vea_stress
usr/bin/vos_perf
usr/bin/obj_ctl
3 changes: 3 additions & 0 deletions debian/daos-server.install
@@ -8,7 +8,10 @@ usr/bin/daos_server_helper
usr/bin/daos_server
usr/bin/daos_engine
usr/bin/daos_metrics
usr/bin/ddb
usr/lib64/daos_srv/libchk.so
usr/lib64/daos_srv/libcont.so
usr/lib64/daos_srv/libddb.so
usr/lib64/daos_srv/libdtx.so
usr/lib64/daos_srv/libmgmt.so
usr/lib64/daos_srv/libobj.so
7 changes: 5 additions & 2 deletions docs/QSG/setup_rhel.md
@@ -127,16 +127,19 @@ used by DAOS and NVME SSDs will be identified.
pmem0 0 3.2 TB
pmem1 0 3.2 TB

4. Scan the available storage on the Server nodes:
4. Scan the available nvme storage on the Server nodes:

daos_server storage scan
daos_server nvme scan
Scanning locally-attached storage\...

NVMe PCI Model FW Revision Socket ID Capacity
-------- ----- ----------- --------- --------
0000:81:00.0 INTEL SSDPE2KE016T8 VDV10170 0 1.6 TB
0000:83:00.0 INTEL SSDPE2KE016T8 VDV10170 1 1.6 TB

5. Scan the available scm storage on the Server nodes:

daos_server scm scan
SCM Namespace Socket ID Capacity
------------- --------- --------
pmem0 0 3.2 TB
7 changes: 5 additions & 2 deletions docs/QSG/setup_suse.md
@@ -148,16 +148,19 @@ used by DAOS and NVME SSDs will be identified.
pmem0 0 3.2 TB
pmem1 0 3.2 TB

4. Scan the available storage on the Server nodes:
4. Scan the available nvme storage on the Server nodes:

daos_server storage scan
daos_server nvme scan
Scanning locally-attached storage\...

NVMe PCI Model FW Revision Socket ID Capacity
-------- ----- ----------- --------- --------
0000:81:00.0 INTEL SSDPE2KE016T8 VDV10170 0 1.6 TB
0000:83:00.0 INTEL SSDPE2KE016T8 VDV10170 1 1.6 TB

5. Scan the available scm storage on the Server nodes:

daos_server scm scan
SCM Namespace Socket ID Capacity
------------- --------- --------
pmem0 0 3.2 TB
(Diff truncated: the remaining changed files are not shown.)
