
Add Cache Coherency support on Ultrascale projects #1527

Merged · 1 commit into main · Mar 3, 2025

Conversation

@podgori (Contributor) commented Nov 22, 2024

PR Description

This commit enables Cache Coherency on Ultrascale+ projects, leaving it disabled for the other platforms.

PR Type

  • Bug fix (change that fixes an issue)
  • New feature (change that adds new functionality)
  • Breaking change (has dependencies in other repos or will cause CI to fail)

PR Checklist

  • I have followed the code style guidelines
  • I have performed a self-review of changes
  • I have compiled all hdl projects and libraries affected by this PR
  • I have tested in hardware affected projects, at least on relevant boards
  • I have commented my code, at least hard-to-understand parts
  • I have signed off all commits from this PR
  • I have updated the documentation (wiki pages, ReadMe files, Copyright etc)
  • I have not introduced new Warnings/Critical Warnings on compilation
  • I have added new hdl testbenches or updated existing ones

@gastmaier (Contributor) left a comment
I believe it is better to have `set CACHE_COHERENCY false` in the common bds instead of checking whether the variable exists.
Also, I would revert adi_bd.tcl, since the ad_mem_hpc0_interconnect and ad_mem_hpc1_interconnect methods already exist, and the only project that leverages them can call the methods directly without further condition checks.
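A minimal sketch of the suggested approach (the variable and method names come from the comment above; the file paths, arguments, and surrounding structure are hypothetical and depend on how the project's bd scripts are organized):

```tcl
# In the common board design tcl (hypothetical location), define a
# default instead of testing whether the variable exists later:
set CACHE_COHERENCY false

# An Ultrascale+ project that wants coherency overrides the default in
# its own system_bd.tcl before the common bd is sourced:
#   set CACHE_COHERENCY true
# or, per the reviewer's suggestion, simply calls the existing helpers
# directly (argument list here is illustrative only):
if {$CACHE_COHERENCY} {
  ad_mem_hpc0_interconnect $sys_cpu_clk sys_ps8/S_AXI_HPC0
}
```

The point of the suggestion is that a default set in one shared place removes the need for `[info exists CACHE_COHERENCY]`-style guards scattered across project scripts.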

@gastmaier (Contributor) commented
Hi, can you resolve the merge conflict and confirm whether this is the final implementation?

@podgori (Contributor, Author)

podgori commented Feb 13, 2025

V2: Reverted the initial changes on adi_board.tcl and updated the cache coherency enablement per common project as suggested.

@gastmaier (Contributor)

gastmaier commented Feb 18, 2025

From the CI run:

  • ad_quadmxfe1_ebz.vcu118 : Failed
  • adrv9009zu11eg.adrv2crr_xmicrowave : Failed

From the main run:

  • ad_quadmxfe1_ebz.vcu118 : Failed
  • adrv9009zu11eg.adrv2crr_xmicrowave : Ok

Locally I get the same results: adrv9009zu11eg.adrv2crr_xmicrowave fails when built with `make ADI_PRODUCTION=1` from adrv2crr_fmcxmwbr1 (it succeeds with ADI_PRODUCTION=0).

PR logs:
job/main/job/builds/job/main_PR_1527/job/projects/job/ad_quadmxfe1_ebz.vcu118/ws/vivado.log
job/main/job/builds/job/main_PR_1527/job/projects/job/adrv9009zu11eg.adrv2crr_xmicrowave/ws/ADIPRODUCTION1/vivado.log

Main latest-commit logs:
job/main/job/builds/job/main_latest_commit/job/projects/job/ad_quadmxfe1_ebz.vcu118/ws/vivado.log
job/main/job/builds/job/main_latest_commit/job/projects/job/adrv9009zu11eg.adrv2crr_xmicrowave/ws/ADIPRODUCTION1/vivado.log

There is also adrv9009zu11eg.adrv2crr_fmcomms8 reported as Failed, but that looks unrelated (network issue).

If it turns out to be unrelated, this should be merged.

@podgori (Contributor, Author)

podgori commented Mar 3, 2025

V3: Rebased on the latest changes on main; fixed adrv9009zu11eg.adrv2crr_xmicrowave timing failure.

@gastmaier (Contributor) left a comment

Thank you for fixing timing.
The CI status is off because cn0506.zc706 is reported as failing but is actually passing: /job/main/job/builds/job/main_PR_1527/job/projects/job/cn0506.zc706/

ad_quadmxfe1_ebz.vcu118 was already failing on main.

@podgori merged commit fa212ed into main on Mar 3, 2025 (3 of 5 checks passed).
@podgori deleted the dev_cc_ultrascale branch on March 3, 2025 at 09:22.