
[v5.0.x] fix: Set accelerator rcache flag in btl/ofi for accelerator memory #13359


Merged 1 commit into open-mpi:v5.0.x on Aug 18, 2025

Conversation

a-szegel (Member)

(cherry picked from commit 19a5405)

@github-actions bot added this to the v5.0.8 milestone on Aug 11, 2025
@jsquyres requested a review from Copilot on August 18, 2025 20:19
Copilot AI left a comment


Pull Request Overview

This PR fixes an issue in the Open MPI BTL/OFI module where accelerator memory registration was not properly flagged. The change ensures that when registering memory through the BTL/OFI interface, accelerator memory (such as GPU memory) is correctly identified and marked with the appropriate cache flags.

  • Adds accelerator memory detection during memory registration in BTL/OFI module
  • Sets the MCA_RCACHE_FLAGS_ACCELERATOR_MEM flag when accelerator memory is detected (a rough sketch of this pattern follows below)
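
The following is a minimal sketch, not the code merged in this PR: the helper name and its placement are assumptions made for illustration, while opal_accelerator.check_addr() and MCA_RCACHE_FLAGS_ACCELERATOR_MEM are the existing Open MPI interfaces the overview refers to.

```c
/* Illustrative sketch only -- not the actual btl/ofi diff. Shows how a
 * registration path can detect accelerator (e.g. GPU) memory and tag the
 * resulting rcache entry. */
#include "opal/mca/accelerator/accelerator.h"
#include "opal/mca/rcache/rcache.h"

/* Hypothetical helper: compute rcache flags for a buffer about to be
 * registered through btl/ofi. */
static uint32_t btl_ofi_rcache_flags_for(const void *addr, uint32_t flags)
{
    int dev_id = 0;
    uint64_t accel_flags = 0;

    /* check_addr() returns > 0 when addr points at accelerator memory,
     * 0 for host memory, and a negative value on error. */
    if (opal_accelerator.check_addr(addr, &dev_id, &accel_flags) > 0) {
        /* Mark the registration so the cache layer knows this entry
         * covers accelerator memory. */
        flags |= MCA_RCACHE_FLAGS_ACCELERATOR_MEM;
    }

    return flags;
}
```

The essential change is the flag itself: once a registration carries MCA_RCACHE_FLAGS_ACCELERATOR_MEM, the registration cache can distinguish accelerator pages from host pages when caching and invalidating entries.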


@jsquyres (Member)

Sorry -- I just wanted to try adding a copilot code review to see what would happen. This PR was my random guinea pig. 😄

@janjust janjust merged commit 44e6f09 into open-mpi:v5.0.x Aug 18, 2025
15 of 16 checks passed
5 participants