DataLayout: Fix latent issues with getMaxIndexSizeInBits #118740
Conversation
@llvm/pr-subscribers-llvm-ir

Author: Owen Anderson (resistor)

Changes: Because it was implemented in terms of getMaxIndexSize, it was always rounding the values up to a multiple of 8. Additionally, it was using the PointerSpec's BitWidth rather than its IndexBitWidth, which was self-evidently incorrect. Since getMaxIndexSize was only used by getMaxIndexSizeInBits, and its name and function seem niche and somewhat confusing, go ahead and remove it until a concrete need for it arises.

Full diff: https://github.com/llvm/llvm-project/pull/118740.diff (2 files affected: llvm/include/llvm/IR/DataLayout.h, llvm/lib/IR/DataLayout.cpp)
diff --git a/llvm/include/llvm/IR/DataLayout.h b/llvm/include/llvm/IR/DataLayout.h
index 93bd519f5727d8..5e6a6198a94733 100644
--- a/llvm/include/llvm/IR/DataLayout.h
+++ b/llvm/include/llvm/IR/DataLayout.h
@@ -330,9 +330,6 @@ class DataLayout {
/// the backends/clients are updated.
unsigned getPointerSize(unsigned AS = 0) const;
- /// Returns the maximum index size over all address spaces.
- unsigned getMaxIndexSize() const;
-
// Index size in bytes used for address calculation,
/// rounded up to a whole number of bytes.
unsigned getIndexSize(unsigned AS) const;
@@ -369,9 +366,7 @@ class DataLayout {
}
/// Returns the maximum index size over all address spaces.
- unsigned getMaxIndexSizeInBits() const {
- return getMaxIndexSize() * 8;
- }
+ unsigned getMaxIndexSizeInBits() const;
/// Size in bits of index used for address calculation in getelementptr.
unsigned getIndexSizeInBits(unsigned AS) const {
diff --git a/llvm/lib/IR/DataLayout.cpp b/llvm/lib/IR/DataLayout.cpp
index a4af0ead07cf61..5a110ed2b04058 100644
--- a/llvm/lib/IR/DataLayout.cpp
+++ b/llvm/lib/IR/DataLayout.cpp
@@ -740,12 +740,10 @@ unsigned DataLayout::getPointerSize(unsigned AS) const {
return divideCeil(getPointerSpec(AS).BitWidth, 8);
}
-unsigned DataLayout::getMaxIndexSize() const {
+unsigned DataLayout::getMaxIndexSizeInBits() const {
unsigned MaxIndexSize = 0;
for (const PointerSpec &Spec : PointerSpecs)
- MaxIndexSize =
- std::max(MaxIndexSize, (unsigned)divideCeil(Spec.BitWidth, 8));
-
+ MaxIndexSize = std::max(MaxIndexSize, Spec.IndexBitWidth);
return MaxIndexSize;
}
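
To make the behavioural change concrete, here is a small self-contained sketch (plain C++, not LLVM code: the PointerSpec struct and the divideCeil helper below merely mirror the fields and idiom the patch touches, and the 128/64 and 20/20 specs are hypothetical). It contrasts the old computation, which took the maximum BitWidth rounded up to whole bytes, with the new one, which takes the maximum IndexBitWidth directly.

#include <algorithm>
#include <cstdio>
#include <vector>

// Stand-in for the two PointerSpec fields involved (hypothetical values below).
struct PointerSpec {
  unsigned BitWidth;      // total pointer size in bits
  unsigned IndexBitWidth; // bits actually used for address arithmetic
};

static unsigned divideCeil(unsigned Numerator, unsigned Denominator) {
  return (Numerator + Denominator - 1) / Denominator;
}

// Old behaviour: getMaxIndexSize() took the max BitWidth rounded up to whole
// bytes, and getMaxIndexSizeInBits() multiplied that by 8.
static unsigned oldMaxIndexSizeInBits(const std::vector<PointerSpec> &Specs) {
  unsigned MaxIndexSize = 0;
  for (const PointerSpec &Spec : Specs)
    MaxIndexSize = std::max(MaxIndexSize, divideCeil(Spec.BitWidth, 8));
  return MaxIndexSize * 8;
}

// New behaviour: take the max IndexBitWidth directly, with no rounding.
static unsigned newMaxIndexSizeInBits(const std::vector<PointerSpec> &Specs) {
  unsigned MaxIndexSize = 0;
  for (const PointerSpec &Spec : Specs)
    MaxIndexSize = std::max(MaxIndexSize, Spec.IndexBitWidth);
  return MaxIndexSize;
}

int main() {
  // 128-bit capability-style pointer with a 64-bit index: the old code used
  // the wrong field and reported 128.
  std::vector<PointerSpec> Capability = {{128, 64}};
  std::printf("capability old=%u new=%u\n",
              oldMaxIndexSizeInBits(Capability),  // 128
              newMaxIndexSizeInBits(Capability)); // 64

  // 20-bit pointer whose index width equals its pointer width: the old code
  // still rounded up to 24 bits.
  std::vector<PointerSpec> Narrow = {{20, 20}};
  std::printf("narrow     old=%u new=%u\n",
              oldMaxIndexSizeInBits(Narrow),  // 24
              newMaxIndexSizeInBits(Narrow)); // 20
  return 0;
}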
✅ With the latest revision this PR passed the C/C++ code formatter.
LGTM
Tagging @jrtc27 and @arichardson for awareness on CHERI-derived PRs.
Are there no tests that could expose the difference in behaviour?
Use of index-size vs pointer-size still seems to be in a transitional state, as I've encountered quite a few places where we should be using the former but are still using the latter. As we fix those, the tests for them will also serve as tests for this, such as the test in #119138.
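
For concreteness, the distinction shows up with a layout string whose index width is narrower than its pointer width, as on CHERI-like targets. A minimal sketch, assuming an LLVM tree to compile and link against and that the p:size:abi:pref:idx form parses on your version:

#include "llvm/IR/DataLayout.h"
#include <cstdio>

int main() {
  // 128-bit pointers with a 64-bit index width (size:abi:pref:idx).
  llvm::DataLayout DL("p:128:128:128:64");
  std::printf("pointer bits: %u\n", DL.getPointerSizeInBits(0)); // 128
  std::printf("index bits:   %u\n", DL.getIndexSizeInBits(0));   // 64
  // 64 with this patch; 128 before, because BitWidth was used instead.
  std::printf("max index:    %u\n", DL.getMaxIndexSizeInBits());
  return 0;
}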
Because it was implemented in terms of getMaxIndexSize, it was always
rounding the values up to a multiple of 8. Additionally, it was using
the PointerSpec's BitWidth rather than its IndexBitWidth, which was
self-evidently incorrect.
Since getMaxIndexSize was only used by getMaxIndexSizeInBits, and its
name and function seem niche and somewhat confusing, go ahead and remove
it until a concrete need for it arises.