Confusion about: ASTContext::toCharUnitsFromBits usage

Hi all,

ASTContext::toCharUnitsFromBits is implemented as follows in clang/lib/AST/ASTContext.cpp [1]:

/// toCharUnitsFromBits - Convert a size in bits to a size in characters.
CharUnits ASTContext::toCharUnitsFromBits(int64_t BitSize) const {
  return CharUnits::fromQuantity(BitSize / getCharWidth());
}

Based on the comment, I would assume that the goal is to convert BitSize into the number of characters
needed to contain that many bits.
The implementation, however, rounds downward, returning the number of whole characters that fit within BitSize.
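
For example (a standalone sketch, not taken from the Clang sources; the values are only for illustration), with the usual CharWidth of 8 the two interpretations diverge as soon as BitSize is not a multiple of the character width:

#include <cstdint>
#include <cstdio>

int main() {
  const int64_t CharWidth = 8;  // typical value returned by getCharWidth()
  const int64_t BitSize = 12;   // deliberately not a multiple of CharWidth

  // Current behavior: truncating division, 12 / 8 == 1 character.
  int64_t RoundedDown = BitSize / CharWidth;

  // What the comment suggests: characters needed to *contain* the bits,
  // (12 + 8 - 1) / 8 == 2 characters.
  int64_t RoundedUp = (BitSize + CharWidth - 1) / CharWidth;

  std::printf("rounded down: %lld, rounded up: %lld\n",
              (long long)RoundedDown, (long long)RoundedUp);
}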

In a number of places [2], the function is called with a guarantee that BitSize is a multiple of the CharWidth.
In a recent change [3], however, the patch depends on the downward-rounding behavior.
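
If callers actually need "characters required to hold BitSize bits", one way to make that explicit would be a rounding-up counterpart along these lines (purely a sketch; toCharUnitsFromBitsRoundedUp is a hypothetical name, not an existing ASTContext method):

/// Hypothetical rounding-up counterpart (sketch only, not in the tree):
/// the number of characters required to hold BitSize bits.
CharUnits ASTContext::toCharUnitsFromBitsRoundedUp(int64_t BitSize) const {
  return CharUnits::fromQuantity((BitSize + getCharWidth() - 1) /
                                 getCharWidth());
}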

Is this downward-rounding behavior the expected behavior for this function?

We probably shouldn’t be relying on that. Fixed in r364139.