MarcusJohnson91 marked 5 inline comments as done.
MarcusJohnson91 added inline comments.


================
Comment at: clang/lib/AST/Expr.cpp:1091
+      if (llvm::convertUTF32ToUTF8String(AR, Output)) {
+        CString = new char[Output.size() + 1];
+        memcpy(CString, Output.c_str(), Output.size());
----------------
efriedma wrote:
> This leaks memory.
> 
> This function should return either a StringRef to memory that's part of AST 
> object, or an std::string.
I've switched over to using getStringAsChar and doing the conversion there 
instead of in getStrDataAsChar on the other patch. (I'm just going back through 
this review to see if I missed any feedback.)
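
For the record, the shape I mean is to hand back an owned std::string instead 
of a new[]'d char buffer, so nothing leaks. A minimal sketch, assuming 
convertUTF32ToUTF8String keeps the signature used above (getStringAsUTF8 is an 
illustrative name, not necessarily what the other patch calls it):

  #include "llvm/ADT/ArrayRef.h"
  #include "llvm/Support/ConvertUTF.h"
  #include <string>

  // Convert UTF-32 code units to UTF-8 and return an owned std::string,
  // so the caller never has to manage a raw buffer.
  static std::string getStringAsUTF8(llvm::ArrayRef<llvm::UTF32> CodeUnits) {
    std::string Output;
    // convertUTF32ToUTF8String returns true on success.
    if (!llvm::convertUTF32ToUTF8String(CodeUnits, Output))
      Output.clear(); // on failure, hand back an empty string
    return Output;
  }

Returning by value keeps the ownership obvious and sidesteps the lifetime 
questions a raw char* raises.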


================
Comment at: clang/lib/AST/Type.cpp:1962
 
+bool Type::isType(const std::string TypeName) const {
+  QualType Desugar = this->getLocallyUnqualifiedSingleStepDesugaredType();
----------------
efriedma wrote:
> MarcusJohnson91 wrote:
> > aaron.ballman wrote:
> > > Oh, I see now that this is doing a name comparison against the type -- 
> > > that's not a good API in general because it's *really* hard to guess at 
> > > what the type will come out as textually. e.g., `class` and `struct` 
> > > keywords are interchangeable in C++, C sometimes gets confused with 
> > > `bool` vs `_Bool`, template arguments sometimes matter, nested name 
> > > specifiers, etc. Callers of this API will have to guess at these details 
> > > and the printing of the type may change over time (e.g., C may switch 
> > > from `_Bool` to `bool` and then code calling `isType("_Bool")` may react 
> > > poorly to the change).
> > > 
> > > I think we need to avoid this sort of API on `Type`.
> > I see your point, I reverted the behavior back to doing the desugaring in 
> > just isChar16Type and isChar32Type
> I'm not convinced we should be looking at sugar even in 
> isChar16Type/isChar32Type/isAnyCharacterType.  That seems like a great way to 
> end up with subtle bugs that only manifest when someone uses the wrong 
> typedef.
> 
> Where is the distinction between the value `(uint32_t)1` vs. `(char32_t)1` 
> relevant for C, anyway?
char32_t is a typedef in C (from <uchar.h>), not a builtin type the way it is 
in C++.

Its underlying type is uint_least32_t, which is usually itself a typedef for 
unsigned int.

For char32_t to be accepted in C mode, we have to know that the argument is a 
char32_t string and not just some random array, so I'm checking the sugar to 
see whether char32_t appears in the typedef chain.
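
Concretely, "checking the sugar" means walking the typedef chain and seeing 
whether any typedef along the way is spelled char32_t. A minimal sketch of that 
check (isSpelledChar32 is an illustrative name, not the exact code in the 
patch):

  #include "clang/AST/Decl.h"
  #include "clang/AST/Type.h"

  // Follow each TypedefType to its underlying type and report whether a
  // typedef named char32_t appears anywhere in the sugar chain.
  static bool isSpelledChar32(clang::QualType QT) {
    while (const auto *TT = QT->getAs<clang::TypedefType>()) {
      if (TT->getDecl()->getName() == "char32_t")
        return true;
      QT = TT->desugar(); // strip one level of sugar and keep looking
    }
    return false;
  }

With the usual <uchar.h> definitions, a char32_t array desugars as char32_t -> 
uint_least32_t -> unsigned int, so the check succeeds at the first step, while 
a plain unsigned int array never hits a char32_t typedef.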




CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D103426/new/

https://reviews.llvm.org/D103426

