Recently I started seeing a build error in a tree that includes lldb;
I don't know whether the problem is my configuration, Ubuntu, gcc, or
something else, but gcc complains that it can't convert 'int' to
'sigset_t' on the return statement.

This naïve one-liner fixes it, although I don't know anything about
the signal machinery.  It didn't seem worth cranking up a Phab review
for this...
--paulr

Index: source/Host/common/MainLoop.cpp
===================================================================
--- source/Host/common/MainLoop.cpp     (revision 301939)
+++ source/Host/common/MainLoop.cpp     (working copy)
@@ -155,7 +155,7 @@
 
 sigset_t MainLoop::RunImpl::get_sigmask() {
 #if SIGNAL_POLLING_UNSUPPORTED
-  return 0;
+  return sigset_t();
 #else
   sigset_t sigmask;
   int ret = pthread_sigmask(SIG_SETMASK, nullptr, &sigmask);
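
For context, here is a minimal sketch of why the literal 0 fails to
compile.  It assumes sigset_t is a struct type, as it is on glibc; the
FakeSigset type below is a hypothetical stand-in, not the real glibc
definition.

  #include <signal.h>

  // Stand-in for a platform where sigset_t is a struct, not an
  // integer type.  With a struct there is no implicit conversion
  // from the literal 0, hence the gcc error.
  struct FakeSigset {
    unsigned long val[16];
  };

  FakeSigset before_patch() {
    // return 0;          // error: cannot convert 'int' to 'FakeSigset'
    return FakeSigset();  // OK: value-initialization zeroes the members
  }

  // The patched line works either way: if sigset_t is an integer
  // type, sigset_t() is zero; if it is a struct, sigset_t()
  // value-initializes it.
  sigset_t after_patch() {
    return sigset_t();
  }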

