http://gcc.gnu.org/bugzilla/show_bug.cgi?id=60766

--- Comment #6 from Richard Biener <rguenth at gcc dot gnu.org> ---
Created attachment 32556
  --> http://gcc.gnu.org/bugzilla/attachment.cgi?id=32556&action=edit
patch

The issue is that tree-ssa-loop-ivopts.c:cand_value_at converts niter
((unsigned int) n_3 * 2863311531 + 4294967294) to 'int' via

static void
cand_value_at (struct loop *loop, struct iv_cand *cand, gimple at, tree niter,
               aff_tree *val)
{
  aff_tree step, delta, nit;
  struct iv *iv = cand->iv;
  tree type = TREE_TYPE (iv->base);
  tree steptype = type;
  if (POINTER_TYPE_P (type))
    steptype = sizetype;

  tree_to_aff_combination (iv->step, steptype, &step);
  tree_to_aff_combination (niter, TREE_TYPE (niter), &nit);
  aff_combination_convert (&nit, steptype);
^^^

which just does

  comb->type = type;
  if (comb->rest && !POINTER_TYPE_P (type))
    comb->rest = fold_convert (type, comb->rest);

thus re-interprets everything as signed.  The whole aff_combination_convert
function looks suspicious ... but at this stage the easiest thing to do is
to avoid this second call to the function (the other call site always
converts to an unsigned type).

Unfortunately doing that:

Index: gcc/tree-ssa-loop-ivopts.c
===================================================================
--- gcc/tree-ssa-loop-ivopts.c  (revision 209181)
+++ gcc/tree-ssa-loop-ivopts.c  (working copy)
@@ -4238,8 +4238,7 @@ cand_value_at (struct loop *loop, struct
     steptype = sizetype;

   tree_to_aff_combination (iv->step, steptype, &step);
-  tree_to_aff_combination (niter, TREE_TYPE (niter), &nit);
-  aff_combination_convert (&nit, steptype);
+  tree_to_aff_combination (fold_convert (steptype, niter), steptype, &nit);
   aff_combination_mult (&nit, &step, &delta);
   if (stmt_after_increment (loop, cand, at))
     aff_combination_add (&delta, &step);

reveals the other suspicious function:

void
tree_to_aff_combination (tree expr, tree type, aff_tree *comb)
{
  aff_tree tmp;
  enum tree_code code;
  tree cst, core, toffset;
  HOST_WIDE_INT bitpos, bitsize;
  enum machine_mode mode;
  int unsignedp, volatilep;

  STRIP_NOPS (expr);

whose STRIP_NOPS drops the just-added conversion again and thus
re-introduces the exact same affine combination.

This is kind-of a mess.  Either the internal affine machinery works in
modulo-2^n arithmetic and thus doesn't need to care about signedness - but
then it needs to use unsigned arithmetic only at affine-to-tree time.  Or
it does depend on the sign of the affine combination, but then it has to
be more careful.

IIRC it is the first, thus affine-to-tree is wrong in returning
signed arithmetic, and keeping a "type" for the affine combination
doesn't make much sense (there is a similar issue for pointer
arithmetic btw, where we choose a random "base").

But it's all kind of a mess.

Working, somewhat localized patch attached.
