On Thu, 4 Oct 2012, Kenneth Zadeck wrote:

On 10/04/2012 09:17 AM, Marc Glisse wrote:
On Wed, 3 Oct 2012, Mike Stump wrote:
On Oct 3, 2012, at 1:47 PM, Marc Glisse <marc.gli...@inria.fr> wrote:
did you consider making the size of wide_int a template parameter, now that we are using C++? All with a convenient typedef or macro so it doesn't show. I am asking because in vrp I do some arithmetic that requires 2*N+1 bits where N is the size of double_int.

No, not really. I'd maybe answer it this way: we put in a type (singular) to support all integral constants in all languages on a port. Since we only needed one, there was little need to templatize it. By supporting all integral constants in all languages, there is little need for more. If Ada, say, wanted a 2048-bit integer, then we just have it drop off the size it wants someplace and we would mix that in on a MAX(...) line; the net result is that the type we use would then directly support the needs of Ada. If vrp wanted 2x of all existing modes, we could simply change the MAX equation and essentially double it, if people need that. This comes at a cost, as the intermediate wide values are fixed-size allocated (not variable), so these would all be larger.

And this cost could be eliminated by having a template wide_int_ so only the places that need it actually use the extra size ;-)

The space is not really an issue in most places, since wide-ints tend to be short-lived.

You were the one talking of a cost.

However, the real question is: what are you going to instantiate the template on? What we do is look at the target, determine the largest type that the target supports, and build a wide_int type that supports that. How are you going to do better?

In a single place in tree-vrp.c, in the code that evaluates multiplications, I would instantiate the template on double (possibly plus one bit) the size you selected as large enough for all constants. For all the rest, your type is fine.
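To make that concrete, here is a minimal sketch of what I have in mind; the constant name (MAX_BITSIZE_MODE_ANY_INT) and the member layout are placeholders of mine, not something taken from your patch:

  /* HOST_WIDE_INT and HOST_BITS_PER_WIDE_INT as in hwint.h.  */
  template <int BITS>
  class wide_int_
  {
    /* Fixed-size storage; the number of elements is a compile-time
       function of the requested width.  */
    HOST_WIDE_INT val[(BITS + HOST_BITS_PER_WIDE_INT - 1)
                      / HOST_BITS_PER_WIDE_INT];
    unsigned short len;        /* significant elements of val          */
    unsigned short precision;  /* precision of the value represented   */
    /* ... same arithmetic as the non-template class ...               */
  };

  /* Everything outside tree-vrp.c keeps the target-derived size.  */
  typedef wide_int_<MAX_BITSIZE_MODE_ANY_INT> wide_int;

  /* Only the multiplication code in tree-vrp.c instantiates the wider
     variant, so the product of two maximal operands (plus a sign bit)
     always fits.  */
  typedef wide_int_<2 * MAX_BITSIZE_MODE_ANY_INT + 1> vrp_int;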

This will be for discussion when you submit that next patch, but currently VRP handles integers the same size as double_int. In particular, it handles __int128. I would be unhappy if introducing a larger big-integer type in GCC made us regress there.

You are only happy now because you do not really understand the world around you.

I did not want to go into details, but let me rephrase: I do not want to regress. Currently, hosts with a 64-bit HOST_WIDE_INT can handle VRP multiplications on __int128. If your patch introducing better big integers breaks that, that sounds bad to me, since I would expect s/double_int/wide_int/ to just work, with wide_int<2*MAX> remaining a potential simplification of the code for later.


Note that VRP is just the one case I am familiar with. Using templates should (I haven't checked) be completely trivial and would help the next person who needs bigger integers for a specific purpose and doesn't want to penalize the whole compiler. If the size of wide_int is completely irrelevant and we can make it 10 times larger without thinking, I guess some numbers showing that would be great (or maybe that's common knowledge, in which case it is fine).
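As a purely illustrative example of what I mean by "a specific purpose" (the helper name is hypothetical, and the sext/== operations just mirror what double_int offers today), the VRP multiplication check could then be written without any manual double-width bookkeeping:

  /* Hypothetical helper: would the product of A and B, both known to
     fit in PREC bits, still fit in PREC bits?  vrp_int is wide enough
     that the multiplication itself can never wrap.  */
  static bool
  mul_fits_p (const vrp_int &a, const vrp_int &b, unsigned int prec)
  {
    vrp_int prod = a * b;
    /* The value fits iff sign-extending its low PREC bits gives it back.  */
    return prod == prod.sext (prec);
  }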


Now, those are only some comments from an occasional contributor, not reviewer requirements; it is fine to ignore them.

--
Marc Glisse
