On 2014-03-31 14:49, Michael Zedeler wrote:
On 2014-03-29 21:45, Damian Conway wrote:
Moritz wrote:
To spin the tale further, we need to think about what happens if
somebody writes
multi foo(1|2e0) { ... }
so now we have Int|Num. We could explore the most-derived common
ancestor (Cool), or look into role space (Real, Numeric come to mind),
or simply error out.
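To make the ambiguity concrete, here is a sketch of the dispatch in question (Raku/Perl 6 syntax; the candidates and the comments are illustrative of the design question, not of what any compiler actually does):

```raku
# The literal constraint 1|2e0 is a Junction over an Int and a Num.
# What nominal type should the compiler infer for the parameter?
multi foo(1|2e0)  { say "junction-constrained candidate" }
multi foo(Int $n) { say "plain Int candidate" }

foo(1);    # which candidate wins depends on the inferred type:
foo(2e0);  # Int? Num? their common ancestor Cool? a role like Real?
```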
Or maybe we need to reconsider the whole idea that it's appropriate to
infer type from a smartmatched constraint?
[...]
In other words, specifying a constraint value is a way of applying a
smartmatched acceptance test to a parameter, but the type of the
acceptance test is typically totally unrelated to the type of the
parameter.
Which is why it now seems very odd to me that we are currently inferring
parameter types from constraint values.
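The point about acceptance tests can be illustrated with a where-clause sketch (Raku; the sub and its identifiers are invented for illustration):

```raku
# The acceptance test here is a Regex; the parameter it constrains
# is a Str. The constraint's own type says nothing about the
# parameter's type -- it is only smartmatched against the argument.
sub greet(Str $name where /^ \w+ $/) { say "Hello, $name!" }

greet('Alice');   # 'Alice' smartmatches the regex, so it is accepted
# greet('???');   # would fail the where-constraint at call time
```

Inferring that the parameter is somehow Regex-typed from this constraint would clearly be wrong, which is the same oddity as inferring Int|Num from `1|2e0`.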
I couldn't agree more. This looks like a piece of odd-sized baggage
left behind by Moose, where declaring type constraints on attributes
is too easily mistaken for actual type declarations.
Sorry - correction: "where declaring type constraints" should be "where
declaring value constraints".
--
Michael Zedeler
70 25 19 99
mich...@zedeler.dk
dk.linkedin.com/in/mzedeler | twitter.com/mzedeler | github.com/mzedeler