Hi,

> Here is a link [1] to you saying "I’m a strong believer in getting it to
> work first, then optimise it, premature optimisation wastes time and often
> makes incorrect assumptions.”

That’s exactly what I’m doing. I have something that works and I want to improve 
its performance. 

I’d say making something work 5x as fast, with no other costs, is hardly 
premature optimisation. Your mileage, of course, may vary.

> Manually trying to determine whether to use "==" or "===" might be
> better left to tooling someday, so folks don't have to worry about it and
> can just get their features to work first.

Folks will run into this issue right away. Most JS devs use === and !== 
instead of the evil twins == and !=. 

The current behaviour will confuse people, as code that works in JS will behave 
differently in AS and vice versa.
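To illustrate the confusion, here is a short sketch of a few well-known cases where JS's loose == diverges from strict ===; these are standard JS coercion results, not AS-specific behaviour:

```javascript
// Cases where the loose "evil twin" operators coerce their operands
// while the strict operators do not.
console.log(null == undefined);   // true  - loose equality treats them alike
console.log(null === undefined);  // false - strict equality distinguishes them
console.log(0 == "");             // true  - "" coerces to the number 0
console.log(0 === "");            // false - different types, no coercion
console.log("1" == 1);            // true  - string coerces to number
console.log("1" === 1);           // false
```

If AS and its cross-compiled JS pick different operators here, the same comparison can give different answers on each platform.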

> I'd personally lean towards warning when folks use JS patterns that can cause 
> trouble.

Which patterns in particular are you referring to here? SonarQube is warning 
about issues in our AS code, not our JS code.

> For uninitialized references to instances, maybe
> leaving it uninitialized and using "== null" is best.

Unlikely, as it’s between 2x and 10x as slow, and using == null as opposed to 
=== null can have unexpected results.

> What is the point of initializing properties if there is a guarantee that
> they'll be initialized in other code before it is ever tested.  Code like:
> 
> var foo:String;
> if (someCondition)
> foo = "bar";
> else
> foo = "foo”;

Could be optimised away by the compiler.

> Ints are initialized to zero because the common patterns fail if you
> don't, not just because of "==" tests.

And I think we have the same issue with Booleans and perhaps Numbers.

> But there might be other ways to end up with
> better code than manually scrubbing the source.

Like what? Please provide data and/or examples; facts over opinion, please.

Thanks,
Justin
