Michael Ellis <michael.f.el...@gmail.com>:
> I'm finding that a good work flow involves commenting out all the
> functions in the output and fixing them up one at a time.

Agreed. That's exactly how I'm doing the reposurgeon translation.
With the additional step that I'm trying to write unit tests for each
function as I go.

> Getting it to compile and run required:
> 
>    1. Removing the isinstance test on number. It's not needed in Go.
>    2. Deciding what to do about the number > 0 test. I made it return an 
>    error.
>    3. Choosing types for the arguments and returns.
>    4. Declaring i outside the encoding loop.
>    5. Testing number explicitly against 0.
>    6. Converting alphabet[i] to string
>    7. Discarding the final OR against alphabet[0]
>    8. Writing a divmod function
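
For concreteness, here's a sketch of roughly where those fixes land. The
function and alphabet are invented for illustration - this is not your actual
code, just a plausible base-N encoder with the divmod helper, the explicit
error for the number > 0 test, the explicit test against 0, and the
string(alphabet[i]) conversion:

```go
package main

import (
	"errors"
	"fmt"
)

// divmod mirrors Python's divmod() builtin: quotient and remainder.
func divmod(a, b int) (int, int) {
	return a / b, a % b
}

// encode renders number in the base given by len(alphabet).
func encode(number int, alphabet string) (string, error) {
	if number <= 0 {
		return "", errors.New("number must be positive")
	}
	out := ""
	var i int // declared outside the loop, as in fix 4
	for number != 0 { // explicit test against 0, as in fix 5
		number, i = divmod(number, len(alphabet))
		out = string(alphabet[i]) + out // fix 6
	}
	return out, nil
}

func main() {
	s, err := encode(255, "0123456789abcdef")
	fmt.Println(s, err) // prints: ff <nil>
}
```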

Right, that seems pretty typical of my experience too. I'm probably
writing less new code than you are because my Python is mostly
text-bashing, for which Python's and Go's facilities are
near-equivalent.

Type annotation is the big deal remaining, unsurprisingly.

> I'm not sure which, if any, of the above are feasible to automate.

I don't actually have grand plans that way.  What I've been doing so
far is, effectively, cliche-hunting - trying to discover well-defined
textual transformations that are pretty much guaranteed correct or at
least do no harm.  Targets of opportunity.

A good example is blindly translating the Python token "None" to Go "nil".
It's completely dumb, but almost always the right thing to do.  The cases
where it's wrong will get thrown back at you by the Go compiler.
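
In code, that rule is about as dumb as it sounds - something like this
(a sketch, not pytogo's actual source):

```go
package main

import (
	"fmt"
	"regexp"
)

// Blindly rewrite Python's None to Go's nil wherever it occurs as a
// standalone word.  The \b boundaries keep identifiers like "NoneType"
// untouched; any genuinely wrong rewrite gets thrown back by the Go
// compiler.
var noneToNil = regexp.MustCompile(`\bNone\b`)

func transform(line string) string {
	return noneToNil.ReplaceAllString(line, "nil")
}

func main() {
	fmt.Println(transform("if tracker is None: return NoneType"))
	// prints: if tracker is nil: return NoneType
}
```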

I know from experience that if you pile up enough of these stupid
transforms it can look a great deal as though your translator is much
smarter than it actually is.  The conceptual ancestor of pytogo is my
"doclifter" code, which lifts groff/troff markup to structural XML via a
similar combination of cliche analysis with crude ad-hoc parsing that has
no AST-like representation in it anywhere. Nothing "principled".

(In AI, production-rule systems work like this.  That's where I learned
that enough shallow pattern-driven inference rules can sometimes outdo
"deep" reasoning. Which is the exact insight doclifter and pytogo
apply.  In compilers, peephole optimizers embody the same approach.)

The most recent set of transformations mungs typical list- and map-
traversal idioms in Python to equivalent Go. I'm a bit dizzied by how
well these actually work - it's well beyond where I initially thought
I would get with crude regexp-bashing.
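
For a flavor of what one of these idiom rules looks like, here's a
hypothetical example (not pytogo's real rule set): rewriting the Python
dictionary-traversal header "for k, v in m.items():" into the equivalent
Go range clause.

```go
package main

import (
	"fmt"
	"regexp"
)

// One shallow idiom rule: Python dict traversal to a Go range loop.
// Pure regexp-bashing - no parsing, no AST anywhere.
var itemsLoop = regexp.MustCompile(`for (\w+), (\w+) in (\w+)\.items\(\):`)

func mungeItems(line string) string {
	return itemsLoop.ReplaceAllString(line, "for $1, $2 := range $3 {")
}

func main() {
	fmt.Println(mungeItems("for name, count in totals.items():"))
	// prints: for name, count := range totals {
}
```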

But at a metalevel, I sort of count on such surprises to accumulate
when I'm applying this strategy. It's like the military concept of
a swarm attack - a rule-swarm attack.

(Hm. I may need to blog about that term.)

> Which is not to say that pytogo isn't useful.  It provides a good
> rough draft and that's quite a lot.

Me, I think the dominating win is still automating scope closures,
a mind-numbing and defect-attracting task by hand.  The other stuff is nice,
but pytogo could justify its existence by that alone.
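
A toy sketch of what that closing-brace pass has to do - track the
indentation stack and emit a "}" at every dedent (the real logic is
surely hairier; this just shows why doing it by hand numbs the mind):

```go
package main

import (
	"fmt"
	"strings"
)

// closeScopes walks source whose opening braces are already in place
// and inserts the closing braces implied by indentation drops.
func closeScopes(src string) string {
	var out []string
	stack := []int{0} // indentation levels currently open
	for _, line := range strings.Split(src, "\n") {
		trimmed := strings.TrimLeft(line, " ")
		if trimmed == "" {
			out = append(out, line)
			continue
		}
		indent := len(line) - len(trimmed)
		for indent < stack[len(stack)-1] { // dedent: close scopes
			stack = stack[:len(stack)-1]
			out = append(out, strings.Repeat(" ", stack[len(stack)-1])+"}")
		}
		if indent > stack[len(stack)-1] { // new deeper scope
			stack = append(stack, indent)
		}
		out = append(out, line)
	}
	for len(stack) > 1 { // close anything still open at EOF
		stack = stack[:len(stack)-1]
		out = append(out, strings.Repeat(" ", stack[len(stack)-1])+"}")
	}
	return strings.Join(out, "\n")
}

func main() {
	fmt.Println(closeScopes("func f() {\n    if x {\n        y()\n    z()"))
}
```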
 
> My only immediate suggestion would be to consider yanking function doc 
> strings from the body and outputting them as comments above the func line.

See, now, that's a perfect candidate for inclusion in the rule swarm.
Almost always correct. Does strictly bounded, easily-undone harm in the
rare cases where it isn't.
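
A sketch of what such a rule might look like (hypothetical again, not
real pytogo code) - spot a def line followed by a one-line docstring and
hoist the docstring above it as a comment:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

var docRe = regexp.MustCompile(`^\s*"""(.*)"""\s*$`)

// liftDocstrings hoists one-line docstrings above their def lines,
// reborn as // comments.  Multi-line docstrings are left alone here.
func liftDocstrings(lines []string) []string {
	var out []string
	for i := 0; i < len(lines); i++ {
		if strings.HasPrefix(strings.TrimSpace(lines[i]), "def ") &&
			i+1 < len(lines) {
			if m := docRe.FindStringSubmatch(lines[i+1]); m != nil {
				out = append(out, "// "+m[1], lines[i])
				i++ // skip the docstring line we just lifted
				continue
			}
		}
		out = append(out, lines[i])
	}
	return out
}

func main() {
	src := []string{"def f(x):", `    """Frob x."""`, "    return x"}
	fmt.Println(strings.Join(liftDocstrings(src), "\n"))
}
```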
-- 
                <a href="http://www.catb.org/~esr/";>Eric S. Raymond</a>

My work is funded by the Internet Civil Engineering Institute: https://icei.org
Please visit their site and donate: the civilization you save might be your own.

