On 17/05/2017 17:23, Chris Angelico wrote:
On Thu, May 18, 2017 at 1:37 AM, bartc <b...@freeuk.com> wrote:
On 17/05/2017 15:13, Chris Angelico wrote:
[1] Does work on Windows. Install bash for Windows, or (on a
recent-enough Windows) just use the subsystem that Microsoft provides.
So the answer is, switch to Linux. In other words, a cop-out.
[2] So? You don't have to read it, just run it. Or do you read every
line of code that any of your programs executes?
Sometimes, if there's a problem. But usually the code is doing something
sensible. The stuff in configure is complete gobbledygook (if anyone
doesn't believe me, just have a look).
It is impossible that all this is needed just to figure out what source
files need to be compiled. (If it generated CPython sources
fractal-style, then I might be impressed, but it doesn't.)
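(As far as I can tell, most of what configure actually does reduces to
compiling little throwaway test programs, roughly along these lines - the
details here are illustrative, not lifted from the real script:

    # roughly the kind of probe configure runs, over and over:
    cat > conftest.c <<'EOF'
    #include <unistd.h>
    int main(void) { return 0; }
    EOF
    if gcc -c conftest.c -o conftest.o 2>/dev/null; then
        echo '#define HAVE_UNISTD_H 1' >> config.h
    fi
    rm -f conftest.c conftest.o

Multiply that by a few hundred checks and you get the script in question.)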
I didn't even have an editor the first time I had that situation. And
it worked, because I was using the same C compiler that I was
accustomed to - it existed for that platform. I was specifically NOT
helpless because I am used to using good tools.
How about if I put you on a different CPU than you're used to? Can you
use your tiny C compiler? I doubt it, because it's emitting Intel byte
code.
I'm used to it. I used to start off with machines with no software in
them at all. (Actually, I often needed to wire up the boards first.)
The point I'm making is that these things should still work when reduced
to the basics.
(As it happens, I've done this on an original Raspberry Pi. It
needed an existing C compiler to get started, after which my
programs, interpreters and compilers alike, run fine.
But it's funny waiting for gcc to compile my mcc64.c file (taking 30 to
120 seconds depending on optimisation), and then watching it compile
itself in 0.7 seconds. Of course that build targets x64, so the output is
not much use on the Pi itself.
What it does show is that it is practical to reduce a substantial
application to a single file, and trivially build it pretty much
everywhere, as C compilers are ubiquitous. And that it will run with
little trouble. The same source can run on ARM32, x86 and x64; Linux or
Windows. Probably elsewhere too, but that is the extent of my tests.)
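(For what it's worth, the whole bootstrap reduces to something like the
following; the exact mcc64 command line is illustrative rather than exact:

    # one-off bootstrap with the system compiler - the slow step:
    gcc -O2 mcc64.c -o mcc64
    # after which the compiler can rebuild itself - the 0.7 second step:
    ./mcc64 mcc64.c

No configure, no makefile, no extra tools beyond a C compiler.)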
You're asking me to bootstrap Python. I would start by looking for the
nearest similar platform and trying to build a hybrid. I haven't done
this with Python itself, but a while ago, I wanted to port a
similarly-sized language to OS/2, and the process went like this:
1) Attempt to run the configure script, using bash and gcc (which
already existed for OS/2)
I said bash didn't exist.
(BTW I've gone down the route of installing MSYS etc. many times, and
something always went wrong. Why is it that people can't appreciate that
complicated things are more likely to go wrong?)
You need hard information in the form of what exactly needs to be done
with the .c and .h files provided. If another language is needed to
generate some of those files, then the sources for that program are
needed too.
However, that would also suggest that the application (CPython etc) is
not a pure C application at all.
Yes. Yes, it is. For starters, GCC actually can compile to machine
code, instead of depending on nasm.
I don't think it does. gcc AFAIK generates assembly code (via pipes or
equivalent, so usually you don't see it), then invokes an assembler
called 'as' to convert it into object code.
I believe that the base gcc compiler is separate from the binutils tools
such as 'as' and 'ld' (the linker).
Downloading TDM-GCC for Windows, these come as separate download bundles.
On Linux this stuff is usually pre-installed, so it's easy to think it's
all one thing.
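(This is easy enough to verify with gcc itself:

    # stop after the compile-proper stage; hello.s is the generated assembly:
    gcc -S hello.c -o hello.s
    # assemble it separately using binutils' 'as':
    as hello.s -o hello.o
    # or ask gcc to print the cc1/as commands it runs internally:
    gcc -v -c hello.c

hello.c here is just any small test file.)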
When /my/ compiler does it properly, then it will be truly
self-contained. And still be one executable.
Then do it. Make your compiler able to target all of the above. See
how much ifdef mess you need.
My language doesn't have ifdef. There are other ways of managing this
switch. (Actually my current compiler for that language is interpreted.
It supports two targets.)
Plus, git shouldn't be considered an onerous requirement for a
developer. Go get it. Start using it.
I'm doing this for fun. It's not a job to be taken seriously. And doing
battle with other people's temperamental multi-GB applications is not my
idea of fun.
(I mean, even if I did so, what can I contribute? Nothing.)
And I can only conclude from your comments that CPython is also incomplete
because it won't work without a bunch of other tools, all chosen to be the
biggest and most elaborate there are.
Exactly. The core devs reject small solutions, even if they're
perfect, in order to pick up a much larger and more elaborate one.
That's how they make technology decisions.
You're being sarcastic, but you may have hit upon the truth.
I mean, is there any bigger development system they could have chosen
than VS 2015? (And someone mentioned that VS 2017 needed 25GB or something.)
(We are still talking about building a byte-code compiler and a byte-code
interpreter, aren't we?)
You'll have to compare like for like, and on a serious benchmark. I'm
not going to believe "might be 30% slower" without some actual
numbers.
I've put some comments here: https://pastebin.com/hSCjNsA2 as it's
getting way off-topic.
If you were paying someone by the hour to add sqlite to a project, would you
rather they just used the amalgamated file, or would you prefer they spent
all afternoon ******* about with TCL trying to get it to generate those
sources?
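(For comparison, the amalgamated route is more or less this; myapp.c is a
stand-in name, and the libraries needed vary a little by platform:

    # drop sqlite3.c and sqlite3.h into the project, build like any other file:
    gcc -c sqlite3.c -o sqlite3.o
    gcc myapp.c sqlite3.o -lpthread -ldl -lm -o myapp

That's the whole job.)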
I would expect them to use a precompiled binary, frankly. Why use the
amalgamated file when you can just use your system's package manager
to grab a ready-to-go version? Unless, of course, you're stuck on a
platform that doesn't even HAVE a package manager, in which case your
options are (a) get a package manager, and (b) grab a precompiled
binary off the web and use that.
Rhodri is right. Your naivety is so charming. You seem to genuinely
think that life can be that simple.
And some seem to genuinely think that it needs to be that complicated
just because there is plenty of memory to fill up.
I started computing 40 years ago: you couldn't have that sort of
complexity because the hardware wasn't up to it. Things still worked.
Having more memory,
more storage and faster processors doesn't mean complexity has to
increase too.
--
bartc