I don't go back to the beginning of FORTRAN. My point was not that FORTRAN was badly named when it was among the first to do such things. I am saying that, in retrospect, almost any language can handle a basic subset of arithmetic operations. And there is nothing in principle stopping code in a modern language from being optimized by translators to run even faster than the original versions of FORTRAN. If anything, I can well imagine algorithms using parallel architectures performing some operations far faster. True, many languages tend to add overhead, but that is not strictly required.
As some have said here, many things these days are fast enough not to need ultimate optimization. Having said that, there is no reason why code used over and over should not be optimized. Functions can be written in the higher-level language first and then replaced when there is a great enough improvement to be had and they do not need the interactive features the language might offer. Someone mentioned that in principle all data types can be stored in a string. Python does have a way of converting many, but not all, data structures into a byte form that can be written to files and restored, or sent to other processes, including ones on other machines. Entire programs written as text can, to some extent, be shipped this way too. But when your data is a long list of real numbers whose standard deviation you want, then converting it into a compact C/C++ array (or something similar in Fortran) and calling a function in that language that works fast on it may be a good way to go. Why interpret one step at a time when a single pass can generate the data, process it in tighter loops, and return a result?

-----Original Message-----
From: Python-list <python-list-bounces+avigross=verizon....@python.org> On Behalf Of Dennis Lee Bieber
Sent: Friday, January 4, 2019 1:17 PM
To: python-list@python.org
Subject: Re: the python name

On Fri, 4 Jan 2019 11:34:24 -0500, "Avi Gross" <avigr...@verizon.net> declaimed the following:

>Although I used FORTRAN ages ago and it still seems to be in active use, I am not clear on why the name FORMULA TRANSLATOR was chosen. I do agree it sounds more like a computer language, based on both the sound and feel of FORTRAN as well as the expanded version.
>
>It seems to have been designed as a mathematical extension of sorts that allowed you to evaluate a mathematical formula efficiently. I mean things like quadratic equations. But there is overlap with what other languages like COBOL or BASIC did at the time.
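[The two ideas above -- converting data structures to a byte form for shipping, and packing a long list of reals into a compact C-style array before crunching it -- can be sketched in standard-library Python. A minimal sketch; the sample numbers are made up for illustration:]

```python
import pickle
import statistics
from array import array

# Round-trip an arbitrary Python structure through a byte form,
# the kind of thing that can be written to a file or sent to
# another process, even on another machine.
record = {"name": "sample", "values": [1.5, 2.5, 3.5]}
blob = pickle.dumps(record)      # a bytes object
restored = pickle.loads(blob)
assert restored == record

# Pack a list of floats into a compact array of raw C doubles.
# array('d', ...) stores 8-byte machine doubles contiguously,
# not boxed Python float objects, so native code can sweep it
# in one tight pass.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
packed = array("d", data)

# Compute the population standard deviation over the packed data
# in a single pass instead of interpreting a hand-written loop
# one step at a time.
sd = statistics.pstdev(packed)
print(sd)  # 2.0 for this data set
```

[For genuinely long arrays, the same packed buffer can be handed to a C extension or a third-party library such as NumPy, which is exactly the "call a fast function in that language" path described above.]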
	FORTRAN predates BASIC by a decade. COBOL was never meant to be an in-depth number-crunching language (its original data type is packed BCD). Writing a quadratic equation in it probably takes two or three pages. As I recall, the COMPUTE verb was a later addition, so everything would have been long sentences:

    DIVIDE C BY B GIVING TMP1.
    MULTIPLY TMP1 BY 2.
    SUBTRACT TMP1 FROM A GIVING RESULT1.
    ADD TMP1 TO A GIVING RESULT2.

vs.

    COMPUTE TMP1 = (C / B) * 2.
    COMPUTE RESULT1 = A - TMP1.
    COMPUTE RESULT2 = A + TMP1.

>What gets me is the vagueness of the words looked at by me today. Any modern computing language can do what standard FORTRAN does, albeit perhaps more slowly; indeed, some languages do some of their math using libraries from FORTRAN. But do we use the word TRANSLATOR quite that way much anymore? Heck, do we use FORMULA in the same way?

	Meanings change... "COMPUTER" meant "one who computes" -- those poor overworked engineers with slide-rules creating ballistic tables for battleships.

	"Translator" still applies -- in the sense of taking one language (source) and producing the equivalent meaning in another language (assembler, and then translating that to pure machine binary).

	"Formula" has likely been superseded by "algorithm" (cf. ALGOL).

>My most recent use of formula has been in the R language, where there is a distinct object type called a formula that can be used to specify models when doing things like a regression on data. I am more likely to call the other kind by words like "equation". Python has an add-on that does symbolic manipulation. Did FORTRAN have any of these enhanced objects back when created, or even now?

	Nobody had symbolic manipulation when FORTRAN was created. At the time, the goal was to produce a higher-order language that scientists could write without having to /know/ the computer's assembly/machine code, with the hope that it could be portable at the source level.
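[The symbolic-manipulation "add-on" mentioned above is SymPy, a third-party package (pip install sympy). A minimal sketch, tying back to the quadratic-equation example earlier in the thread -- the variable names a, b, c, x are just for illustration:]

```python
from sympy import Eq, solve, symbols

# Declare symbolic variables rather than numeric values.
a, b, c, x = symbols("a b c x")

# Ask SymPy to solve a*x**2 + b*x + c == 0 for x symbolically.
roots = solve(Eq(a * x**2 + b * x + c, 0), x)

# The result is the quadratic formula itself, as two symbolic
# expressions involving sqrt(b**2 - 4*a*c), not numbers.
for r in roots:
    print(r)
```

[This is the kind of "enhanced object" that simply did not exist when FORTRAN was created: the formula stays a manipulable expression instead of being immediately evaluated.]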
	Users were charged by the minute for CPU time, and operating systems were lucky to be able to handle more than one program in parallel. Batch systems were the norm -- a program would run until it either finished or ran out of CPU time (as specified on a LIMIT card in the job control deck). One would turn in a deck of cards to be spooled into the job queue and come back some hours later to get the printout from the job.

--
Wulfraed                 Dennis Lee Bieber         AF6VN
wlfr...@ix.netcom.com    HTTP://wlfraed.home.netcom.com/

--
https://mail.python.org/mailman/listinfo/python-list