Martin, I was studying the PDF (and the sources that Google posted to be
used as examples). I haven't reached any conclusion so far, but things like
the example below make me question the point of the article.

I started by looking at Java, and here's something I find funny:

  public BasicBlock createNode(int name) {
    BasicBlock node;
    if (!basicBlockMap.containsKey(name)) {
      node = new BasicBlock(name);
      basicBlockMap.put(name, node);
    } else {
      node = basicBlockMap.get(name);
    }

    if (getNumNodes() == 1) {
      startNode = node;
    }

    return node;
  }
. . .
  public int getNumNodes() {
    return basicBlockMap.size();
  }

Maybe this is how they do it at Google, but that's really silly... I mean,
the last condition: it would obviously be if (startNode == null) in any
case. And even if you did want to know the size of the map, you called
map.put() on that very map several lines above - why can't you call
map.size() here directly instead of going through getNumNodes()? This
function is certainly called more than once, so it's an essential part of
the test.
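Just to show what I mean, here's a sketch of the simpler version (BasicBlock
is stubbed out to the bare minimum, and I've folded the containsKey/put/get
dance into computeIfAbsent - the behavior should come out the same):

```java
import java.util.HashMap;
import java.util.Map;

// minimal stand-in for the real BasicBlock from the benchmark sources
class BasicBlock {
    final int name;
    BasicBlock(int name) { this.name = name; }
}

class CFG {
    private final Map<Integer, BasicBlock> basicBlockMap = new HashMap<>();
    private BasicBlock startNode;

    public BasicBlock createNode(int name) {
        // one lookup instead of containsKey + put/get
        BasicBlock node = basicBlockMap.computeIfAbsent(name, BasicBlock::new);
        // the first node ever created becomes the start node -- no need
        // to ask the map for its size, just check whether we have one
        if (startNode == null) {
            startNode = node;
        }
        return node;
    }

    public BasicBlock getStartNode() { return startNode; }
    public int getNumNodes() { return basicBlockMap.size(); }
}
```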
Especially when looking at Java code you see how bad an influence corporate
coding style has on performance: set/get pairs that do nothing but set or
get a data member are a total waste of time and lines of code, but can you
imagine a Java programmer who wouldn't do it like that? This kind of thing
is what bothers me most when Java is being considered. Not the language,
but a widely accepted tradition of doing things the wrong way.
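To be concrete, this is the kind of pair I mean (class and field names made
up for illustration):

```java
// The corporate pattern: a private field fully exposed through a
// forwarding pair that adds no logic of its own.
class Counter {
    private int value;

    // both methods do nothing but touch the field
    public int getValue() { return value; }
    public void setValue(int value) { this.value = value; }
}
```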

My other problem with the JVM is that it takes too long to launch. This
becomes a non-issue when we are talking about compiling hundreds of
classes, but let's not forget that the SDK was previously used a lot for
small projects just the same. Oftentimes it took me more time to launch the
JVM than to compile the project.

The cross-platformness of the JVM is similar to the cross-platformness of
Flash :) Funny thing: at some point I wrote a shell script to delete, once
a day, the JVM error logs it saved in ~/, one file per crash - those were
mostly segmentation faults caused by using MXMLC.

Yet another issue with Java / the JVM is that it disregards the accepted
practices of the system it runs on. For example, Java doesn't compile to
standalone executables, or you would have to embed the whole runtime in
each one of them. So normally you keep adding jars / class files and run
them through whatever runtime you specified. Now imagine the situation
where you have more than one version of a product and you want to run
pieces of one version with pieces of another (compiling an AIR application
with a different version of the SDK than the one it originally shipped
with, anyone?) - at that point you start having a lot of problems. Tools
that didn't update along with your project suddenly stop working because
they target a different JVM, and so on. This happens because the natural
way for Java programs to interact with each other is by loading everything
into the same runtime, instead of communicating through some serialized
gateway like the one the command line provides - pipes, streams, etc. This
has a tendency to end up in huge projects of which you never use even 10%,
but you must launch the whole machine to get at the part you need.
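A rough sketch of the alternative I have in mind - talking to another
program through a pipe instead of loading its classes into your own
runtime. "sort" here just stands in for any command-line tool:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class PipeExample {
    // Send `input` to an external command over its stdin and return
    // whatever the command writes to stdout -- a "serialized gateway".
    static String runThroughPipe(String command, String input)
            throws Exception {
        Process p = new ProcessBuilder(command).start();
        // write the request down the pipe...
        p.getOutputStream().write(input.getBytes(StandardCharsets.UTF_8));
        p.getOutputStream().close();
        // ...and read the serialized answer back
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream(),
                                      StandardCharsets.UTF_8))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.print(runThroughPipe("sort", "b\na\nc\n"));
    }
}
```

The two processes only share bytes over the pipe, so either side can be
swapped out or upgraded independently - the coupling the paragraph above
complains about simply doesn't arise.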

Yet another issue is that in real-world projects, whatever was written in
Java required more than half of the developer's time to be spent navigating
heaps of configuration XML files, inevitably failing to configure the
application properly.
Anything with some scent of "dynamic" about it does not work well in Java;
it causes endless headaches and hours of trying to figure out what actually
happened. I'm also familiar with the attitude, typical of long-time Java
programmers, that things should always be like this and that configuration
hell is inevitable in any non-trivial project - nope, that's not true.
