On 15 Jun 2010, at 23:33, Connor Lane Smith wrote:

On 15 June 2010 14:05, Ethan Grammatikidis <eeke...@fastmail.fm> wrote:
On 15 Jun 2010, at 12:51, Connor Lane Smith wrote:
In my opinion the problem is purely user experience:

Is this your opinion, or lines you've been fed?

My own, strangely enough. UX is one of my interests, particularly
since most programmers seem to suck so hard at it. Let's at least
assume for the time being that suckless devs have minds of their own.

How strange, mine too. I've been "interested in" (read: "painfully aware of") user experience issues ever since my first Atari ST. I've battled with every post-8-bit computer I've owned over user interface issues, and your opinion appears to coincide with some fairly popular ideas which I've found, the hard way, to be harmful. I'm sorry if I'm misinterpreting your opinion.


(a) installing
software is perceived as difficult, so not having to bother with that
is an instant plus, and (b) your data is available everywhere.

Neither is a reason to use horse-shit like HTTP, HTML, CSS or JavaScript.

You clearly didn't read anything else I wrote. Just for you I will
repeat myself:

Actually, I did. I went back and read it 3 more times before replying here. The first time around I thought you were sticking up for web apps and the "web-based OS". Apologies if I was wrong, but I'm still not sure that's not what you meant.


On 15 Jun 2010 12:51, Connor Lane Smith <c...@lubutu.com> wrote:
However, HTML and HTTP do not form a panacea. It's just the "in thing"
nowadays to write slow, buggy clones of existing software in
JavaScript and to call it innovation.

Both of
these problems could be solved with improved package management and
data synchronisation, but it's an uphill struggle, since everyone's
just fixated with the intertubes now...

How tiring. Next time I suggest reading what people actually write
before replying with snotty shit.

"Improved". You can't take a broken concept and improve it. That way lies the path of suck. As far as I can see, the notion that you can take broken concepts and glue "improvement" onto them is THE path by which the software world has become so completely flooded with 'suck'.

Package management? You're ignoring history, a lot of it. Linux distro people have been struggling to make package management work for 18 or 19 years now and they still haven't made it "easy" enough. Lindows/Linspire had one-click installation years ago. I never used that, but I tried Ubuntu 6 months ago, which had a beautiful package management user experience. I'm not being sarcastic, it was really nice. It still wasn't enough: I had a hard time finding the right versions of some packages for a particular commercial app I'm stuck with, and that app already came with a whole bunch of pre-built libraries to ease compatibility.

Data synchronisation? If anything this problem is worse. The issues of synchronising raw data are solved: rsync, numerous revision control systems, venti, and I'm sure many other technologies all offer more or less efficient ways to synchronise two or more stores of data. Two hard problems remain: making the data meaningful to both ends, and deciding how and when to synchronise.

Making the data meaningful to both ends is not a problem if both ends have the same versions of the same software. Otherwise you're talking about a standardised data format, and standards for end-user data don't seem to work very well. There's always something which could reasonably be added.

How and when to synchronise is a harder problem. I could set up my PDA so I push a button and it synchronises with my desktop wherever in the world I might be, but what happens if I've changed data on both PDA and desktop separately? It could be fixed with user interaction, to be sure, but who would want to deal with that? I know of one system which attempts to handle such a case somewhat automatically, Plan 9's replica, but it doesn't work well enough. Plan 9 users have lost critical components of their operating system to replica, more than once. How long has the need for synchronisation been around? How old is rsync? 15 years, not quite as old as Linux package management.
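
To make that awkward both-ends-changed case concrete, here is a rough sketch of the decision every synchroniser ends up facing. It's only an illustration, written as TypeScript for clarity; the version counters and names are made up, not any real tool's API, and real tools like replica track this per file with far more care.

    // Sketch of the choice a synchroniser has to make for one item,
    // assuming each replica carries a monotonic change counter and both
    // sides remember the counter value from the last successful sync.
    // All names here are illustrative only.

    type Action = "copy to desktop" | "copy to pda" | "nothing" | "conflict";

    function decide(pdaVersion: number, desktopVersion: number,
                    lastSynced: number): Action {
        const pdaChanged = pdaVersion > lastSynced;
        const desktopChanged = desktopVersion > lastSynced;

        if (pdaChanged && desktopChanged) {
            // Both sides diverged since the last sync. There is no
            // automatic right answer; someone has to merge or pick a winner.
            return "conflict";
        }
        if (pdaChanged) return "copy to desktop";
        if (desktopChanged) return "copy to pda";
        return "nothing";
    }

    // The case in question: both ends edited since the last sync (version 4).
    console.log(decide(6, 5, 4)); // "conflict"

The three easy branches are mechanical; it's that one "conflict" branch that nobody has made painless yet.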

Do we have a solution yet? No. We might get one tomorrow, BUT, and here is something I wish all the "improvement" fans would recognise, *expecting* research to come up with what we want is stupid. Research is finding things out; it's not necessarily going to give you what you want at all. I have no faith that solutions to any of the problems I can see will be discovered any time soon. I *am* willing to do research myself, but I am VERY careful about introducing hypothetical solutions which may never work out. It's not just noise; it actively leads people to make harmful software. I think that's how SVN got the way it did.

Another bad habit which is too common these days is taking a pretty idea and never examining it to see whether it is in fact worth anything. When thinking over the synchronisation problem today, for one whole minute I actually believed web applications were a good idea after all. And then I remembered my own experience of using my own computer from the other side of the Atlantic via a hotel's computer. At the time I had a 64kbit uplink. The protocol was VNC. It worked fine. There was a slight delay when moving windows, over a 64kbit link! Did I mention my home machine was running Enlightenment E16, the window manager with possibly the shiniest and certainly the most detailed themes ever seen (the one I was using was no exception)? All of that over a very detailed background, on a 400MHz AMD K6 CPU (the K6 was crap), using a standalone X+VNC server (no hardware acceleration of any kind), over a 64kbit link to a *Java* viewer, and the biggest problem was a _little_ bit of hesitation!

There's no reason to subject application developers to the horked-up mess that is web "technology". It's no solution to anything; mediocre interfaces are all over the web. As a user I don't want to be subjected to the crap I see on the web in my applications! If you want remote access from any internet cafe and half the libraries in the world (how convenient, never worry about your laptop getting stolen!), get yourself a remote framebuffer viewer which works in web browsers and get the web crap done once and out of the way. There's one such viewer already, in TightVNC, with a tiny httpd to serve it, all GPL IIRC, or I can imagine something quite simple could be done with the HTML5 canvas and a little JavaScript pulling from the server. If you want high availability, get yourself a VPS and run your apps on that.
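
Just to show how little that canvas idea would actually take, here's a rough browser-side sketch, again written as TypeScript for clarity. The /framebuffer endpoint, its raw-RGBA format and the fixed 1024x768 size are all assumptions of mine, not anything that exists; a real viewer would send only damaged rectangles, the way RFB does, rather than whole frames.

    // Poll a (hypothetical) /framebuffer endpoint that returns raw RGBA
    // bytes and paint each frame onto an HTML5 canvas. The endpoint and
    // dimensions are assumptions made for the sake of the sketch.
    const WIDTH = 1024;
    const HEIGHT = 768;

    const canvas = document.createElement("canvas");
    canvas.width = WIDTH;
    canvas.height = HEIGHT;
    document.body.appendChild(canvas);
    const ctx = canvas.getContext("2d")!;

    async function pullFrame(): Promise<void> {
        const resp = await fetch("/framebuffer");   // WIDTH * HEIGHT * 4 bytes
        const bytes = new Uint8ClampedArray(await resp.arrayBuffer());
        ctx.putImageData(new ImageData(bytes, WIDTH, HEIGHT), 0, 0);
    }

    // A few frames a second is plenty to prove the point.
    setInterval(() => { pullFrame().catch(console.error); }, 250);

That would be the whole "web" part, done once; everything behind it stays an ordinary application drawing to an ordinary framebuffer.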

I find it slightly amusing, and slightly sad, that my old VPS provider has had to describe themselves as providing cloud services, or something like that.

--
After watching Linux go nowhere for 10 years, actually watching it go backwards many times, I woke up.

