Thanks for this great post, Peter.
On 2 August 2012, at 09:16, Peter Alcibiades wrote:
> The mistakes people make when trying to use what we have written are
> probably the most valuable test.
--
Pierre Sahores
mobile: 06 03 95 77 70
www.sahores-conseil.com
On 02/08/2012 08:16, Peter Alcibiades wrote:
If the HIG are not scientifically provable "usability", but simply
subjective statements, then how can we measure "usability"?
The enterprise is fundamentally mistaken. We have to start by recognizing
there is no such thing. One size does not fit all. [...] found that people
were opening the same application twice, and had to correct what was giving
rise to that. The mistakes people make when trying to use what we have
written are probably the most valuable test.
On 8/1/12 7:10 PM, Alejandro Tejada wrote:
Well, here is another link with content related to this thread:
http://www.codinghorror.com/blog/
Read first the really funny article: New Programming Jargon
I have created Jenga code myself! And nearly all the HyperCard ports
I've done contain lots of it.
Alejandro-
I once had an email account at mwie...@mail.dotcom.com. Try saying
that to someone over the phone. I wasn't all that disappointed when
they went under.
--
-Mark Wieder
mwie...@ahsoftware.net
Mark Wieder wrote
>
> ! I am so adding a rubber duck to my office in the morning...
>
On Aug 1, 2012, at 12:34 PM, Bob Sneidar wrote:
> Sorry this is one of my many pet peeves. Everything hinges on what you mean
> by "learn" and "act". Tell me this, what new thing has a computer learned
> that no human knew before? And how did the computer act on that new
> knowledge? I think AI is an illusion, produced by the old trick of bait and
> switch.
Alejandro-
Wednesday, August 1, 2012, 5:10:33 PM, you wrote:
> Well, here is another link with content related to this thread:
> http://www.codinghorror.com/blog/
> Read first the really funny article: New Programming Jargon
! I am so adding a rubber duck to my office in the morning...
--
-Mark Wieder
Well, here is another link with content related to this thread:
http://www.codinghorror.com/blog/
Read first the really funny article: New Programming Jargon
Al
On 08/01/2012 07:42 PM, Bob Sneidar wrote:
These equations are all imagined. In the end, the "intelligent" computer would
just be doing exactly what it was programmed to do. As I said before, the appearance of
random acts would be purely illusory. A human can feel a mood swing coming on, and
Hi Tom,
Yes. It pretty much had web / doc views and a hierarchy of menu screens to
navigate them.
Interesting. Thanks for sharing.
On Wednesday, August 1, 2012, Thomas McGrath III wrote:
> [OT] from the [OT]:: Chipp, did the app 'just' have web views and/or doc
> pages with like a splash screen?
take place in American industry
> or it will continue on the decline until the style of American management
> changes...and they don't know what to do. 98% don't know there is a
> problem or there is anything they can do."
>
> Here is more:
> http://www.endsoftheearth.com/Deming14Pts.htm
I tend to think that is understanding. Wisdom to me is the ability to decide
what to do about it.
Bob
On Jul 31, 2012, at 6:41 PM, Alejandro Tejada wrote:
> Hopefully they will learn a lesson from those who really
> knew best, like Mr. Russell Lincoln Ackoff:
>
> "Wisdom is the ability to se
That is basically what I was saying when I said that the AI we have
seen is primitive and it was artificial in even calling it AI.
In the near future you will be able to put all of the info in the world
on a small dot. With that intelligence the dot will have more power
than every computer and ev
These equations are all imagined. In the end, the "intelligent" computer would
just be doing exactly what it was programmed to do. As I said before, the
appearance of random acts would be purely illusory. A human can feel a mood
swing coming on, and then make a choice as to whether or not to
Sorry this is one of my many pet peeves. Everything hinges on what you mean by
"learn" and "act". Tell me this, what new thing has a computer learned that no
human knew before? And how did the computer act on that new knowledge? I think
AI is an illusion, produced by the old trick of bait and switch.
I think that is like saying a seller of homes is a complete paranoid fanatic
because he decided to upgrade the locks and put in the latest state-of-the-art
alarm systems. People who try to make this argument keep saying "every" like it
was a fact that was just known and beyond dispute. To my kno
If you try to compare iOS with a desktop OS, then yeah, all your conclusions
are going to be skewed. Apple's intention was not necessarily to make a full
blown production OS, comparable to modern OS environments. Perhaps we should
also add to the equation developer intent. If users bought iOS devices
If I did not use the phrase "at this time" it would obviously limit
my ability to learn and explain things in the future.
But given that mood swings tend to cause irrational thinking, as can
be seen in the "my girlfriend" phrase, it should not be too hard for
a computer to use certain events and respond irrationally.
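[Illustration, not from JB's post: a minimal LiveCode sketch of the idea,
with hypothetical handler and variable names. Events nudge a hidden "mood"
state, and once it drifts far enough the response stops tracking the
request:]

   -- toy model only: sMood drifts with each event, plus a little noise
   local sMood -- script-local; -5 (foul) .. 5 (cheerful)

   on noteEvent pImpact
      if sMood is empty then put 0 into sMood
      put sMood + pImpact + random(3) - 2 into sMood -- noise in [-1, 1]
      put max(-5, min(5, sMood)) into sMood -- clamp to the range
   end noteEvent

   function respondTo pRequest
      -- an "irrational" refusal: it has nothing to do with the request
      if sMood < -3 then return "Not now."
      return "OK: " & pRequest
   end respondTo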
On 08/01/2012 05:50 PM, ambassa...@fourthworld.com wrote:
Richmond wrote:
One of the initial premises of Linux was that folks could muck it
around to their heart's content, especially with regard to the GUI;
if this is lost and/or removed a very great part of the appeal of
Linux for the average Joe will be lost.
On 08/01/2012 05:45 PM, Mike Bonner wrote:
Hasn't the quantification of emotion already been solved? Quite a while ago
I think. A Mac had a pretty stable AI. Mostly it just works. But push it
to the point of upset? Instant sad face.
Atari ST was a much more militant AI. When it was unhappy or angry it
would throw bombs on the screen
On 08/01/2012 05:41 PM, -=>JB<=- wrote:
No, I cannot explain it at this time.
The thing that I find puzzling is the boundless optimism of very many
people that the reduction of everything (as per Carnap) to mathematical
descriptions will eventually happen.
This is best pointed out by the phrase "at this time".
Richmond wrote:
One of the initial premises of Linux was that folks could muck it
around to their heart's content, especially with regard to the GUI;
if this is lost and/or removed a very great part of the appeal of
Linux for the average Joe will be lost.
You have more than a hundred distros to choose from
Hasn't the quantification of emotion already been solved? Quite a while ago
I think. A Mac had a pretty stable AI. Mostly it just works. But push it
to the point of upset? Instant sad face.
Atari ST was a much more militant AI. When it was unhappy or angry it
would throw bombs on the screen, t
No, I cannot explain it at this time.
-=>JB<=-
On Aug 1, 2012, at 7:34 AM, Richmond wrote:
> On 08/01/2012 05:20 PM, -=>JB<=- wrote:
>> And I was being serious about mood swings and AI.
>
> Ahah. Well, in that case can you explain how mood swings can be reduced to a
> set of mathematical equations?
On 08/01/2012 05:20 PM, -=>JB<=- wrote:
And I was being serious about mood swings and AI.
Ahah. Well, in that case can you explain how mood swings can be reduced
to a set of mathematical equations?
-=>JB<=-
On Aug 1, 2012, at 7:14 AM, Richmond wrote:
On 08/01/2012 05:15 PM, -=>JB<=- wrote:
And I was being serious about mood swings and AI.
-=>JB<=-
On Aug 1, 2012, at 7:14 AM, Richmond wrote:
> On 08/01/2012 05:15 PM, -=>JB<=- wrote:
>> I didn't watch that much Star Trek but I think Spock would disagree
>> with your theory about mood swings and AI.
>
> Oddly enough, I was being serious about AI, not making goofy remarks to
> be analysed by characters from a TV series.
On 08/01/2012 05:15 PM, -=>JB<=- wrote:
I didn't watch that much Star Trek but I think Spock would disagree
with your theory about mood swings and AI.
Oddly enough, I was being serious about AI, not making goofy remarks to
be analysed by characters from a TV series.
-=>JB<=-
On Aug 1, 2012, at 6:48 AM, Richmond wrote:
I didn't watch that much Star Trek but I think Spock would disagree
with your theory about mood swings and AI.
-=>JB<=-
On Aug 1, 2012, at 6:48 AM, Richmond wrote:
> On 08/01/2012 04:31 PM, -=>JB<=- wrote:
>> The AI we have had in the past is primitive and not even close to the AI
>> I am talking about.
On 08/01/2012 04:31 PM, -=>JB<=- wrote:
The AI we have had in the past is primitive and not even close to the AI
I am talking about. I have not seen anything fantastic come from the old
so-called AI because it was artificial in even claiming it to be AI.
Artificial Intelligence if we mean s
The AI we have had in the past is primitive and not even close to the AI
I am talking about. I have not seen anything fantastic come from the old
so-called AI because it was artificial in even claiming it to be AI.
As I have said, trial and error are a part of development. Many have good
reason
[OT] from the [OT]:: Chipp, did the app 'just' have web views and/or doc pages
with like a splash screen? The reason I ask is we had to put a simple game into
some of our apps to get past the 'this could have been done in HTML5' objection
and I was wondering if this was a similar case for you.
It's the little things that hurt. They ruined the really wonderful feature of
"Spaces" in Snow Leopard, which allowed you to have a grid of separate desktops
in two dimensions. In Lion, no grid, it's a linear string, and it doesn't even
wrap around, so to get from one on the far right to one on
Like I said, I agree in general...but what /should/ a good measure of a
successful interface be?
If you'd like to think of it that way, Windows (for all its faults) /is/
touched by the gods. My children have been getting ICT training at
school for the last four years, and what have they been
Björnke, thanks for the link!
I predicted here in this forum a couple of years back that this would happen
on the Mac. Apple would end up being the Gatekeeper of all the allowed
software on anyone's computer.
And, soon, just like it happened with Windows Vista, folks will just get
used to turning Gatekeeper off.
Yes, in theory I also like the idea of just talking to my computer and have
it understand what I say and just do it. The problem is, when things don't
work 100% right, people tend to get upset. It's actually one of the reasons
iPads don't do handwriting recognition. Apple tried it with Newton and
failed.
This is the way Apple tries to make the OS virus-proof. It also ties into a
marketing idea, to bring every app under their App Store umbrella, and a way to
control every application's purpose and function. Ethical and Big Brother
considerations aside, for now there's a setting in the preferences
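[For reference: Gatekeeper also has a command-line face, spctl, which ships
with OS X 10.8. A one-line check from LiveCode's message box, using the
shell() function; the commented command is the CLI twin of the "Anywhere"
preference and needs an admin password:]

   put shell("spctl --status") -- reports whether assessments are enabled
   -- "sudo spctl --master-disable" turns Gatekeeper's checks off entirely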
Yes Richmond, you are correct. The AI ship seemed to sail quite some time
ago. One of the top AI scientists, Doug Lenat, has been working on creating
an AI entity, Cyc (www.cyc.com), and wrote "The Voice of the Turtle:
Whatever Happened to AI?", which is an interesting read:
h
I love this thread, and all the old guys emphasising their thoughts and
emotions. However, you are all geezering on and on about the good old times,
how everything used to be much better when there was BASIC and CLI (or a
slightly less remote past). In theory, I like the idea of no "save as", an
I certainly would not consider 10% Mac market share a 'success'. If that is
the case, then Windows would be considered an interface personally touched
by the gods. And it can be argued iOS is 2 steps forward and 3 steps
backward. The fact is, iOS is not focussed on productivity. With no
file
Brilliant. I concur.
On Wed, Aug 1, 2012 at 2:40 AM, Peter Alcibiades
<palcibiades-fi...@yahoo.co.uk> wrote:
> they have moved
> from thinking of the UI and the OS as a whole as being something which
> offers services to the users, to thinking of them as something which
> governs
> and controls t
On 01/08/2012 08:40, Peter Alcibiades wrote:
It's not a problem that is confined to Apple - though Apple maybe sets the
tone. You can see it in Linux too. Both Gnome 3 and KDE 4 have gone
through a phase of total user interface redesign. In both cases the result
was pretty unusable - though it
On 08/01/2012 12:44 PM, Pierre Sahores wrote:
Probably, yes ;-)
On 1 August 2012, at 01:35, -=>JB<=- wrote:
Trial and error are a part of developing. AI will be here soon and you will
probably be thankful for its ease in use
I thought Artificial Insemination had been here for at least 40 years. LOL.
Probably, yes ;-)
On 1 August 2012, at 01:35, -=>JB<=- wrote:
> Trial and error are a part of developing. AI will be here soon and you will
> probably be thankful for its ease in use and it will make programming even
> easier too.
>
> We are on the verge of fantastic changes that might come overnight.
On 08/01/2012 10:40 AM, Peter Alcibiades wrote:
It's not a problem that is confined to Apple - though Apple maybe sets the
tone. You can see it in Linux too. Both Gnome 3 and KDE 4 have gone
through a phase of total user interface redesign. In both cases the result
was pretty unusable - though
I don't suppose Apple will, however. The habit is too ingrained, and
OSX is becoming such a small part of the business that it will not attract the
level of management attention required to force the team to change.
Thanks Thomas and everyone for your thoughts on this subject. Very
interesting to read!
I agree with much that has been said. I also agree with the old chestnut,
"if it ain't broke, don't fix it." For over 20 years, Save As was good
enough for all, and now it's not.
I don't agree about whole cha
changes...and they don't know what to do. 98% don't know there is a
problem or there is anything they can do."
Here is more:
http://www.endsoftheearth.com/Deming14Pts.htm
Al
Alejandro-
Tuesday, July 31, 2012, 6:41:24 PM, you wrote:
> This topic brings me memories of this documentary:
> La Obsolescencia Programada: Fabricados para no durar (Planned
> Obsolescence: Made Not to Last)
> http://www.youtube.com/watch?v=chJT_uxSqNk
Thanks. That was great. I didn't know about the East German lightbulb
factory or the
Tom-
The Story of the Ribbon is online. It's an interesting thing to watch,
especially the walk through all the pain of why the new Word menu interfaces
didn't work, but it still doesn't make it right.
https://blogs.msdn.com/b/jensenh/archive/2008/03/12/the-story-of-the-ribbon.aspx?Redirected=true
--
-Mark
On 7/31/12 9:51 PM, Thomas McGrath III wrote:
Sorry for that long email. I didn't really think it was till I just saw it now.
Wow….
Don't apologize. It was excellent. Well written and, more important, true.
Lion disrupted 30 years of computing habits for much the same reasons. I
hate it.
-
Sorry for that long email. I didn't really think it was till I just saw it now.
Wow….
-- Tom McGrath III
http://lazyriver.on-rev.com
3mcgr...@comcast.net
On Jul 31, 2012, at 10:42 PM, Thomas McGrath III wrote:
> And don't we all at some point think "We know best". The problem is there are
> many, many people that think they know best
And don't we all at some point think "We know best". The problem is there are
many, many people that think they know best, in direct conflict with those of us
that actually do. ;-)
I have studied Fitts's law for many years (20-plus) in developing software
and systems for people who cannot speak
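[For context: Fitts's law, in its common Shannon formulation, predicts the
movement time MT to acquire a target of width W at distance D as

   MT = a + b * log2(D/W + 1)

where a and b are empirically fitted constants. Bigger, closer targets are
faster to hit, which is why the law keeps coming up in interface design.]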
Al
I think your summary hits the nail on the head. "They think they know best."
We are living in times where there is a common belief that life--at both
societal and individual levels--should be engineered by experts.
I don't want to get caught up in that. When I reflect on my own designs I h
That's good to know. Preview uses command-shift-S to Duplicate the current
document, but sure enough, holding down option as well changes it from
Duplicate to Save As…
On Jul 31, 2012, at 6:53 PM, J. Landman Gay wrote:
> I haven't upgraded yet but I was just reading about this. The hue and cry
On Tue, Jul 31, 2012 at 7:39 PM, Chipp Walters wrote:
> So, I'm wondering... how long before we quit upgrading everything and start
> sticking to a single legacy OS and/or programs?
>
I was pretty happy using Snow Leopard but Apple decided that you could only
ship Mac and iOS software to their
I like this thread (and will try to keep opinions short...).
IMO, long term, I don't think the OS developers care about legacy/power
users -- as you say, they think they know best. Many people have carelessly
tossed around some variation of the Gretzky quote "Don't skate to where the
puck is, skate to where it's going."
The future is progressive thinking. They are working diligently at making
things more intuitive, and you don't get there without problems.
The more they work at it, the better it will be, and these are minor problems
considering what they will accomplish to make things easier & quicker in
the next few years.
I know EXACTLY how you feel. In Microsoft's case, they are obligated by
contract to put out a new version of everything they support with their
Software Assurance program. Imagine charging x for 3 years of software
assurance, and then not producing a major upgrade in that time. The outcry
would
On 7/31/12 5:39 PM, Chipp Walters wrote:
I was talking with Chris about this based on:
1) Mountain Lion
...
there's still no "Save As.."
I haven't upgraded yet but I was just reading about this. The hue and
cry was so loud that Apple put "save as" back in, carefully hidden so
that those po
> At what point
> do the OSes get so much in the users' way that they're no longer good for
> us legacy power users?
Apple won't give us the option: if we want to deploy to the latest devices, we
will need the latest desktop.
--
M E R Goulding
Software development services
Bespoke application development
I was talking with Chris about this based on:
1) Mountain Lion now tries to do automatic tasks, like download stuff when
the lid is closed and you're on battery power, thus killing your battery
without you knowing. And now with Gatekeeper complaining about anything you
try and install-- and wt–, iC