Good day All!

  I thought I'd post the below, as I believe its sentiments are particularly 
relevant to our discussions of late, as well as to how VI / blind users deal 
with digital access in its current incarnation. 

  I'll include both the link and the blog posting itself for convenience. 

  Be warned, this is not only a long post, but it does contain the occasional bit 
of colorful vocab. So if you're sensitive to that sort of thing, then you may 
choose to pass it over. It's up to you…

  Anyway, thanks to Josh de Lioncourt for finding this tidbit. 

  EnJoy, and have a lovely day!…

Smiles,

Cara :)

link to post:

http://stevenf.tumblr.com/post/359224392/i-need-to-talk-to-you-about-computers-ive-been
 

Blog post:

I need to talk to you about computers. I’ve been on a veritable roller-coaster 
of “how I feel” about the iPad announcement, and trying not to write about it 
until I had at least an inkling of what was at the root of that.
Before we begin, a reminder: On this blog, I speak only for myself, not for my 
company or my co-workers.
The thing is, to talk about specific hardware (like the iPad or iPhone or Nexus 
One or Droid) is to miss entirely the point I’m about to try to make. This is 
more important than USB ports, GPS modules, or front-facing cameras. Gigabytes, 
gigahertz, megapixels, screen resolution, physical dimensions, form factors, in 
fact hardware in general — these are all irrelevant to the following 
discussion. So, I’m going to try to completely avoid talking about those sorts 
of things.
Let’s instead establish some new terminology: Old World and New World computing.
Introduction
Personal computing — having a computer in your house (or your pocket) — as a 
whole is young. As we know it today, it’s less than a half-century old. It’s 
younger than TV, younger than radio, younger than cars and airplanes, younger 
than quite a few living people in fact.
In that really incredibly short space of time we’ve gone from 
punchcards-and-printers to interactive terminals with command lines to 
window-and-mouse interfaces, each a paradigm shift unto itself. A lot of 
thoughtful people, many of whom are bloggers, look at this history and say, 
“Look at this march of progress! Surely the desktop + windows + mouse interface 
can’t be the end of the road? What’s next?”
Then “next” arrived and it was so unrecognizable to most of them (myself 
included) that we looked at it and said, “What in the shit is this?”
The Old World
In the Old World, computers are general purpose, do-it-all machines. They can 
do hundreds of thousands of different things, sometimes all at the same time. 
We buy them for pennies, load them up to the gills with whatever we feel like, 
and then we pay for it with instability, performance degradation, viruses, and 
steep learning curves. Old World computers can do pretty much anything, but 
carry the burden of 30 years of rapid, unplanned change. Windows, Linux, and 
Mac OS X based computers all fall into this category.
The New World
In the New World, computers are task-centric. We are reading email, browsing 
the web, playing a game, but not all at once. Applications are sandboxed, then 
moats dug around the sandboxes, and then barbed wire placed around the moats. 
As a direct result, New World computers do not need virus scanners, their 
batteries last longer, and they rarely crash, but their users have lost a 
degree of freedom. New World computers have unprecedented ease of use, and 
benefit from decades of research into human-computer interaction. They are 
immediately understandable, fast, stable, and laser-focused on the 80% of the 
famous 80/20 rule.
Is the New World better than the Old World? Nothing’s ever simply black or 
white.
Floppy Disks
An anecdote: When the iMac came out, Apple drew a line in the sand. They said: 
we are no longer going to ship a computer with a floppy disk drive. The entire 
industry shit its pants so loudly and forcefully that you probably could have 
heard it from outer space.
Are you insane? I spent all this money on a floppy drive! All my software is on 
floppy disks! You’ve committed brand suicide! Nobody will stand for this!
Fast-forward to today. I can’t think of a single useful thing to do with a 
floppy disk. I can go to the supermarket and buy a CD, DVD, or flash drive that 
is faster, smaller, and stores 1,000 times as much data for typically less than 
a box of floppies used to cost. Or better still, we can just toss things to 
each other over the network.
To get there, yes, we had to throw away some of our investment in hardware. We 
had to re-think how we did things. It required adjustment. A bit of sacrifice. 
The end result, I think we can all agree regardless of what platform we use, is 
orders of magnitude more convenient, easier to use, and in line with today’s 
storage requirements.
Staying with floppies would have spared us the inconvenience of that transition 
but at what long-term cost?
Nothing is ever simply black or white. There was a cost to making the 
transition. But there was a benefit to doing so.
To change was not all good. To stay put was not all bad. But there was a ratio 
of goodness-to-badness that, in the long run, was quite favorable for everyone 
involved. However, in the short term it seemed so insurmountable, so ludicrous, 
that it beggared the belief of a large number of otherwise very intelligent 
people.
For a species so famous for being adaptable to its environment, we certainly 
abhor change. Especially a change that involves any amount of money being spent.
Cars
John Gruber used car transmissions for his analogy, and it’s apt. When I 
learned to drive, my dad insisted that I learn on a manual transmission so I 
would be able to drive any car. I think this was a wise and valuable thing to 
do.
But even having learned it, these days I drive an automatic. Nothing is black 
and white — I sacrifice maybe a tiny amount of fuel efficiency and a certain 
amount of control over my car in adverse situations that I generally never 
encounter. In exchange, my brain is freed up to focus on the road ahead, 
getting where I’m going, and avoiding obstacles (strategy), not the minutiae of 
choosing the best possible gear ratio (tactics).
Is a stick shift better than an automatic? No. Is an automatic better than a 
stick? No. This misses the point. A better question: Is a road full of drivers 
not distracted by the arcane inner workings of their vehicle safer? It’s 
likely. And that has a value. Possibly a value that outweighs the value offered 
by a stick shift if we aggregate it across everyone in the world who drives.
Changing of the Guard
When I think about the age ranges of people who fall into the Old World of 
computing, it is roughly bell-curved with Generation X (hello) approximately in 
the center. That, to me, is fascinating — Old World users are sandwiched 
between New World users who are both younger and older than them.
Some elder family members of mine recently got New World cell phones. I watched 
as they loaded dozens of apps willy-nilly onto them which, on any other phone, 
would have turned it into a sluggish, crash-prone battery-vampire. But it 
didn’t happen. I no longer get summoned for phone help, because it is 
self-evident how to use it, and things just generally don’t go wrong like they 
used to on their Old World devices.
New Worlders have no reason to be gun-shy about loading up their device with 
apps. Why would that break anything? Old Worlders, on the other hand, have been 
browbeaten to the point of expecting such behavior to lead to problems. We’re 
genuinely surprised when it doesn’t.
But the New World scares the living hell out of a lot of the Old Worlders. Why 
is that?
The Needs of the Few
When the iPhone came out, I was immediately in love, but frustrated by the lack 
of an SDK. When an SDK came out, I was overjoyed, but frustrated by Apple’s 
process. As some high-profile problems began to pile up, I infamously railed 
against the whole idea right here on this very blog. I announced I was 
beginning a boycott of iPhone-based devices until changes were made, and I 
certainly, certainly was not going to buy any future iPhone-based products. I 
switched to various other devices that were a bit more friendly to Old Worlders.
It lasted all of a month.
For as frustrated as I was with the restrictions, those exact same restrictions 
made the New World device a high-performance, high-reliability, absolute 
workhorse of a machine that got out of my way and just let me get things 
accomplished.
Nothing is simply black or white.
Old Worlders are particularly sensitive to certain things that are simply 
non-issues to New Worlders. We learned about computers from the inside out. 
Many of us became interested in computers because they were hackable, open, and 
without restrictions. We worry that these New World devices are stifling the 
next generation of programmers. But can anyone point to evidence that that’s 
really happening? I don’t know about you, but I see more people carrying 
handheld computers than at any point in history. If even a small percentage of 
them are interested in “what makes this thing tick?” then we’ve got quite a few 
new programmers in the pipeline.
The reason I’m starting to think the Old World is ultimately doomed is that 
we are bracketed on both sides by the New World, and those people being born 
today, post-iPhone and post-iPad, will never know (and probably not care) about 
how things used to work. Just as nobody today cares about floppies, and nobody 
has to care about manual transmissions if they don’t want to.
If you total up everyone who is older than the Old World itself, plus every 
person yet to be born, you end up with far more people than the Old World 
contains.
And to that dramatically greater number of people, what do you think is more 
important? An easy-to-use, crash-proof device? Or a massively complex tangle of 
toolbars, menus, and windows because that’s what props up an entrenched 
software oligarchy?
Fellow Old Worlders, I hate to tell you this: we are a minority. The question 
is not “will the desktop metaphor go away?” The question is “why has it taken 
this long for the desktop metaphor to go away?”
But, But I’m a Professional!
This is a great toy for newbies, but how am I supposed to get any SERIOUS work 
done with it? After all, I’m a PRO EXPERT MEGA USER! I MUST HAVE TOOLBARS, 
WINDOWS, AND…
OK, stop for a second.
First, I would put the birth of New World computing at 2007, with the 
introduction of the iPhone. You could even arguably stretch it a bit further 
back to the birth of “Web 2.0” applications in the early 2000s. But it’s brand 
new. If computers in general are young, New World computing is fresh out of the 
womb, covered in blood and screaming.
It’s got a bit of development to go.
I encourage you to look at this argument in terms of what you are really trying 
to achieve rather than the way you are used to going about it.
Let’s pick a ridiculous example and say I work in digital video, and I need to 
encode huge amounts of video data into some advanced format, and send that off 
to a server somewhere. I could never do that on an iPad! Right?
Well, no, today, probably not. But could you do it on a future New World 
computer in the general sense?
Remember, the hardware is a non-issue: Flash storage will grow to terabytes in 
size. CPUs will continue to multiply in power as they always have. Displays, 
batteries, everything will improve given enough time.
As I see it, many of these “BUT I’M AN EXPERT” situations can be resolved by 
making just a few key modifications:
        1.      A managed way of putting processes in the background. New 
Worlders are benefiting already from the improved performance and battery life 
provided by the inability to run a task in the background. Meanwhile, Old 
Worlders are tearing their hair out. I CAN’T MULTITASK, right? It seems like 
there has to be a reasonable middle ground. Maybe processes can petition the OS 
for background time. Maybe a user can “opt-in” to background processes. I don’t 
know. But it seems like there must be an in-between that doesn’t sacrifice what 
we’ve gained for some of the flexibility we’re used to.
        2.      A way of sharing data with other devices. New World devices are 
easy to learn and highly usable because they do not expose the filesystem to 
users and they are “data islands”. We are no longer working with “files” but we 
are still working with data blobs that it would be valuable to be able to 
exchange with each other. Perhaps the network wins here. Perhaps flash drives 
that we never see the contents of. The Newton was, to my knowledge, the first 
generally available device where you could just say “put this app and all data 
I’ve created with it on this removable card” without ever once seeing a file or 
a folder. Its sizable Achilles’ Heel was that only other Newtons understood the 
data format.
        3.      A way of sharing data between applications. Something like the 
clipboard, but bigger. This is not a filesystem, but a way of saying “bring 
this data object from this app to that app". I've made this painting in my 
painting app, and now I want to bring it over here to crop it and apply filters.
By just addressing those three things (and I admit they are not simple feats), 
I think all but the absolutely most specialized of computer tasks become quite 
feasible on a New World device.
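To make the first of those three a little more concrete, here is a rough 
sketch, in Swift, of what an "opt-in, petition-the-OS" background model might 
look like. Everything in it is invented for illustration: the BackgroundBroker 
type, the petition(seconds:work:) call, and the 30-second cap are not any real 
SDK, just one possible shape of that middle ground.

import Foundation
import Dispatch

// Hypothetical sketch only: BackgroundBroker and petition(seconds:work:) are
// made up for this post; no real SDK works this way. The idea is simply that
// an app must ask ("petition") for background time, the user must have opted
// in, and the grant is capped so a runaway task can't become a battery-vampire.
final class BackgroundBroker {
    // Flipped by the user in a hypothetical settings screen.
    var userHasOptedIn = false

    // An app asks for a slice of background time. The broker grants it only
    // if the user opted in, and never more than 30 seconds at a stretch.
    func petition(seconds requested: TimeInterval,
                  work: @escaping () -> Void) -> Bool {
        guard userHasOptedIn else { return false }   // the user said no
        let granted = min(requested, 30)             // hard cap on the grant
        DispatchQueue.global(qos: .background).async {
            work()                                   // run off the main thread
        }
        DispatchQueue.global().asyncAfter(deadline: .now() + granted) {
            // A real broker would suspend the task here; the sketch just
            // notes that the grant has expired.
            print("Background grant of \(Int(granted))s expired")
        }
        return true
    }
}

// Usage: an app asks for 20 seconds to finish uploading a photo.
let broker = BackgroundBroker()
broker.userHasOptedIn = true                         // the user opts in
let allowed = broker.petition(seconds: 20) {
    print("Finishing the upload in the background…")
}
print(allowed ? "Petition granted" : "Petition denied")

The point of the opt-in flag and the cap is that the flexibility Old Worlders 
want comes back only on the user's terms, so the battery life and stability 
the New World bought are never silently traded away.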
A Bet on the Future
Apple is calling the iPad a “third category” between phones and laptops. I am 
increasingly convinced that this is just to make it palatable to you while 
everything shifts to New World ideology over the next 10-20 years.
Just like with floppy disks, the rest of the industry is quite content to let 
Apple be the ones to stick their necks out on this. It’s a gamble to be sure. 
But if Apple wins the gamble (so far it’s going well), they are going to be 
years and years ahead of their competition. If Apple loses the gamble, well, 
they have no debt and are sitting on a Fort Knox-like pile of cash. It’s not 
going to sink them.
The bet is roughly that the future of computing:
        1.      has a UI model based on direct manipulation of data objects
        2.      completely hides the filesystem from the user
        3.      favors ease of use and reduction of complexity over absolute 
flexibility
        4.      favors benefit to the end-user rather than the developer or 
other vendors
        5.      lives atop built-to-specific-purpose native applications and 
universally available web apps
All in all, it sounds like a pretty feasible outcome, and really not a bad one 
at that.
But we Old Worlders have to come to grips with the fact that a lot of things we 
are used to are going away. Maybe not for a while, but they are.
Will the whole industry move to New World computing? Not unless Apple is 
demonstrably successful with this approach. So I’d say you’re unlikely to see 
it universally applied to all computing devices within the next couple of 
decades.
But Wednesday’s keynote tells me this is where Apple is going. Plan accordingly.
How long will it take to complete this Old World to New World shift? My guess? 
The end is near when you can bootstrap a new iPad application on an iPad. When 
you can comfortably do that without pining for a traditional desktop, the days 
of Old World computing are officially numbered.
The iPad as a particular device is not necessarily the future of computing. But 
as an ideology, I think it just might be. In hindsight, I think arguments over 
“why would I buy this if I already have a phone and a laptop?” are going to 
seem as silly as “why would I buy an iPod if it has less space than a Nomad?”
---
View my Online Portfolio at:

http://www.onemodelplace.com/CaraQuinn

Follow me on Twitter!

https://twitter.com/ModelCara
