Robert Benschop The Humane Computer

First of all, I'm not a developer and not a specialist of any particular kind; I'm just an enthusiastic Newton user, or rather a Newton evangelist. Over the years I've converted loads of people to using Newtons, out of my conviction that this is the best platform designed so far in the history of computing. So I would like to share with you why that is and what I've seen over the years from the sidelines, mostly from a user's perspective.

From the moment computers entered the world they have been an inspiration for people's creative fantasies, from most science fiction writers to Gene Roddenberry's Star Trek. Star Trek in particular has managed to capture what most people still feel a computer should be like: a seemingly simple device that can execute otherwise tedious tasks in fractions of a second. Moreover, it shows the simplest of interfaces; one can just ask questions or give orders as one would with another human being, and as long as the computer is actually able to execute those orders, it will.
As a side note: it might be interesting that the computer in Star Trek often takes centre stage precisely when it can't execute its orders, something that some code writers seem to have taken a little too much as an inspiration.

In reality, the computers in use in 1966, when the first Star Trek series aired, still used tapes and punch cards and were operated through a command line interface, in other words by typed commands.
A far cry from the dreams of Roddenberry and the writers of the series.

In server and mainframe computing, the people working with the machines have always been skilled enough that there hasn't been much necessity to move away from this, though there have been improvements there as well.
The field where a much more dramatic change in interface was needed was personal computing.

If the general audience was to adopt and use the computer, it simply had to have a much easier and more humane interface.

In the early 80's the field, both in mainframe computing and in the still quite young field of personal computing, was dominated by computers controlled through a command line interface: typed commands were the way to go.
Hence personal computers were only used by people who took the effort to learn this, which kept them away from the general audience.
There was some acceptance nonetheless, both in the professional field, where people were seeking to cut costs, and mostly among private citizens who liked the technological advancement.

In 1983 the first and biggest revolution in personal computing so far took place, when Apple launched the Apple Lisa, the first personal computer aimed at a broad market to use a Graphical User Interface (GUI). The GUI concept was originally developed at the Xerox Palo Alto Research Center, and Apple acquired access to it.
But Apple didn't license its OS and kept the prices of its computers relatively high, making a handsome profit along the way but keeping the platform from the larger part of the general audience.

Apple users would laugh at people using MS-DOS, and MS-DOS users would argue that they didn't need all that fancy stuff, only to prove otherwise themselves with the runaway success of Windows 95, the first Windows version that offered a relatively stable environment accessible through a GUI. The word stable here is relative to earlier Windows versions, not to other platforms.

So the general audience itself has proven the point: if the price is low enough, people will choose the cheapest version of a product, or at least the one whose interface is best at a reasonable price compared to the rest of the market. Or, in other words, the one that is easiest to use.
Recently this point has been proven again by the runaway success of the iPod: though not the cheapest, it is one of the best in its field, with one of the best interfaces and a price that isn't too far from competing offers in the market.
Of course I'm not going into the details of the marketing of the iPod or Windows 95, because that would take up another paper, other than noting that a good product that isn't marketed right will rarely make it.
I will refer to this again a little later when we come to the Newton.

So the GUI was the first step to a humane computer, one that we can interact with on a natural level.

Now the paradox is that so many people have gotten so used to using their computers the way they use them that it's really hard for them to identify the problems they actually have interfacing with them.
To take the example of the iPod once more: most people didn't have a clue that their mp3 player was quite awful to use until they realized there was a player out there that did the job as well or better, but with much less hassle.

The strangest thing in all this is that, in the meantime, Star Trek goes on with new episodes and new actors, and its computer keeps interfacing in a natural way and never seems to catch a virus, while the day-to-day reality of most people is the opposite, yet is still accepted. I guess it's all the aliens and science fiction stuff clouding our vision.

From the first days of the technological revolution there has always been a huge discrepancy between what the tech geeks thought up and what the general audience has perceived as user-friendly technology.
Or, in other words, most technology has been plain awful to use for normal mortals like you and me.

As a recent example: early this summer the top executives of Philips Electronics took their own products home to try them out for a weekend. Much to their shock and dismay, most of them couldn't figure out most of the functions and found their own products much too complicated.
They immediately issued directives that their technology should be much easier to use.
While this might be comforting in the long run, provided it isn't forgotten soon, in the short run it won't do much good, because these kinds of changes take a long time to take hold in most large electronics companies.

In the meantime we have seen quite a few technologies that promised to make it to the mainstream and change the way we interact with computers but never did, virtual reality the most prominent among them.

Taking us back to the point in history where we took off: in the middle of all this, actually two years before Windows 95, came the second revolution in personal computing, and it went largely unnoticed by the general audience: the birth of the Newton and the Newton OS. It was of course noticed by the technological community, and the Newton OS received an award at CeBIT, but the general audience mostly took notice of the shortcomings of the handwriting recognition (HWR) engine, thanks to Garry Trudeau and his Doonesbury cartoon, and the improvements in the HWR later on went largely unnoticed.

Of course there were other reasons: again, the marketing was far from perfect, a problem Apple had for quite a while and up to a point still suffers from, and the price point at the time of release was very high.
But here we won't focus on the reasons for the relative commercial failure, but instead on what made, and still makes, the Newton OS a great OS, and why it's still used by thousands of people around the globe.


Robert Benschop is a professional photographer. He probably epitomizes the Newton Power User. The Newton has never failed to astound him in all these years, and he will explain why the Newton is still the leader of the pack.