Sunday, March 6, 2016
Hard disks were born in 1956
Hard disks were invented by IBM in 1956, sixty years ago, at about the same time that I started to learn computer programming. The first model was called the RAMAC, and it was bulkier and heavier than a grand piano.
Personally, I was aware of the existence of such storage devices, but I never actually used one. Seeing the clumsy way in which they were transported by a tiny team of human workers, I imagine that these fragile devices were surely in a state of breakdown for much of their existence.
It's interesting to see that the familiar acronym RAM existed already: random-access memory. This was an annoying term, because it gave the impression that the contents of the storage device were accessed, not in a strictly determined fashion, but a little like throwing dice. That, of course, was not true at all. "Random access" simply meant that any record could be reached directly, in much the same time, regardless of where it was stored. The adjective "random" was an example of primordial IBM marketing buzz.
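To make the distinction concrete, here is a minimal sketch, in modern Python (obviously nothing like period IBM code), contrasting a sequential device such as a tape, which must be scanned from the start, with a random-access device, which reaches any address in one step. The storage contents and step counts are invented for illustration; nothing about the access is random.

    # Toy contrast between sequential and random access (illustrative only).
    tape = list(range(1_000_000))  # pretend this list is a reel of tape

    def sequential_read(storage, address):
        """Reach a record by scanning from the beginning, as on tape."""
        steps = 0
        for position, record in enumerate(storage):
            steps += 1
            if position == address:
                return record, steps
        raise IndexError(address)

    def random_access_read(storage, address):
        """Reach a record directly, as in RAM: one step, whatever the address."""
        return storage[address], 1

    print(sequential_read(tape, 999_999))     # (999999, 1000000): a million steps
    print(random_access_read(tape, 999_999))  # (999999, 1): a single step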
Wednesday, February 17, 2016
Who's the American presidential candidate called Watson?
This AI software became famous in 2011 when it defeated the best human contestants to win America's favorite quiz show, Jeopardy!
Since then, there has been a steady US buzz of superlatives aimed at convincing the people of the world (well, let's say, the people of God's Own Country) that this software tool is... well, awesome.
Personally, I got to know IBM quite well, having started my professional career in programming with that company in Australia, in the years 1957 to 1961, before working with their programming teams in Paris and London, in 1962 and 1963. Since then, I've also become quite familiar with the field of artificial intelligence. Well, in my humble opinion, much of what we hear from IBM as far as AI is concerned can be brushed aside as pure marketing buzz, business-oriented hype.
Friday, January 29, 2016
Place in North Sydney where I met up in 1957 with my first IBM computer
Towards the end of 1957, after my second year of studies in the Faculty of Science at the University of Sydney, my student friend Michael Arbib informed me of his recent encounter with the Australian branch of a US company named IBM. Michael had been offered a vacation job with this company, and he invited me to make a similar request. And that's how, in a brand-new North Sydney skyscraper (in 1957, the tallest building in the southern hemisphere), I came to meet up with the IBM 650 machine and a programming language called Fortran. I was therefore just over 17 years old when I started my life-long activities as a computer programmer.
At that time, to travel between the IBM offices and the central Sydney business zone, I used to take a tram across the bridge.
The Miller Street building still exists today, looking small and old-fashioned in the vicinity of modern constructions.
In that photo of a pair of tram lines on the eastern (Pacific Ocean) side of the bridge, we're looking south towards the main city. The tram on the right is moving northwards to the destination indicated below the driver's window: Frenchs Road in the nearby suburb of Willoughby, just beyond North Sydney. Further to the left, we catch a glimpse of the rear end of a tram moving towards the city, whose surprisingly low skyline can be seen further on. Those two tram lines were soon replaced—as Sydney residents now realize—by automobile lanes.
Today, as I sit here in the French countryside, in front of my computer, it's most moving for me to write a few lines about that distant corner of the world where I came into contact with an archaic IBM computer in 1957. In fact, I spent little time at that North Sydney address, because the company soon moved to a more convenient building in Palmer Street, Darlinghurst. It was there that I worked for much of the time (followed by a short period in the Lidcombe offices of IBM) up until my departure for the Old World at the start of 1962.
Labels:
computer programming,
computing,
IBM,
Sydney
Sunday, July 31, 2011
Enough cash to buy the USA
When I was working with IBM Australia back in Sydney during the period 1957-1961, I remember being most impressed by an anecdote designed to reveal the fabulous prosperity of my US employer. Somebody told me that IBM was so wealthy that the corporation could simply pay cash for such-and-such a South American nation… in the "banana republic" category, if I remember rightly. At the time, I wouldn't have been capable of deciding whether or not this was rubbish talk, so I simply believed what I was told, and got on (proudly, no doubt) with my computer programming tasks.
These days, thanks to the Internet, we're more cautious about tales of this kind, since people are more and more capable of verifying the degree of truth in what is being stated. We're no longer obliged to survive in the kind of informational vacuum that shrouded the planet up until recently… except, of course, if your antiquated beliefs, your inbuilt mental structure and your cultural conditioning force you (with or without the Internet) to do so.
Today, we're told (and it's no doubt true) that the Apple corporation has cash reserves of 76 billion dollars, whereas those of the entity known as the USA amount to 73 billion dollars. The latter sum represents what the USA can actually spend before it hits its official national debt limit of well over 14 trillion dollars, illustrated here:
It's said that, if the current US debt were to be materialized in 100-dollar banknotes, the stack of greenbacks would cover a football field up to the height of the left arm of the Statue of Liberty. This explains why a dynamic corporation such as Apple would never—in spite of having enough ready cash to do so—invest in such an unpromising financial affair as God's Own Country.
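For what it's worth, the banknote image can be checked with a little back-of-envelope arithmetic. The figures below are my own approximations (standard US banknote dimensions, an American football field, and the 2011 debt ceiling), not anything from an official source:

    # Back-of-envelope check of the banknote anecdote (all figures approximate).
    debt_dollars = 14.3e12        # roughly the 2011 US debt ceiling
    note_value = 100.0
    note_length_m = 0.156         # a US banknote measures about 156 x 66 mm
    note_width_m = 0.066
    note_thickness_m = 0.000109   # about 0.11 mm per note
    field_area_m2 = 109.7 * 48.8  # an American football field, ~5,350 m^2

    notes = debt_dollars / note_value
    notes_per_layer = field_area_m2 / (note_length_m * note_width_m)
    stack_height_m = (notes / notes_per_layer) * note_thickness_m

    print(f"{notes:.3g} banknotes, stacked {stack_height_m:.0f} m high over the field")
    # -> about 1.43e+11 notes, some 30 m high. The raised arm of the Statue
    #    of Liberty is a few dozen meters above its base, so the anecdote is
    #    at least the right order of magnitude.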

Sunday, April 17, 2011
Moving into a troubled city


Within a few days of my settling down in Paris, I was brought face-to-face with the realities of living in a city in which plastic explosives were being detonated by insurrectionists intent on drawing attention to nasty events on the other side of the Mediterranean. One evening, as I opened the door of my tiny hotel room in the Rue des Écoles (just a few hundred meters away from the Sorbonne), an explosion destroyed a bookshop on the other side of the street. I remember the familiar horn signals of police vehicles against the delicate tinkling (like proverbial Xmas sleigh bells) of glass fragments falling from the shattered windows in the vicinity of the targeted bookshop. A few days later, when I arrived at the IBM building in the Cité du Retiro (just near the Élysée Palace), I learned that an explosion had occurred there during the night. A month later, everything calmed down overnight when the government of Charles de Gaulle signed a peace agreement with the FLN on 18 March 1962 at Évian-les-Bains, in the French Alps.

I would not actually receive the desired document for another three months. During that time, IBM arranged a contact for me in London (since only a French consulate in a foreign land could actually instigate the issue of a work permit to a non-French individual), and it was planned that, as soon as this London contact received a consular request demanding my presence for an interview, I was to drop everything I was doing and jump onto an Air France Caravelle bound for London, enabling me to turn up at the consulate as if I had just taken the London Underground to get there. That trick, which necessitated no fewer than three return trips to London, enabled me to carry on working for IBM in Paris even though I did not yet possess a work permit. Obviously, everybody, both at the Paris prefecture and at the consulate in London, knew that I was playing a silly game, but we were obliged to behave like that in order to obtain the precious document in a manner that was superficially legal. It was finally issued to me on 15 May 1962.

We must not, however, exaggerate. If the French authorities had really wanted to make it easy for me to work legally in France, they would simply have handed me a work permit, instead of expecting me to wander around in their red-tape world (of the Paris prefecture and the London consulate) for three months before issuing me a lousy temporary work permit. In any case, it's almost certain that many French visionaries (including de Gaulle) sensed that the intriguing computer phenomenon, represented ideally by IBM, would play a role in the industrial, scientific and economic future of France.
POST SCRIPTUM: It goes without saying that the work for which I was employed by IBM Europe (programming the IBM 1401 computer), from 12 February 1962 up until 28 September 1962, had nothing whatsoever to do with the above-mentioned punched-card project carried out by IBM France with a view to controlling the Algerian population residing in France at that time. IBM was, as is well known, an emanation of the Hollerith punched-card company, whose most celebrated primordial exploit in data processing (as this activity came to be called) entailed the use of punched cards to process the results of the US census of 1890. So there was nothing particularly exceptional in Papon's use of this same punched-card support, some 70 years later, to store data concerning people in France. As for Maurice Papon, he was finally convicted and jailed for his role in the deportation of Jews from Bordeaux during the Nazi Occupation, and he was also stigmatized (but never actually prosecuted) for the murky aspects of his treatment of Algerians. But it would be an absurd deduction to imagine that there might have been anything intrinsically evil, a priori, in the above-mentioned IBM punched-card project. On the other hand, all this precise and well-organized police data concerning FLN suspects, placed conveniently at the fingertips of Papon, would certainly have made it easier for him to perpetrate evil deeds.
Wednesday, January 26, 2011
Centenary of a computing giant
These days, we hear a lot about the achievements of Apple. I'm unlikely to complain about that, of course, because I've always been totally addicted to the products of Cupertino, from back at the time I wrote my first book about the Mac, in 1984, and even before then, at the pioneering epoch of the Apple II computer.
In the midst of all the talk about the marvelous creations of Steve Jobs, we must never forget, however, that the Big Daddy of computing has always been IBM: a celebrated US corporation that made a name for itself by selling so-called "business machines" on an international scale.
In 2011, the company will be turning 100, which means that it was born in the same year as Tennessee Williams, Ronald Reagan and France's Georges Pompidou. I joined IBM in Sydney towards the end of 1957, and worked as a computer programmer using the Fortran language on a vacuum-tube machine called the IBM 650, whose central memory was housed on a revolving magnetically-coated drum.
The new IBM website designed to celebrate the centenary includes an interesting video on the second-generation transistorized computer that came next: the IBM 1401, seen here in an old marketing photo:
This was the machine I was programming (in a macro-assembler language called Autocoder) at the time I arrived in Paris, in 1962, and started to work at the European headquarters of IBM. Click the above photo to see the video concerning this machine, which shows various former IBMers of my generation.
These days, IBM has embarked upon a colossal computer challenge in the domain of artificial intelligence. Known as Watson (named after Thomas J Watson, the legendary longtime head of IBM), this project aims to get a computer to perform better than human beings in the American TV quiz called Jeopardy! The system, based upon a so-called massively-parallel probabilistic evidence-based architecture, incorporates a vast array of big boxes that have much the same external aspect as the units of an archaic IBM 1401… but you can be sure they do more things!
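To give a rough idea of what "probabilistic evidence-based" means in practice, here is a toy sketch of my own (not IBM's DeepQA code; the candidate answers, features, weights and threshold are all invented): each candidate answer accumulates scores from several independent pieces of evidence, the scores are merged into a single confidence, and the machine only "buzzes" when that confidence is high enough.

    # Toy sketch of evidence-based answer scoring (invented numbers,
    # nothing to do with IBM's real DeepQA implementation).
    import math

    def confidence(evidence_scores, weights):
        """Merge weighted evidence through a logistic squashing function."""
        total = sum(w * s for w, s in zip(weights, evidence_scores))
        return 1.0 / (1.0 + math.exp(-total))

    # Hypothetical evidence features for two candidate answers to a clue:
    # (textual match, date plausibility, answer-type agreement), in [-1, 1].
    weights = [2.0, 1.0, 1.5]
    candidates = {
        "RAMAC":  [0.9, 0.8, 0.7],
        "UNIVAC": [0.4, -0.2, 0.5],
    }

    BUZZ_THRESHOLD = 0.8
    best, conf = max(((answer, confidence(scores, weights))
                      for answer, scores in candidates.items()),
                     key=lambda pair: pair[1])
    if conf > BUZZ_THRESHOLD:
        print(f"Buzz! Answer: {best} (confidence {conf:.2f})")
    else:
        print(f"Stay silent (best guess {best}, confidence only {conf:.2f})")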
AFTERTHOUGHT: It's good, in a way, that IBM has been somewhat out of the limelight for many years, compared to companies such as Microsoft, Apple and Google. That has enabled IBM to move ahead quietly and steadily in a field such as artificial intelligence without too much media interference. But this situation is likely to change in spectacular fashion as soon as Watson starts to bare its teeth… which is exactly what's happening at this very moment. Personally, I would not hesitate for a moment in declaring that a project such as Watson represents one of the greatest human challenges of all time: the invention of a deus ex machina that seems to be approaching the spirit of IBM's famous slogan, Think.
I used to dream about that challenge back in the early '70s, when I was making a series of documentaries on this subject in the USA, for French TV, and writing my book on artificial intelligence.
And I still do, today, more than ever… particularly since scholars such as Richard Dawkins and Steven Pinker have convinced me that we human beings are "merely" a special kind of machine, imbued with a strange property (not yet understood, of course) referred to as consciousness.
ANECDOTE: You might wonder why software engineers at Google and elsewhere have been scanning vast libraries of books of all kinds, and making them freely available to researchers. Are the corporations and engineers doing this because they want to offer more and more reading material, philanthropically, to old-timers such as you and me? Don't be naive! They're building those vast digital libraries for readers of a new kind: future generations of intelligent computers.
BREAKING NEWS: Stephen Wolfram, in his blog [display], seems to believe that IBM's Watson will win the forthcoming Jeopardy TV event. Moreover, he is encouraging IBM… even though Watson is a competitor of his own approach: the Wolfram|Alpha system.
Labels:
Apple,
artificial intelligence,
Google,
IBM