
Saturday, November 5, 2016

Mac-based psychotherapy experience

Over a year ago, in July 2015, inside my house at Gamone, I suffered a severe accident. After consuming a little too much tasty white wine in the warm weather, I fell down the stairs and bumped my head. As a doctor told me later, I could have easily killed myself. My son François assumed the harsh task of taking a train from Brittany down to my region, and then driving me and my dog back up to Brittany. There, I was housed and cared for, not only by my son, but also by his mother Christine and her companion Michel. It was rough work for them, for several months, since I wasn’t an easy patient. To cut a long story short, I finally survived, thanks to my family and several skilled medical specialists, who patched me up remarkably well.

Since I was accompanied to Brittany by my Macintosh computer, I tried as best I could to use it… but some of my previous Internet skills had been bumped into the backwoods by my accident. Personally, I was totally convinced that my basic technical intelligence—that's to say, my computer skills—was intact… which corresponded to the official medical evaluations of the patient. Little by little, through playing around with my faithful Macintosh, I was able to confirm, slowly but surely, that most of my former Internet functionality was indeed operational, although there were several technical zones in which I continued to behave a little shakily.


Over the months that followed, right up to and including today, I have been able to use my hardware (including a new iMac and several external disks) to confirm that I know what I'm doing in the Internet domain. It's not an exaggeration to say that my friendly Mac world has been playing a significant role as a psychotherapy guide and yardstick. For example, ever since the accident, I had found it impossible to link a powerful external disk to either of my two iMac computers. It was only yesterday that I played around with this external disk long enough to discover that it must have been screwed up a little by contact with somebody's PC environment, and that I would have to reformat and reinitialize it. I lost no time in doing so, and everything returned to normal… as it had been before my accident.

The most amazing thing of all is that I am now tackling various aspects of my familiar Macintosh world in a more rigorous manner than before my fall down the stairs. The therapy challenges have made me an even better Mac user than I used to be.

Wednesday, September 21, 2016

Flash is about to disappear


Once upon a time, Flash was the coolest kid on the block. I worked hard to master it. Most of the old websites of which I'm proudest today were created in Flash. In my wildest dreams, I never imagined for an instant that all these websites would disappear in the near future, simply because no browser would be prepared to display them.

I've just heard that, soon, none of Safari, Chrome or Firefox will be prepared to display Flash websites.

Theoretically, I might be able to retrieve images from my Flash websites, before they disappear forever, and then rebuild them in HTML 5. I plan to examine this idea, but I'm not sure that it would be either easy or worthwhile. Here is a typical example of the French/English websites that are due to disappear: Master Bruno.

A similar calamity occurred with the Apple Pages tool, which subsided into a brain-damaged state a few years ago, losing many of its major capabilities, because its owner wanted to offer a common denominator of talents that could be demonstrated, not only on an iMac, but also on an iPad or iPhone. Personally, I find that goal ridiculous. It's akin to taking a schoolboy and an Olympic athlete, and asking them to be trained together to run the hundred metres in much the same time. One gets pepped up with pills; the other gets castrated.

Sunday, March 6, 2016

Hard disks were born in 1956

Hard disks were first created by IBM in 1956, sixty years ago, at about the same time that I started to learn computer programming. The first model was called the RAMAC, and it was bulkier and heavier than a grand piano.


Personally, I was aware of the existence of such storage devices, but I never actually used one. Seeing the clumsy way in which they were transported by a tiny team of human workers, I imagine that these fragile devices were surely in a state of breakdown for much of their existence.

It's interesting to see that the notion of "random access" existed already, as in the familiar acronym RAM: random-access memory. This was an annoying term, because it gave the impression that the contents of the storage device were not in fact accessed in a strictly determined fashion, but a little as if one were throwing dice. That, of course, was not really true. The adjective "random" was an example of primordial IBM marketing buzz.

Friday, January 29, 2016

Place in North Sydney where I met up in 1957 with my first IBM computer

Towards the end of 1957, after my second year of studies in the Faculty of Science at the University of Sydney, my student friend Michael Arbib informed me of his recent encounter with the Australian branch of a US company named IBM. Michael had been offered a vacation job with this company, and he invited me to make a similar request. And that's how, in a brand-new North Sydney skyscraper (in 1957, the tallest building in the southern hemisphere), I came to meet up with the IBM 650 machine and a programming language called Fortran. I was therefore just over 17 years old when I started my life-long activities as a computer programmer.


The Miller Street building still exists today, looking small and old-fashioned in the vicinity of modern constructions.


At that time, to travel between the IBM offices and the central Sydney business zone, I used to take a tram across the bridge.


In that photo of a pair of tram lines on the eastern (Pacific Ocean) side of the bridge, we're looking south towards the main city. The tram on the right is moving northwards to the destination indicated below the driver's window: Frenchs Road in the nearby suburb of Willoughby, just beyond North Sydney. Further to the left, we catch a glimpse of the rear end of a tram moving towards the city, whose surprisingly low skyline can be seen further on. Those two tram lines were soon replaced—as Sydney residents now realize—by automobile lanes.

Today, as I sit here in the French countryside, in front of my computer, it's most moving for me to write a few lines about that distant corner of the world where I came into contact with an archaic IBM computer in 1957. In fact, I spent little time at that North Sydney address, because the company soon moved to a more convenient building in Palmer Street, Darlinghurst. It was there that I worked for much of the time (followed by a short period in the Lidcombe offices of IBM) up until my departure for the Old World at the start of 1962.

Saturday, June 23, 2012

Elusive Turing

Today is the centenary of the birth of Alan Turing [1912-1954]. He is represented in Manchester by this park-bench sculpture, which includes the cyanide-laced apple that killed the genius.


Google has celebrated the centenary by creating an ingenious doodle representing a Turing machine, but it takes some time and effort to figure out what it's supposed to do.


In my earlier blog post entitled Turing that unknown [display], I suggested that it's not easy to grasp what exactly Turing achieved. Fortunately, the US computer-science author Charles Petzold has offered us an excellent book, The Annotated Turing, which explains precisely the achievements of Turing.


While it's true that Turing's contribution to the British war effort at Bletchley Park was invaluable, his achievements in code-breaking were not the reason why we consider Turing today as the patriarch of computing. Likewise, while we appreciate Turing's suggestion about considering convincing man/machine conversations as a criterion for so-called artificial intelligence, this too was not really an all-important factor in Turing's claim to fame. So, why is Turing so greatly admired by computer scientists?

Well, his invention of the abstract concept of a so-called Turing machine (like the one in the Google doodle) threw light upon the limitations of algorithmic devices such as computers. More precisely, to use a horrible German term, Turing demonstrated that the Entscheidungsproblem cannot be solved. And what is this exotic beast? You might call it the "mission accomplished" problem. Like George W Bush with his war games, computers will remain forever incapable of determining beforehand whether or not a certain computing challenge can indeed be handled successfully. Turing taught us that the only way of knowing whether or not a computer can handle such-and-such a complex challenge is to set the machine into action and see whether or not it soon halts with a solution.
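For readers who like to see the argument in concrete form, here is a minimal Python sketch of the closely related halting-problem reasoning. The functions halts and contrary are purely illustrative inventions of mine: Turing's whole point is that no genuine halts function can ever be written.

    # Hypothetical oracle: returns True if program(data) would eventually halt.
    # Turing's diagonal argument shows that no such general test can exist.
    def halts(program, data):
        raise NotImplementedError("no general halting test can exist")

    # A program built to contradict whatever the oracle predicts about it.
    def contrary(program):
        if halts(program, program):
            while True:      # the oracle said "halts", so loop forever
                pass
        else:
            return           # the oracle said "loops forever", so halt at once

    # Asking halts(contrary, contrary) is the fatal question: if the oracle
    # answered True, contrary would loop forever; if it answered False,
    # contrary would halt. Either way the oracle would be wrong, so it
    # cannot exist.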

You might say that Turing proved that the proof of the computer pudding is in the computing.

Tuesday, March 27, 2012

Raspberry Pi basic computer

Normally, if all goes as planned, I'll be able to place an order tomorrow morning for a Raspberry Pi computer, for 40 euros. Click here to visit their website, to see what it's all about.


If I understand correctly, the development of this low-cost computer was masterminded by a fellow named Eben Upton and several colleagues at the Computer Laboratory of the University of Cambridge.


I couldn't agree more with Eben's belief that young hobbyist programmers need a gadget of this kind if they wish to become hackers... in the original noble sense of this term: skilled specialists capable of getting computers to perform amazing tricks.


Long ago, I remember hearing an American designate the primitive French 2-horsepower Citroën as "basic car". Well, we might say that the Raspberry Pi is "basic computer". When you pay your 40 euros, you get the bare minimum, with no frills. To get it to do interesting things, you're expected to add on all the necessary bells and whistles, which will inevitably involve creating your own software. And that's exactly what makes the Raspberry Pi an ideal gadget for bright individuals who are determined to master computer programming.

Monday, March 26, 2012

Calculating for dummies

Some of my readers might not get very far into this blog post, because calculating is not exactly an exciting subject, particularly when it's "for dummies". That's a pity, though, because the demonstration that I'm about to provide is really quite amazing. I'm going to show you how to obtain a relatively precise value of pi without having to perform any serious mathematics whatsoever.

I can hear a wag saying that you can merely look up the value of pi in Google! Fair enough, but I'm talking here about a method of actually calculating pi, from scratch, rather than simply looking up the value. The final calculation involves little more than a bit of counting followed by a multiplication operation. So, let's go.

To perform the operations I'm about to describe, you'll need a device that fires some kind of projectiles in such a way that you can clearly distinguish their points of impact. An ideal device, for example, would be a so-called air gun that fires birdshot pellets, known as BB slugs.


Having made this high-tech suggestion, let me point out immediately that you can perform the required operations using far more down-to-earth resources. For example, you might use some kind of sticky goo such as chewing gum, or children's putty.


The only requirement is that you must be able to determine precisely the point of impact of each projectile. Marbles or pebbles have to be ruled out because it's almost impossible to determine their points of impact when thrown at a target. So, let's suppose that you've obtained some kind of suitable device...

• Obtain a big square of white cardboard, the bigger the better, and place it flat on the ground beneath a tall tree. Make sure it doesn't move, maybe with the help of a couple of metal spikes.

• Armed with your airgun, or whatever, and a good supply of projectiles, climb up into the tree, high above the square of white cardboard... which will be used as your target. [I forgot to point out that you should probably let your neighbors know beforehand that you're conducting a scientific experiment in computing... otherwise they might become unnecessarily alarmed.]

• Now, here's the essential part of the calculation procedure. You're expected to fire projectiles (slugs, chewing gum, goo, whatever) in the vague general direction of the square of white cardboard down on the ground. Above all, you have to fire at the cardboard in a totally random fashion, without ever aiming deliberately at any particular region of the square. In other words, your projectiles are expected to produce impacts that are scattered all over the cardboard in a completely random fashion. Indeed, if ever you aimed carefully, and you were such a good marksman that all your projectiles hit the middle of the cardboard, then the method I'm describing would not work at all.

• You're expected to carry on bombarding the target with projectiles for as long as possible, until the cardboard is completely covered in impacts.

• When you've produced a huge number of randomly-located impacts (let's say, to be generous, a few tens of thousands), climb down out of the tree and examine meticulously the bombarded square of cardboard. You will have understood by now that my method of "calculating for dummies" is a little weird. Call it a thought experiment, if you prefer.

• Using a corner of the cardboard as the center, draw a circle whose radius is equal to the length of a side of the square. Your big square of cardboard should look something like this:


• In the above representation, we've introduced a color code, to simplify our explanations. Points of impact inside the quadrant of the circle are indicated in red, and the others in blue.

• Start out by counting the number of red impacts, inside the quadrant, which we shall designate as Q. Then count the total number of impacts on the cardboard square, red + blue, which we shall designate as T.

• Divide Q by T, and multiply the result by 4. This will give you an approximate value of pi.

It's easy to understand why this counting procedure should provide us with the value of pi. Consider the ratio of the area of the quadrant to that of the square. Elementary geometry tells us that this ratio is pi divided by 4. And, provided the impacts are scattered randomly over the entire square, we can see intuitively that Q divided by T should be a good approximation to this same ratio. To put it in simple terms, the quantity of impacts in any particular zone indicates, as it were, the relative area of that zone.
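Expressed as a formula, with r denoting the side of the square (and hence also the radius of the quadrant):

$$\frac{\text{area of quadrant}}{\text{area of square}} = \frac{\pi r^2/4}{r^2} = \frac{\pi}{4} \approx \frac{Q}{T}, \qquad \text{so} \qquad \pi \approx 4\,\frac{Q}{T}$$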


This approach to calculations was named in honor of one of the world's most prestigious gambling temples: the Monte Carlo casino in Monaco, on the French Riviera. When you use the Monte Carlo approach on a computer, you no longer need an airgun and BB slugs to produce your set of arbitrary points. You simply use an application capable of generating random numbers.
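By way of illustration, here is a minimal Python sketch of that computerized version of the experiment. The function name estimate_pi and the default number of shots are simply illustrative choices of mine:

    import random

    def estimate_pi(shots=100_000):
        # Scatter random points over a unit square and count those that fall
        # inside the quarter-circle of radius 1 centred on one corner.
        q = 0                                   # impacts inside the quadrant (Q)
        for _ in range(shots):                  # total impacts (T)
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                q += 1
        return 4.0 * q / shots                  # pi is roughly 4 * Q / T

    print(estimate_pi())    # typically prints a value close to 3.14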

The Monte Carlo method of problem solving was invented in 1947 by John von Neumann and two of his colleagues, Stan Ulam and Nick Metropolis, at the Los Alamos National Laboratory in New Mexico. A small group of brilliant scientists, many of whom had recently arrived in the USA, had come together with the intention of designing one of the world's first full-fledged electronic computers, named MANIAC, to be used primarily as a development tool for the hydrogen bomb.




When I started work as a computer programmer with IBM Australia in 1957, the Monte Carlo method had reached the zenith of its popularity as an almost magical problem-solving approach, which fascinated all of us. Today, over half a century later, Monte Carlo computational algorithms are still in widespread use in many simulation contexts.


The Monte Carlo method has an entire chapter devoted to it in the middle of George Dyson's interesting and instructive history of computing, Turing's Cathedral.

Monday, January 2, 2012

Computer crash

This morning, my Macintosh computer refused to start up. All I got on the screen was a gray Apple icon and a little revolving circle of dashes. Clearly, my Intel iMac (purchased in April 2010) had crashed. The people at the Fnac store in Valence had kindly warned me, 20 months ago, that Apple products are now manufactured in China, and that they seem to break down more often and sooner than before. So, they advised me to take out insurance, to cover repairs. I'll be depositing my machine with them tomorrow morning, and I should have it back home in a fortnight. And repairs will be covered by the insurance.

Meanwhile, I'm using my old iMac, purchased in 2005, which still runs perfectly (using the Leopard system). This is the machine I was already using regularly to run the precious FreeHand tool for genealogical charts, which is not available for the latest Mac system.

I've already hooked up the external disk that was being used for automatic daily backups on the iMac that crashed, and I'm relieved to see that everything is there, intact. Normally, on my Time Machine disk, I should have copies of the most recent stuff I was writing (about the famous thatched house in Blackbird Street) just before I went to bed last night. So, I won't have lost anything at all... apart from time spent driving to Valence and back. And it's unsettling to have to move back to an older computer system for a while.

Friday, September 16, 2011

No problems



It's easy if you try.

Friday, March 25, 2011

In front of what?

Friends see that I follow current affairs on the web (including events in my native land). Then they hear me raving on about my blogging, my Internet-assisted genealogical research, my use of word processing for creative writing and, now, my intense involvement with the complex domain of Macintosh and iPad programming. Inevitably, they pop the obvious question: How many hours a day do you spend in front of your computer screen? This question annoys me, because I can see their brains ticking over and getting ready to subtract my answer from 24, obtaining X, enabling them to conclude: This poor guy only lives in the real world for X short hours a day!

Their question is indeed poorly worded. No doubt poorly conceived. A more significant question would be: How many hours a day do you spend in front of your brain, your reflexions, your intelligence, your background, your culture, your identity, your ambitions, your creative activities, your intellectual projects, your passions, your destiny, etc…? And my answer would be something in the vicinity of 17 to 18. In other words, I have little spare time to waste, to be bored.

Back in Paris, when I worked as a technical writer in the high-powered ILOG software company (now a part of IBM), my fellow-workers used to laugh about a cleaning lady who, before dusting down a computer screen, would always say to the user, politely: "Excuse me, give me half a minute to clean your telly." Her use of the term "telly" gave us the impression that she looked upon our group of ILOG software engineers (who often worked late into the evening) as a joyous throng of guys and gals who seemed to be paid to spend hours on end watching mysterious TV shows, in languages that they alone could comprehend. Well, she wasn't really wrong. Except that purists would have pointed out that our screens didn't capture and display the heavenly signals designated as TV, but something a little different, emanating from within our "tellies". We were watching and appreciating shows that we ourselves had just produced. But none of us had the courage (nor the desire, for that matter) to attempt to explain that situation to the cleaning lady.

In a similar sense, I wonder if there's any point in trying to explain to friends, today, that the vast time I seem to spend sitting in front of a computer screen is not simply "time spent sitting in front of a computer screen". It's much more than that. As I suggested earlier on, I'm seated, for much of the time, in front of… myself! Introspection, maybe, or even narcissism. I would speak rather of computer-assisted cogitations or meditation. Much more, in any case, than dumb screen-watching.

To my mind, in terms of wasting time, there are worse things than a computer screen to be seated in front of. For example, the steering wheel of an automobile. Or fellow passengers in public transport (trains, buses, trams, etc). Sitting in front of a TV screen in certain English-speaking societies (which I hardly need to name), or their media in general, can be a most effective way of plowing mindlessly through time. Personally, I would not willingly swap the least amount of computer screen-watching for, say, time spent waiting to be served in a dull restaurant offering poor-quality food. But the deal would be off, of course, if I happened to be dining on a warm evening, say, in Arles with a dear Provençal friend [display]. It's not so much a question of where you're sitting, but rather a matter of the quality of the entity in front of which you're seated!

I don't deny that spending hours in front of a computer screen might, in certain circumstances, be thought of as a waste of time. (But who am I to judge?) Maybe that's why I detest all kinds of games (including bridge evenings with suburban neighbors… who don't exist here, fortunately, at Choranche). On my Macintosh, there has never been anything that looks remotely like a video game. I hate all that fake stuff. On the other hand, it's a fact that I can "waste" precious time gazing up at the Cournouze, or down into the eyes of my dogs. As I said, it's not so much where you decide to sit down, but rather what you want to watch. And I would be a liar if I were to suggest that I don't like spending a lot of time watching what's happening on the screen of my faithful Macintosh. I hasten to add that I'm also very fond of my splendid TV screen, and vaguely concerned (when it's absolutely necessary, which is rare) by the relatively insipid screens of my iPad and iPhone.

Sunday, September 19, 2010

Laptops to lead Aussie kids "out of poverty"

I was surprised by a recent article in the Australian press with a shock title: "Looking to laptops to lead Doomadgee children out of poverty". A photo showed a group of kids, mostly Aboriginal, holding up their machines for a corny staged shot.

There were several reasons for my surprise:

• It shocks me to see a newspaper headline stating explicitly that certain Aussie kids are apparently living in poverty. That's a strong word, which outside observers don't generally associate with citizens of Australia.

• The notion that laptops might be capable of "leading children out of poverty" is outlandish, and hard to believe.

• I'm familiar with the project entitled One Laptop Per Child, conceived by the US computing academic and visionary Nicholas Negroponte. I wrote a blog article on this subject, entitled Fabulous educational project [display], back in October 2007. I had always imagined that the children to be assisted by Negroponte's wonderful mission belonged to so-called developing nations. It's an almost unpleasant surprise to find scores of Australian children, throughout the land, included in the bunch of recipients of these low-cost laptops.

Readers should visit the website of the excellent Australian organization handling this project. You'll be able to reach your own conclusions concerning this project in Australia… and I'm aware that you won't necessarily react negatively, as I have done. I'm not suggesting for a moment that there's anything wrong with this plan to hand out cheap laptops to kids in Australia. I'm merely pointing out that it's a charitable enterprise, initially designed for Third-World inhabitants, and that it's weird to see my native land falling back upon international US-inspired charity in order to solve internal educational problems.

The spirit of such an initiative is surely that of the celebrated Chinese proverb: "Give a man a fish and you feed him for a day. Teach a man to fish and you feed him for a lifetime." What shocks me, I guess, is that it's not directly the Australian ministry in charge of education—assisted, maybe, by philanthropists and industry—that is teaching these under-privileged kids to "fish" with computers (and the Internet).

ADDENDUM: In my initial post on this subject, I suggested that, in Third-World villages in places such as Africa, electrical power for the laptops could be generated by cyclists. I'm happy to see that there's now a device on the market to meet this challenge.

Admittedly, if poverty has reached the point at which, due to malnutrition, it's impossible to find a sturdy cyclist, then we're stuck with a real problem. I must talk with Lance Armstrong, one of these days, to see if he has any worthwhile ideas on this question...

Monday, May 31, 2010

Doing things on a computer

Using my iMac to communicate through blogs is an interesting activity. In associated domains, I'm fond of Twitter, but I see it as subservient to blogging, or simply as a convenient means of pointing to exceptional things on the web. On the other hand, I get bored by Tweeters such as Nassim Nicholas Taleb (the Black Swan guy) who bend over backwards in attempts to impress us with 140-character aphorisms. As for Facebook, I find it totally uninteresting, if not vulgar.

I've become accustomed to using my iMac in two or three other ways. Above all, I devote a lot of energy to writing, using the excellent Pages tool from Apple. I've also built various websites, mainly for fun. A typical example is this short presentation of the medieval hermit Bruno [1030-1101] who inspired the foundation of the order of Chartreux monks:


The following archaic example is an online sales demo that I produced for a competition. I rarely show it to anybody these days, because it incorporates unpleasant audio clicks, which I put in deliberately (a decade ago, I thought that was smart). I've lost the source code, otherwise I would eliminate these annoying sounds:


To build these websites, I've been using a tool named Flash, now marketed by Adobe. Long ago, before getting carried away by Flash, I used to create conventional HTML websites by means of a dull tool named Dreamweaver, also marketed now by Adobe. Here's a satirical example, designed in pure HTML, which dates from 2003:


Today, alas, a big problem has arisen concerning Flash: Steve Jobs doesn't like it, and he prohibits it on both the iPhone and the iPad!


Click the above photo to access an article entitled Thoughts on Flash in which the CEO of Apple makes it clear why there won't be any Flash stuff turning up on their iPad device.

Let's suppose that, contrary to my article of February 2010 entitled Second look at iPad weaknesses [display], I were to become concerned by, or even interested in, this new device… primarily because of its potential in the domain of electronic books. If this shift in attitude were to occur (as I think it will), then what should I do about my longstanding commitment to Flash? The answer to that question reflects the fact that "longstanding commitments" simply don't exist in the computing domain, where things are evolving constantly, and we have to accept all kinds of changes, including those that look at first like disturbances. So, obviously, I should abandon Flash… But what should I put in its place?

Steve Jobs provides us with a serious answer, maybe the only serious answer: HTML5, that's to say, the upgraded variety of HTML that the World Wide Web Consortium is currently examining. Apparently, there are significant parts of this future standard that are already operational, as long as you build your sites by means of a "good editor" (such as the latest version of Dreamweaver), and read them with a "good browser" (such as Safari). And of course, any vague feeling you might have that the computing world is becoming more and more Apple-dominated is just pure coincidence…

But that's not all. I wrote my first computer programs in 1958, when I was working with IBM in Sydney. Today, I'm still fascinated by computer programming, but purely as a hobbyist. If this new beast known as the iPad is here to stay (as would appear to be the case, at least for a while), then I've decided that it might be a good idea to learn how to write programs for it. In that way, I would surely feel less frustrated about abandoning Flash, whose scripting was a kind of Canada Dry ersatz for real programming.

Thursday, May 20, 2010

Optical illusion

This amazing little video was produced by pointing a camera at a real-world scene composed of four tracks that guide the movements of rolling marbles:



As soon as the table is turned around, providing you with a view of the scene from the opposite side, you can immediately see the tricks behind the illusion. The tracks simply slope downward towards their intersection, and they're designed in such a way that, when viewed from a certain angle, they seem to slope upward. The creator of this excellent illusion is Koukichi Sugihara of the Meiji Institute for the Advanced Study of Mathematical Sciences, Japan.

Friday, May 7, 2010

Photos from my time at Cap France

My friend Yves Tallineau has just sent me a couple of photos dating from 1969, when we were working together in Paris in a software company named Cap France. More precisely, we were operating an in-house training department, located in the Avenue du Général Foy (near the St-Augustin church), aimed at teaching groups of fellow employees of Cap France how to program the IBM 370 computer.

In those days, programmers usually told computers what was to be done by recording programming instructions in the form of punched cards. These were produced manually (generally by the programmers themselves) on IBM card-punch machines. Here we see one of our trainees punching cards for her program, which was probably developed in the Cobol language.

As an extra task, Yves and I once produced a couple of audiovisual presentations concerning the company's two software products, called Autoflow and Sysif. In the following photo, I'm using scissors and adhesive tape to edit an audio tape on a Revox tape recorder.

In fact, at that time, I was attending evening classes organized by Pierre Schaeffer in the musique concrète studios of the research service of the French Broadcasting System. That explains how I had become proficient in audio tape editing.

A few months after the time at which this photo was taken (towards the end of 1969), I decided to leave Cap France and accept an offer to work as a salaried engineer with Schaeffer. As a result of that change in my professional existence, I soon became involved in computer music, television production, artificial intelligence and writing. But that's another long story.

Monday, February 1, 2010

Second look at iPad weaknesses

Concerning Apple's iPad, my recent article entitled Latest creation [display] was inadvertently but grossly over-enthusiastic. Preoccupied by the amusing phenomenon of Steve Jobs presenting his latest Apple baby, I did not even attempt to say what I thought personally about this new device.

Well, having looked into this affair a little more closely, let me now say that I fear the iPad will be a total marketing flop. Why? Simply because I can imagine few reasons why anybody would ever want to use such a gadget.

For a moment, I had imagined the iPad as a blown-up version of the iPhone. This, of course, was poor thinking on my part: I was forgetting that you can't make phone calls with an iPad. Located midway between an iPod touch (an iTunes reader and portable game player) and a full-blown computer such as a MacBook, the iPad might be thought of as combining the advantages of both. Well, I now believe that this is not in fact the case. In trying to be a little bit of a mobile device, and a little bit of a true computer, the iPad turns out to be neither!

A particular aspect of the iPad shocks me greatly. Like the iPhone, it won't display Flash websites. From that point of view, the iPad reminds me of a French novel entitled La disparition, written by Georges Perec [1936-1982], a novel that doesn't contain a single instance of the letter "e", normally the most widely-occurring vowel in the French language. In the same way that I wouldn't rush to buy a gimmick novel that doesn't contain the letter "e", I wouldn't rush to purchase a gimmick Internet machine that doesn't offer Flash.

And why exactly is it so important for me (as for millions of other web-users throughout the world) to have a computer that can handle Flash? Let's start with this blog. Normally, in the right-hand column, there are various small banners pointing to my associated websites. Well, if your computer can't read Flash stuff, you simply won't see any of these links. Over the last few years, I've built a score of websites on all kinds of subjects ranging from my personal genealogy through to cultural stuff about the medieval hermit Bruno who's considered today as the founder of the Chartreux order of monks. Well, without Flash, you won't be able to examine the slightest element of all this work of mine. And a corollary of this antiquated state of affairs is that I wouldn't be able to use an iPad to modify anything whatsoever in my web creations. So, to my mind, the iPad gadget is strictly for exotic individuals with specialized computing needs such as Beefeaters in the Tower of London, Druids, Mormons, six-day bike-riders, Creationists and other yokels.

Having said this, I hasten to add that, if anybody were to send me an iPad as a gift, I would be immensely happy to receive it. I would pass it on immediately to the neighboring kids in Châtelus, on the other side of the Bourne, who love to play games. As for me, I'm too old for that. Besides, in all my life, I've never, at any moment, been an inveterate games-player. For me, there has always been only one big game, with fascinating and mysterious rules, called Life. Nothing to do with iLife.

POST-SCRIPTUM: Somebody extracted all the positive words and expressions employed by Steve Jobs and other Apple executives during the recent presentation of the iPad, and strung them all together in the following video:



It's hardly reassuring to find that a new product needs such excessive verbal icing sugar.

Thursday, October 29, 2009

Two birthdays

Today's Google banner in France celebrates the 50th birthday of the comic-strip characters Astérix and Obélix, who appeared for the first time in an issue of the Pilote magazine dated 29 October 1959.

René Goscinny [1926-1977] created the humorous scenarios while Albert Uderzo [born in 1927] did the drawings.

We also celebrate today the 40th anniversary of the first message sent from one computer to another through a primitive network, which finally blossomed into the Internet. At the University of California in Los Angeles, on 29 October 1969, Professor Leonard Kleinrock and a student programmer, Charley Kline, attempted to log in to a remote computer located at the Stanford Research Institute in Menlo Park. For this to happen, the second computer needed to receive the five letters LOGIN from the first computer, but the system crashed after the reception of only the first two letters. So, the world's first net message turned out to be LO. Here's the story, told by Kleinrock himself:



I'm often surprised to think that, when I visited the USA in the early '70s to shoot documentaries about so-called artificial intelligence for French TV, I imagined that we were already living in a fascinating computer world. In fact, the big surprises—personal computers and the Internet—were still quite far away in the future. I realize now that the computing context I discovered and filmed—characterized by PDP hardware, LISP software and often overblown evaluations of accomplishments and promises—was relatively primitive compared with today's world.

Monday, October 26, 2009

Aggressive Apple ads

I would imagine that Microsoft has had enough time and experience by now to get its act together at the operating-system level, in which case Windows 7 should normally be one of the finest and friendliest PC products that could possibly exist. Maybe we'll even discover that it has a nicer look and feel than Leopard and Snow Leopard on the Mac. Who knows? Computing is such an awesome domain that anything could happen. In any case, it will be interesting for certain Mac users (maybe including myself), in the near future, to take a look at Windows 7 in a Boot Camp environment on an iMac, to see if it's a good solution for certain kinds of work. For example, I still dream about being able to use a powerful word processor such as Adobe FrameMaker—which no longer exists on the Mac—for my writing, particularly in the genealogical domain.

Meanwhile, Apple has reacted to the arrival of Windows 7 by an aggressive publicity campaign intended to tell PC users that, instead of upgrading to Windows 7, they should purchase a Mac. Click here to see their latest set of ads.

If Apple has gone vicious (to the extent of frankly aiming to ridicule Microsoft), this is no doubt because everybody realizes that Windows 7 could in fact turn out to be a great operating system. So, Apple is in a now-or-never situation. In any case, it will be interesting to see if there's a massive move to Macs.

In this eternal PC/Mac conflict (where, thankfully, no soldiers or civilians appear to be getting killed), there's a gigantic gorilla in the living room, which people often refrain from mentioning, as if the beast were not really there. Delegates from both camps talk endlessly about the intrinsic merits of their system, and the weaknesses of the opposition. But the BIG reason why an individual hesitates before moving, say, from a PC to a Mac is the obvious fact that he/she has purchased a lot of software tools, and that it would be painful to have to replace all that stuff.

If you're a home-owner thinking about moving, say, from Choranche to Bergues, you can normally sell your old place at Choranche and look around for equivalent accommodation in the charming countryside in the vicinity of Bergues, or maybe (for adepts of nightlife) within the exciting township itself.

Sadly, in the case of moving from a PC to a Mac, there's no obvious way of selling your old software and using the financial resources to purchase new Mac stuff. It's a variation on that old story—which I've been telling in one way or another for the last four decades—about the specificity of information: the fact that you can give it away to friends, but you still keep it. In harsh economic terms, there's no way in the world that you can sell old software to buy new stuff. It's not even a biblical matter of putting new wine into old bottles. The simple fact is that the old software is obsolete: antiquated worthless shit. In the world of information and computers, before people can move readily from A to B, a revised science of economics needs to emerge.

Friday, July 24, 2009

Thanks for the memory

A few years ago, the price of hard-disk drives started to drop, while their capacities jumped exponentially. That's why I've got three little beasts like this sitting behind my iMac, for random backups (alongside my Time Machine backup, which runs permanently):

Like most computer users, I simply greeted this hardware affair with pleasure, without ever bothering to think about the reasons for the hugely positive evolution of the economics of hard-disk backups. I imagined naively that manufacturers had simply improved their production methods in such a way that prices could be slashed while the drives themselves could have increased storage capacity. In fact, the explanations are considerably more complicated than that. These days, customers have indeed been reaping vast benefits from the development and commercial availability of entirely new storage technologies... and things are still getting better all the time.

In a recent issue of Scientific American (the only paper publication to which I subscribe), there's a splendid article on this subject written by an English physicist named Stuart Parkin, who has been working in California on the invention of astounding new storage technologies. [Click the photo to access the Wikipedia article on this man.] In the context of my blog, I cannot of course attempt to explain the nature of the complex technologies at the origin of our low-cost high-capacity hard-disk drives. But I can't resist the temptation of quoting an amazing item of information provided by Parkin in his Scientific American article:

Today the collective storage capacity of all hard-disk drives manufactured in one month exceeds 200 exabytes, or 2 × 10²⁰ bytes—enough to store all the extant analog data in the world, that is, all the data on paper, film and videotape.

You might ask: Who is actually purchasing this astronomical quantity of storage potential? And what are all these storage devices being used for? Those are good questions, which I'm incapable of answering. It can't all be Google...

Thursday, April 2, 2009

Steel, nutwood and stone

I've put a protective coat of anti-rust product on the steel carcass of my recently-constructed iDesk, and polished the walnut slabs with lovely-smelling wax.

My neighbor Bob was impressed by my furniture design, but he considers that the wheels detract from the "nobility" of the steel and the walnut. When I talked to the wood supplier about the idea of marketing my iDesk model, he said that customers ask him to build computer desks with a means of hiding cables. That request surprises me, for modern wifi computers don't have too many dangling cables.

Saturday, March 28, 2009

Steelnut desks for computer users

Click the image on the left to see a larger version of the Steelnut ad.

I'm proud of my iDesk line of Steelnut furniture, "designed and manufactured by skilled Dauphiné craftsmen".

The small iDesk shown in the poster is my recently-designed Blogger model (of which I took delivery only this morning). At Gamone, my main iMac sits on a much bigger iDesk: the original Webmaster model. I also designed a lightweight iDesk that I refer to as the Browser model, which I use as a bedside table.

Steelnut furniture is supplied in an unfinished form. That is, the steel tubes are fresh out of the workshop, and need to be treated with some kind of anti-rust product, while the walnut slabs should be polished with wax. Steelnut products are made to order, and prices are very reasonable. Once an order is placed with the craftsmen, an iDesk is manufactured within about a week.

You will have guessed that Steelnut is a figment of my imagination. Its products exist only in my house at Gamone. But I'm convinced that many computer users might be interested in this low-cost approach to heavy desks and tables of a rugged and rigid nature.

POST SCRIPTUM: Webmaster iDesk in a working environment: