Google doodle honors computing pioneer

December 9, 2013 • 1:56 pm

Today’s Google Doodle honors Grace Hopper, the pioneering computer programmer who would have been 107 today had she lived (she died in 1992 at age 85). The Doodle is animated to show a big, bulky early computer calculating her age:

As Engadget notes:

Prior to her work, computers were considered to be glorified calculators and were programmed with binary machine code, which kept the field limited to specialists. After working on computer tech used on the Manhattan Project during World War II, she developed the A-0 system for the UNIVAC I in 1951, which is considered to be the first-ever computer compiler. That eventually formed the basis for COBOL, the first widely used English-like programming language, which laid the foundation for most computer languages today. Hopper did further research for the Navy until the age of 79 (when she retired with the rank of rear admiral) and worked for DEC until she passed away in 1992 at the age of 85.

Now I’m not a computer geek, and haven’t written a line of code in my life, but I suspect many of you will appreciate her achievements.

Here she is on Letterman’s show close to age 80:

And Wikipedia notes that

“At the time of her retirement, she was the oldest active-duty commissioned officer [a rear admiral] in the United States Navy (79 years, eight months and five days), and aboard the oldest commissioned ship in the United States Navy (188 years, nine months and 23 days)” . . . The U.S. Navy destroyer USS Hopper (DDG-70) is named for her, as was the Cray XE6 “Hopper” supercomputer at NERSC.

Rear Admiral Hopper

35 thoughts on “Google doodle honors computing pioneer”

  1. I met Grace Hopper once in 1969, when I was fresh out of college and working at Univac. She was a pioneer. Many of you probably know that another woman, Ada Lovelace, was an even earlier pioneer of computing.

  2. You have to appreciate the mental fortitude these women must’ve had to be doing this work back then, and, in Hopper’s case, to be doing it in the Navy!

  3. Note the moth coming out at the end! This is allegedly the origin of the term ‘bug’ in computerese, as she felt that the lepidopteran was the source of an error. ‘Bug’? Hopper was smart, but she was no taxonomist!

    1. It’s been pointed out that people as far back as Edison were talking about bugs. I think Hopper was just making a joke: look, an actual insect!

  4. Nice post.

    I used to work on a UNIVAC computer too, back in the day, then changed jobs to work on IBM. Did the COBOL and FORTRAN and all that too, and still chuckle when non-techies say ‘Cobalt’ instead of COBOL.

    1. Never heard ‘Cobalt’, LOL! Wasn’t COBOL a clunky language? Is it used at all anymore? I remember thinking during the Y2K pseudo-scare what a mess of code it was to clean up.

      1. You’d be amazed at how many lines of COBOL are still being used. Indeed, there’s an excellent chance that your paycheck was calculated at least in part with a COBOL program. Same with the interest your bank pays you.

        I seem to recall that there’re certain sciences where COBOL is still the preferred language because of both speed and available libraries.

        Cheers,

        b&

      2. COBOL was very clunky and wordy. I’d guess there might be old code still in use, somewhere, somehow, as some major systems were developed using it. Maybe someone else could clarify this.

        The Y2K thing was a mess: back in the old days, IT departments had to scrimp on every bit and byte, and dates were often encoded as YYMMDD instead of YYYYMMDD. Oh, the woe to follow!
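
        A quick sketch of the ambiguity that created, in C rather than COBOL just to keep it short (the cutoff below is only one common “windowing” convention, not anything standard): once the century is thrown away, every program reading those YYMMDD records has to guess it back.

        #include <stdio.h>

        /* Hypothetical YYMMDD records with the century dropped, the way
         * old systems stored them to save space.                        */
        static int expand_two_digit_year(int yy)
        {
            /* Guess the century with a pivot: 00-39 -> 20xx, 40-99 -> 19xx.
             * This is the sort of "windowing" patch applied during Y2K
             * remediation -- it only postpones the ambiguity.              */
            return (yy < 40) ? 2000 + yy : 1900 + yy;
        }

        int main(void)
        {
            int records[] = { 611231, 991231, 50101 };   /* YYMMDD values */
            for (int i = 0; i < 3; i++) {
                int yy = records[i] / 10000;
                int mm = (records[i] / 100) % 100;
                int dd = records[i] % 100;
                printf("%06d -> %04d-%02d-%02d\n",
                       records[i], expand_two_digit_year(yy), mm, dd);
            }
            return 0;
        }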

        1. The Canada Revenue Agency’s main IT systems are all written in COBOL and that won’t change any time soon. Trying to modernize huge legacy systems is a real pain.

          1. There’s also the question of why one would attempt the project in the first place. If it’s just newness for the sake of newness, you’re guaranteed to have a disaster on your hands.

            There’s nothing wrong with COBOL. Sure, if you were going to start something from scratch, you’d use something more modern, but the same could be said of anything old still in use, from Roman aqueducts to calendars to monumental buildings.

            And it’s not like tax codes have changed substantially in the past few decades; they’ve only been tweaked a bit here and there — and always incrementally.

            There’s no performance problem with these sorts of applications that can’t be solved far more easily and cheaply by throwing more hardware at it.

            There’s no developer problem, either: anybody professional enough to write code for such a system in the first place is professional enough to write good code in any language, even one she’s not yet familiar with. Give her some books and (most especially) a fully functioning playground to screw up in, a few weeks to get from “Hello, world!” to a simple Web browser, and then pair her with somebody who knows the particular codebase for her first several assignments, and all is well.

            Can’t think of any other valid reasons for “upgrading.”

            Cheers,

            b&

          2. Yes, but it’s still the same class of problem. It’s not like you need calculus to figure out your taxes, for example, even if it might seem like it….

            b&

      3. I know a guy who has made a small fortune by knowing how to code Cobol. He travels around the country, putting out fires where companies use old legacy Cobol programs. That said, technology has passed Cobol by.

        I’ve written hundreds of thousands of lines of code in various computer languages, from assembly, to Fortran, to Lisp, to C++. IMHO, nothing surpasses C for clarity, simplicity, flexibility, and efficiency.

        1. It’s entirely a matter of the right tool for the right job. You most emphatically don’t want to be using C to query relational databases, and it really, really sucks for Web applications, too. You probably (but not absolutely) don’t want to be using C for embedded 8-bit processors, either.

          But operating systems, device drivers, and the like? C (and its close relations) still rules the roost in that domain, and rightly so.

          Cheers,

          b&

          1. Obviously, if you’re building a web app you won’t use C. But underneath the hood, where the rubber meets the road (to stretch a metaphor), you’ll find C or a variant. It’s the clearest middle ground between what a coder thinks and what a computer does.

          2. It’s the clearest middle ground between what a coder thinks and what a computer does.

            Eh, not at all.

            When you’re querying a database, you’re thinking in set theory. 90%+ of the SQL I write uses a subset of the language that is not Turing-complete. Indeed, using those bits of the language that make it Turing-complete is almost always a sign of poorly-written (or, at least, generally sub-optimal) code.

            And what the computers actually do is raise and lower voltages on pieces of silicon. C is about as far removed from that these days as SQL is from C. First, there’re all the libraries you’re using, standard or otherwise. Next, your code gets run through the pre-processor; the output of that goes to the compiler, which emits assembly; and the assembler turns that into what’s called machine language. Even that’s something of an anachronism, though, because x86 chips with their CISC architecture internally take those instructions and translate — that is, recompile — them into their own simpler, heavily optimized micro-operations before finally fiddling with the voltages. (Though, of course, all of these processes are, themselves, ultimately nothing more than exercises in voltage-fiddling.)

            When you create a pointer in C, you have no clue whatsoever what physical transistors on the chip will be responsible for storing the values referenced by said pointer. Hell, in practice, you generally don’t even know (nor care) if that’ll be stored in RAM, in any of three or four levels of cache, on disk in virtual memory, in the disk’s own cache, or all of the above — and that’s all stuff that you could, at least on some operating systems, theoretically maybe have some level of control over when using C.
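
            A tiny sketch of that point (nothing here is specific to any particular OS or compiler): all the program ever sees is a virtual address, and where the bytes physically live at any given instant is the kernel’s and the hardware’s business.

            #include <stdio.h>
            #include <stdlib.h>

            int main(void)
            {
                /* Ask for some storage; all we get back is a virtual address. */
                int *p = malloc(sizeof *p);
                if (p == NULL)
                    return 1;

                *p = 42;

                /* %p prints a virtual address.  Whether the bytes holding 42
                 * currently sit in a DRAM cell, in L1/L2/L3 cache, or paged
                 * out to disk is invisible from here -- and can change at any
                 * moment.                                                     */
                printf("value %d lives at virtual address %p\n", *p, (void *)p);

                free(p);
                return 0;
            }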

            Again, don’t get me wrong: C is an awesome language for those significant domains where it truly shines. But, these days, those domains, as important as they are, are a relatively small (though essential) niche in the overall scheme of things, and C sits solidly in the middle of the stack of conceptual abstractions in modern computing.

            Cheers,

            b&

        2. I remember having to run LISP programs on paper tape back in the early 70s… Algol and PL/I were great, but the one course I was forced to take in COBOL was a nightmare, maybe partly because our text was simply the IBM manual. Clunk all the way down…

          1. I wrote a LISP interpreter in assembly for the PDP-11 back in the 70s. Backup was paper tape. Those were the days. In the 90s, I used LISP a lot on LISP Machine workstations and the Connection Machine parallel supercomputer.

            Regarding C, it’s closer to the hardware than most other languages, and certainly closer than something like SQL. If the coder has a good model of the target hardware it’s possible to do optimizations that aren’t possible with higher level languages. This is especially important when efficiency is a prime motivation, such as applications for massively parallel supercomputers (a field I worked in for years).
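
            A small, made-up illustration of the kind of hardware-aware choice C exposes: both loops below add up the same numbers, but the first walks memory in the order C lays it out and so plays nicely with the cache, while the second strides across rows and typically runs several times slower on big arrays.

            #include <stdio.h>

            #define N 1024

            static double a[N][N];   /* zero-initialized, roughly 8 MB */

            int main(void)
            {
                double sum = 0.0;

                /* Cache-friendly: C stores a[i][j] row by row, so this
                 * traversal touches memory sequentially.                */
                for (int i = 0; i < N; i++)
                    for (int j = 0; j < N; j++)
                        sum += a[i][j];

                /* Same arithmetic, but striding down columns: each access
                 * jumps N * sizeof(double) bytes, defeating the cache.    */
                for (int j = 0; j < N; j++)
                    for (int i = 0; i < N; i++)
                        sum += a[i][j];

                printf("sum = %f\n", sum);
                return 0;
            }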

            If you use SQL it’s likely that the compiler is written in C or a variant.

          2. “Efficiency” is a tricky term.

            In the overwhelming majority of programming jobs, a single line of code that replaces an hundred lines (ultimately) of assembler is powerfully more efficient than the assembler version, even if the assembler runs an hundred times faster.

            The efficiency being gained is programmer efficiency, which is generally much, much, much more expensive than computer efficiency these days.

            Of course, a big part of the reason why that’s the case is because a small fraction of the number of programmers have improved the efficiency of the higher-level languages (and the hardware) to the point that the scales have tipped in this way, and a non-trivial portion of that code was (wisely) written in C or one of its close cousins.

            Far too many programmers optimize far too soon. Code should be written in the simplest, clearest, shortest way possible — and with not your own super-clever brain in mind, but rather the brain of the two-banana code monkey who’ll be revising it three years down the road. (And, incidentally, writing code like that and writing it well is much harder than writing “sophisticated” code.)

            If that’s too slow, see if it’s practical to throw more hardware at it; if so, spending a few thousand dollars on a server upgrade will be cheaper than a week of developer-hours to optimize the code — especially since said hardware will support similar applications as well, and they won’t need all those extra hours to develop with special optimizations, either. If not, then and only then do you start down the dance of picking off the low-hanging fruit, then the slowest bits that get called the most, and so on.
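
            A toy example of the “clear over clever” point (invented for this comment, not lifted from any real codebase): both functions swap two integers, but only one of them will still make sense to that two-banana code monkey three years from now, and a modern compiler generates equally good machine code for either.

            #include <stdio.h>

            /* "Clever": the XOR-swap trick.  Saves a temporary, reads badly,
             * and silently zeroes both values if the two pointers alias the
             * same object.  It buys nothing on a modern compiler.           */
            static void swap_clever(int *x, int *y)
            {
                *x ^= *y;
                *y ^= *x;
                *x ^= *y;
            }

            /* Clear: says exactly what it does.  Optimize it later, if a
             * profiler ever fingers it as the bottleneck (it won't).       */
            static void swap_clear(int *x, int *y)
            {
                int tmp = *x;
                *x = *y;
                *y = tmp;
            }

            int main(void)
            {
                int a = 1, b = 2;
                swap_clever(&a, &b);
                swap_clear(&a, &b);
                printf("a=%d b=%d\n", a, b);   /* both swaps applied: a=1 b=2 */
                return 0;
            }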

            (Of course, if you have the luxury of actually engineering your code — a luxury that damned few shops have — then you’ll know before you start writing the code which parts will need optimizing. But few managers in the real world have the patience for that type of development model; they want it NOW! NOW! NOW! NOW! NOW! NOW! NOW! NOW! NOW! and would, for whatever incomprehensible reason, rather shove something out the door now and fix whatever breaks than get it right in the first place.)

            Cheers,

            b&

      4. Clunky? Not at all for certain types of problems. The Y2K scare was not the fault of COBOL. It was because space to store data was expensive, so programmers did things like drop the century portion of a year (e.g., 1961 => 61) when saving the data and assume the century when reading it back. They assumed (wrongly) that the programs and data they were responsible for would no longer be in use when the calendar rolled over from 19xx to 20xx. A few decades later we’re faced with a similar issue involving the UNIX timestamp, which overflows in January 2038 on systems that store it as a signed 32-bit value. And such systems will definitely still be in use when that date rolls around. The question is whether any of them will be performing critical operations that depend on those timestamps.
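
        A minimal C sketch of the rollover itself, using a plain int32_t rather than any platform’s actual time_t: a signed 32-bit count of seconds since 1970 runs out on 19 January 2038, after which it wraps to a large negative number, i.e. a date back in December 1901.

        #include <inttypes.h>
        #include <stdio.h>

        int main(void)
        {
            /* Seconds since the Unix epoch, stored the old way: signed 32-bit. */
            int32_t last = INT32_MAX;                /* 2038-01-19 03:14:07 UTC */

            /* Add one second, wrapping the way two's-complement hardware does.
             * (The unsigned detour sidesteps signed-overflow undefined
             * behaviour in C.)                                                 */
            int32_t next = (int32_t)((uint32_t)last + 1u);

            printf("last 32-bit second: %" PRId32 "\n", last);   /*  2147483647 */
            printf("one tick later:     %" PRId32 "\n", next);   /* -2147483648 */

            /* A program still treating that as seconds-since-1970 now thinks
             * it's 1901-12-13.  A 64-bit time_t pushes the limit out by
             * roughly 292 billion years.                                      */
            return 0;
        }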

        1. The Unix epoch thing will be a non-event. Most code makes library calls for date functions, and those libraries have already been updated with 64-bit time_t implementations. No competent developer has used 32-bit time_t for a roll-your-own solution in the past fifteen years. Damned few embedded systems have lifetimes that long; the hardware they’re embedded in will be replaced long before then. And anything big and long-lived was either fixed as part of Y2K or they already know it’ll fail and have planned a fix into the product upgrade lifecycle. A decade and a half after Y2K, you can bet that they’ve already long since fixed it — especially since, for example, 30-year mortgages would already have thrown hissy fits five years ago.

          Instead, it’ll be a wild-n-wacky news item, a few geeks will have parties to watch the odometer tick over, and that’ll be it.

          Cheers,

          b&

      5. My sister’s first job after being a stay-at-home mum was maintaining COBOL code for a wholesale company in the mid-1990s. It didn’t look like it was going out of fashion then.
        Quick check – is there an open-source COBOL compiler? You have a choice of OpenCOBOL, GNU COBOL and Tiny COBOL before you even get to the Wikipedia articles.

  5. “Navy destroyer USS Hopper (DDG-70) is named for her, as was the Cray XE6 “Hopper” supercomputer at NERSC”

    ’nuff said 😀

  6. In the video she mentioned living in Arlington, VA. I didn’t know it at the time, but, for about 5 years, I lived in the same apartment complex she did, and there is now a little park named after her, Grace Murray Hopper Park, in front of the apartment building she had lived in. Quite a few military officers lived in that complex because of its proximity to the Pentagon.

  7. Grace Hopper was shown on a 60 Minutes interview at a time when appearing on 60 Minutes meant you were dead within weeks (60 Minutes had a run of these about then). Then, later, they’d re-show the Grace Hopper interview (they did this 3 or 4 times). The wife said, “Are they _still_ trying to kill her off?”

  8. As a survivor of Y2K I can tell you all that modifying the COBOL code was a breeze compared to the (fortunately few) assembler subroutines. In any case, COBOL is not the foundation for any other ‘modern’ languages that I know of; they all seem to owe more of a debt to ALGOL than to anything else. But WTH, any language is only an interface for programmers who can’t speak machine language, and the easier that interface is to use, the better. Given a choice between a modern COBOL version and C++, I know which I’d prefer.

    Regarding the “missing period” problems in COBOL, I always wondered: if the compiler could tell you had left out a period, why couldn’t it insert it for you?
