My history with computers, Part 3

Old computers sometimes had a “Turbo” or “Boost” button to manually switch to a higher clock speed. Toggling this on and off could count as a valid game-playing strategy, if you needed to “speed past” obstacles, etc. Yes, it’s just as bizarre as it sounds.

Context

In the last decade and a half, roughly, we've gotten used to a lot of niceties in our operating systems: smooth integration between different devices, nifty apps, wonderful cameras, and more — but not increases in speed.

It is hard to convey how different this aspect was in the 90s. Every year, sometimes twice a year, there were glossy magazine advertisements about faster computers.

A new computer

So around 1998, it was possible to buy a new computer with a CPU rated at 233 MHz. Two hundred thirty-three megahertz. It also had a fancy new operating system, the just-released Microsoft Windows 98 (ooh 😐).

There was an actual sound card (something that isn’t thought of as a “pluggable thing” any more), which meant it was possible to get speakers to play actual sound (today if you buy speakers, it’s as a part of your room, not as a part of your computer).

The display (or “monitor”; heh, no one uses that word any more) had color, and there was a mouse that could be plugged in, and this computer didn’t just have a floppy drive, but a new optical media, the CD-ROM.

Aside: relative speed evolution

The first computer at our home, mentioned in the earlier post (late 1994), had a CPU with a clock speed of 33 MHz. Thirty-three megahertz (this seemed huge to me at the time: “so many calculations in a single second!”).

My first ever personal desktop (mid-2002, more on this later) had a single-core CPU with a clock speed of 1 GHz. One thousand megahertz, or a 30x increase.

My first MacBook (mid-2008) had a dual-core CPU rated at 2.0 GHz. Two thousand megahertz, or a 2x increase.

My current MacBook Pro (mid-2019) has an 8-core CPU rated at 2.3 GHz. Two thousand three hundred megahertz.

My iPhone (early 2018) uses the “A11 Bionic”, with a maximum clock rate of 2.39 GHz. Two thousand three hundred ninety megahertz.

You can imagine the graph in your head.

Programming

QBasic was gone, to be replaced with … Visual Basic. This allowed a lot of experimentation with simple forms, but I didn’t really have any ideas on what to do with it, so I let it lapse.

There was also Turbo C, which, despite the name, was a reasonably popular development environment (from Borland, which is not a name most would recognize today, but at the time it was … like JetBrains plus Visual Studio, and more). There weren't a lot of materials to learn from, though I remember at least being able to copy in a few examples, and so on.

Still later on, around 2000-ish, I got some game programming books, and really liked learning from them, since it was very straightforward to build something with DirectX (never mind) in C.

Nothing comparable to the vast tools and materials available to kids these days, but … good times.

Apps

There were a whole bunch of computer magazines that came with CDs containing free trials of all sorts of stuff, and it was something to look forward to every month — trying out whatever was new that month: install it, fool around with it, then delete it.

I wish I had pictures or notes or anything, but I don’t, so this vagueness will have to do.

I do remember the first time I used Microsoft Flight Simulator (which, btw, is making a big comeback). Even with that relatively poor graphical resolution, a 3-d experience of this sort was magical.

Something else that stands out: Microsoft Encarta. It was one of the first digital encyclopedias, and they did a really good job of it. There were audio and video clips, and lots of articles to read and jump between.

The pros and cons versus paper should've been apparent even then: the content was beautiful, though I can't imagine someone spending hours and hours interacting with Encarta the way I can imagine someone spending that time with a paper version (but maybe that's just me).

Aside: the time of Microsoft

In case it isn’t obvious: yes, this was the decade of Microsoft domination, something that people have no gut feeling for anymore — but twenty years ago, before big-Google, big-Facebook, big-Amazon, big-Twitter, big-Netflix, big-Apple, there was only big-Microsoft.

Games

This was the highlight of my time with the machine 🙂

FPS

First of all, I finally had something to play Quake with (the minimum RAM requirement was 8MB; our earlier computer had 4MB, while this one had 64MB. As a fun exercise, try to find out how much a single tab in your browser is using right now).

Quake was made by the same company (id Software) that made Wolfenstein (which we had played so much of on our earlier computer) and Doom (which I missed out on for whatever reason). Again, this is something hard to convey now, but these were iconic first-person shooter games, the very first ones, in fact … which is probably why they were so popular, even though they seem quite plain by today's standards.

Anyway, Quake was just the beginning. This machine was in a sweet spot to play most of whatever came out, and the free apps on the CDs in the monthly computer magazines were usually games.

Aside: single-player gaming

Although much in games remains the same over the decades (apart from the massive improvement in their visual appearance), something that is very different is the experience of the “un-connected”, solitary game.

Most games today involve other players, either directly or indirectly (through comparison on a leaderboard, etc.). I think the only equivalents of “playing something alone, immersed in the world” are certain mobile games, like Monument Valley, where you own the game, you play the game, and no one else really knows how you played; the experience is yours alone.

Early on, everything was like this (although it was quite common for friends to sit alongside you as you played, so there's that).

RPG

Just as id Software dominated the first-person shooter genre in the first part of the decade, another company, Blizzard Entertainment (of the two, still going strong!), dominated role-playing and strategy games.

All that’s needed to convey this are a few names: Diablo, Starcraft, Warcraft, each of which I probably spent hundreds and hundreds of hours on.

It's worth mentioning that there was a lot of competition early on, and the reason these titles stand out is that they balanced the many factors in such games very well, getting the details very, very right.

Note 1: If I had to pick a favorite, it would be Starcraft.

Note 2: But more on all this some other time, especially an account of one game that was insignificant but that I liked: Microsoft Urban Assault.

Aside: storage media

Going from a floppy disk that stored 1.44MB to a CD-ROM that stored 650MB was a big change, one that really opened up a whole variety of new, rich content.

DVDs and Blu-Rays went an order of magnitude higher each, but have been used for richer and more detailed versions of existing content rather than newer kinds of content (in my opinion).

There were other stops along the way, and not just alternatives like the HD-DVD that no one remembers. For a while it was quite common to have a “Zip drive”, awkwardly positioned between a floppy and a CD.

(Of course, a new laptop today has none of these.)

Aside: man and machine

I should point out something: I had a certain sort of … affection … for the first computer we had (I remember being upset and crying once (embarrassing, right) when it didn’t start and appeared to be broken), in a way that I didn’t have for the second one (which was “just a machine”), or any of the countless ones (laptops, desktops, tablets, phones, watches, appliances) since.

It might be a pets vs cattle thing, dunno.

Transition

I haven’t really thought through the episodic nature of this series, which means there isn’t any plan of having “equal chunks”. But yes, we’ll plod along steadily. Next time: the internet (!)

My history with computers, Part 2: “Mid 90s”

Context

Picking up where I left off last time: we'd gotten our first new computer, the first set of simple games, and a first operating system (ye olde DOS).

QBasic

As I've mentioned, the only programming environment, interface, tool, or editor I knew about or used was the version of QBasic that came bundled with MS-DOS.

This might sound pathetic now, but felt very cool to me back then. I hadn’t experienced “programmable calculators”, so this was also the only “programmable thing”.

This beige box was the only thing around that could compute, at all. That sounds redundant now that small computers are everywhere, but it's hard to give an idea of how unique one of these was.

(Like this, except in black and white)

Everything was one giant file, with jumps all over the place. Incredibly messy, and IMO a terrible way to learn how to write anything (so much to unlearn later, sigh). But still, a great way to get started MAKING stuff.

Using it

Just to give an idea, here’s how a sample interaction might go (say I wanted to make some random patterns or drawing, in my “computer time” that day):

  • The computer is off; I turn it on (the idea of leaving it on all the time would have been crazy!)
  • It boots into the C:\> prompt, pretty quickly (no logins, single-user!)
  • I run QBASIC, and see the screen above
  • I write some small fragment like

    SCREEN 1
    LINE (35, 50)-(100, 150), , B

  • I hit Run, and see a small rectangle (in the beginning, coming from LOGO, this was most of what I did)
  • I press a key, am back in the editor, make some changes, repeat.

Game programming books

The QBasic installation came bundled with an impressive game that suggested a lot was possible (the code was very spaghetti-fied, but relatively okay, I suppose).

(Like this, except in black and white)

At the time there were also a lot of books with games — by which I mean they had programs that you could type out and run (remember how I said everything was “just one large file”?).

I was fortunate my mother could bring these from the library of the school she worked at, and I learnt a lot (okay, questionable, but it definitely felt good) from reading through them.

I was also fortunate that my younger brother (with great patience!) would read aloud each line for me to type in, so we could play the game later.

One of these was a Star Trek game, the longest of the bunch, that we painstakingly typed in over several days, slowly making progress page by page (sounds ridiculous as I type this, but … yeah, you had to be there), and inevitably I must have made some small typo somewhere, that was then … impossible to track down, so we were quite dejected when it didn’t work.

However, the opening theme did play, with its signature tune, and that felt good. I should note that there were no speakers, so there was no real sound, just the internal beeper driven at different frequencies.

I did try to read the source code, and it was essentially a “first RPG”, with an energy level that got depleted by actions, movement in a grid in space, ability to fire phasers and photon torpedoes, all that good stuff.

(I googled it, and … of course there’s a Youtube video for this: https://www.youtube.com/watch?v=gLKw4AU4KHU)

Windows

I had seen Windows machines at school, when I finally got a chance to play with one of the two computers that had a mouse. All I did was use MS Paint, because moving the cursor, pointing and clicking, and seeing dots of color appear was such a novel experience!

Finally, one day, my dad brought a stack of floppies (because that’s how anything got installed) for Windows 3.11. It required baby-sitting the whole install experience, removing and inserting each floppy in the precise order in which they were labelled.

GUI

Now, after starting up the computer, at the C:\> prompt, it was possible to run win, and then see a brief splash screen.

(Like this, except in black and white)

After which there would be, well, windows on the screen.

(Again: this, except in black and white)

Things were getting exciting.

Remember though, still no mouse (that would come a few months later). So we got really good at keyboard shortcuts for selecting windows, moving, resizing, whatever.

Apps

A big boost came from getting (in another long bunch of floppies) Microsoft Office (!)

Each app took about a minute to load, but we could now use (still one at a time) Word, Excel, PowerPoint, and Access!

I remember Access had a dialect called Access Basic, which I read the manual for, tried to use, and failed. I wanted to make a “home library system”, but spent all my time prettifying the frontend and never quite got the flow of entering books and looking them up to work properly.

I vaguely remember repeating this painful install process a couple times, and using Norton Disk Doctor and a bunch of other tools that simply have no analogue today.

Games

Bundled games included Minesweeper and Solitaire, though I never quite liked them all that much.

At this time, Windows was very much (until Windows 95, I think) a “shell within DOS”, so it was quite normal to play some games within Windows, and to exit Windows and play other games within DOS.

As far as I can remember (again, I wish I had written something down), there were better games in DOS, especially the ones my brother got from his friends.

One game stands out: Wolfenstein. Again, this was black-and-white without sound, but … it was the first-ever FPS. Let me repeat that: the first-ever first-person shooter (for me, at least). All I had seen were flat, 2-d games, maybe a few platformers, and … here was something totally different.

(Again, like this, except black-and-white, and no sound)

In today's world of bloated “built on Electron” apps, it's nearly impossible to appreciate the skill that went into creating an experience like this on a computing platform as limited as the one we had.

I do remember a visual Chess game on Windows, perhaps Chessmaster?

Transition

Time to stop again, so I can come back and write again later. Next time: an upgrade.

My history with computers, Part 1: “Early 90s”

An early computer using the Intel Pentium

Context

As hinted at in a previous post, I thought I’d go over a history of my interaction with computers. This was … harder than I thought, mostly because I barely remember anything. If only I had pictures, notes or journals, sigh (so, I picked a generically representative image here above). Still, it’s a useful exercise to try to recount all this, so I will do my bit.

Chronologically, this post is set between roughly 1994 and 1997.

Images

Before I physically saw a computer or used one, I knew of its existence through magazines, and reference books1.

I do remember one occasion when someone I knew bought a big, expensive computer for their home; I got to see it, and was very impressed by the (at the time very, very novel) color graphics display and mouse.

School

I was fortunate to have a “computer lab” at my school. It was populated by what would today be utter relics, not notable enough to feature even in a museum2.

Yet with no context, and not having ever physically touched anything else, they were, of course, quite marvelous to me. They were oddballs even then, one-offs — and they had to be, because hardware was incredibly expensive then! — but I do recall a good number of them being BBC Micros3 (and there might have been a solitary Sinclair4).

A few features to note here, common to each:

  • a floppy drive, the sole mode of connection to the outer world (no network of any sort), and also the sole means of storage (yes, no hard drives either!)
  • Basic5 as the sole programming language (okay, it was Logo before that for a while)
  • black-and-white raster graphics on a roughly 14-inch screen. Yep.

Thinking back, I can accept the floppy drive (noisy and slow, and this was the older 5-1/4” jumbo drive, btw), but thinking of how anything to do with “real programming” was limited to Basic makes me tear my hair out. It made it so hard to imagine how anything else was made.

Home

It was a big deal, then, when we got a computer of our own at home. It was very expensive at the time, and I'm fortunate to have parents who spent money on this as opposed to almost anything else for themselves.

So one day we had a shiny6 new 386 7.

It had 4MB of RAM8, and a 256MB hard drive9, along with a 14-inch black-and-white raster display.

It ran MS-DOS 6.2210 and came pre-installed with … yes, Basic11.

Over time, we got “office suite” applications, and I felt very accomplished as I learnt Lotus 1-2-312, DBase 413 and WordStar14.

There were early games (e.g. Arkanoid15, Dave16) at this point, which my brother and I played in isolation, though later on he brought cooler games (e.g. Commander Keen17) from his friends.

Transition

I’m going to stop here, because I could go on and on otherwise, but also because this was fun to write and if I actually “get this out”, I can actually write “the rest” too.


  1. Which were pretty good for the time, btw, remember this was before the widespread advent of “the web”, e.g. this Time-Life series ↩︎
  2. I’m thinking, for example, of the Computer History Museum ↩︎
  3. I think of them as the big hulking Raspberry Pis of that era ↩︎
  4. maybe this one ↩︎
  5. Or rather, GW-BASIC ↩︎
  6. I used to keep taking off and putting on the dust cover on it. Seriously. ↩︎
  7. At the time, I remember the “range” of computers was roughly defined by Intel chip generations. So: 186, 286, 386, 486, and 586, a.k.a. the “Pentium”. This sounds silly now, the equivalent of people deciding whether to buy a Kaby, Coffee, Comet, or Cooper Lake today ↩︎
  8. I remember thinking, wow, 4 million sounds so big! ↩︎
  9. A big upgrade from the floppy-disk-only machines I had seen earlier ↩︎
  10. I remember the version because I read the manual (cringe) front-to-back a couple of times ↩︎
  11. Or rather, QBASIC ↩︎
  12. it was an early instance of what is today called a “killer app” ↩︎
  13. I found an old manual (!) that shows what it looked like … and I was fascinated/horrified to see that its newsgroups are still active ↩︎
  14. Hey, R. R. Martin still uses it ↩︎
  15. I always felt this was a crazy name for a glorified pinball machine ↩︎
  16. you can play this in your browser today! ↩︎
  17. I don’t know which version, maybe this one ↩︎

Monthly Curations: November 2019

What the Apollo Guidance Computer looked like

The “trinity of computation”

I never thought of it this way

Logic tells us what propositions exist (what sorts of thoughts we wish to express) and what constitutes a proof (how we can communicate our thoughts to others). Languages (in the sense of programming) tells us what types exist (what computational phenomena we wish to express) and what constitutes a program (how we can give rise to that phenomenon). Categories tell us what structures exist (what mathematical models we have to work with) and what constitutes a mapping between them (how they relate to one another). In this sense all three have ontological force; they codify what is, not how to describe what is already given to us.

In this sense they are foundational; if we suppose that they are merely descriptive, we would be left with the question of where these previously given concepts arise, leading us back again to foundations.

Memristance is futile. Not.

I came across this Wired article recently, and what I read sounded too science-fiction-y to be true, so I decided to go to the source, and found this video (see below) by a researcher at HP, and it turns out to be both true and “science-fiction-y”.

We are used to thinking in terms of the standard circuit elements — resistors, capacitors, inductors. The first establishes a relationship between voltage and current, the second between voltage and charge, and the third between magnetic flux and current.

Now it never occurred to me to really think about it this way (it’s one of those things that’s only obvious in hindsight), but there is a missing piece of symmetry here.

Look at that list again, and it might jump out at you that among current, voltage, charge and magnetic flux, they’re related in pairs to each other, with the exception of charge and magnetic flux. Seeing this now, it might be reasonable to speculate on another circuit element that should do precisely that. And indeed someone did, about forty years ago, and named the missing piece the memristor.
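Restating the standard textbook relations in symbols (my notation, not necessarily the video's): each element links two of the four quantities via a differential relation, and the memristor fills the one missing pairing:

```latex
% v: voltage, i: current, q: charge, \varphi: magnetic flux
\begin{align*}
  \text{Resistor:}  \quad dv       &= R \, di \\
  \text{Capacitor:} \quad dq       &= C \, dv \\
  \text{Inductor:}  \quad d\varphi &= L \, di \\
  \text{Memristor:} \quad d\varphi &= M \, dq \qquad \text{(the missing pairing)}
\end{align*}
```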

Now, I should acknowledge that there is a bit of controversy about whether what HP Labs claims to have discovered really matches up with this idea, so we'll just have to wait a few years to test these claims, since the first commercial applications of this technology won't be out for another five years at least.

But let's continue. One of the observations made in the video linked above is that memristance obeys an inverse square law: the tinier the dimensions, the greater the observed effect. This also means it's something that belongs purely inside a chip, not something you'd be putting on a breadboard any time soon.

The most exciting property, though, is that its behavior in the future depends on its past. So it is both a logic component and a storage component. You could build a dense cluster of these things and determine which parts perform which function, in a configurable way, much like an FPGA on steroids.

I used to think (again, only because this is what I was taught) that the fundamental logic component was the NAND gate — but this turns out not to be the only option. Instead, if we treat the interaction between input A and input/output B, expressed using memristors, as an IMP (material implication) gate, then we can construct a NAND gate out of these.
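To sanity-check that claim, here's a tiny sketch (in Python, purely to verify the Boolean logic; the actual gates are analog memristor circuits) of material implication, IMP(a, b) = (NOT a) OR b, and a NAND built from two IMP operations:

```python
def imp(a: int, b: int) -> int:
    """Material implication: IMP(a, b) = (NOT a) OR b."""
    return int((not a) or b)

def nand(a: int, b: int) -> int:
    """NAND from two IMP operations: NAND(a, b) = IMP(a, IMP(b, 0)).

    IMP(b, 0) gives NOT b, and IMP(a, NOT b) gives (NOT a) OR (NOT b),
    which is exactly NAND.
    """
    return imp(a, imp(b, 0))

# Verify against the full NAND truth table
for a in (0, 1):
    for b in (0, 1):
        assert nand(a, b) == int(not (a and b))
```

Since NAND is functionally complete, being able to build it from IMP means any Boolean circuit can, in principle, be realized this way.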

Further, multiple layers of these memristors can be stacked above a conventional CMOS layout, and densely packed together, leading to unprecedented on-chip memory, perhaps on the order of petabits!

So, how would this change things? It would certainly deprecate the SRAM → DRAM → hard-drive pyramid of caches we have right now; we would not only have an ocean of universal memory, but our processing elements would be floating on that ocean, entirely commingled with it!

We certainly won’t need to deal with the Von Neumann bottleneck any more …

Comparative Latencies

It is usually hard to get an idea of how much the time taken by various fundamental operations varies (and it does matter), because intervals measured in nanoseconds, microseconds, and milliseconds aren't felt viscerally in the same way.

I came across the idea of representing the smallest number as a single second and everything else in terms of it, so that the relationships between the numbers are rendered on more of a human scale, which results in the following table:

Op Latency Table
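The rescaling itself is trivial to reproduce. Here's a sketch using the widely circulated ballpark latency figures (the “latency numbers every programmer should know” set — these are my assumed inputs, not necessarily the exact numbers behind the table above):

```python
# Rescale some commonly cited (approximate!) operation latencies so that
# the fastest one becomes one "human second".
latencies_ns = {
    "L1 cache reference": 0.5,
    "Main memory reference": 100,
    "Datacenter round trip": 500_000,
    "Read 1MB sequentially from SSD": 1_000_000,
    "Disk seek": 10_000_000,
    "Packet round trip, US to Europe": 150_000_000,
}

fastest = min(latencies_ns.values())
for op, ns in sorted(latencies_ns.items(), key=lambda kv: kv[1]):
    scaled = ns / fastest  # latency in "human seconds"
    print(f"{op:32s} {scaled:>13,.0f} s  (~{scaled / 86_400:,.1f} days)")
```

On this scale a main-memory reference takes a few minutes, while a transatlantic packet round trip takes years — which is exactly the visceral gap the table is trying to convey.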

I wanted to show this in a single chart, but a linear scale never shows more than the largest two values, so I had to break it into a series of smaller charts (I could have used a log scale instead, but that would have lessened the impact of seeing these numbers side by side).

Mostly similar, as long as it’s on the chip

 

This is a big jump!

Tremendous jump! Main memory latency is like 0.1% of the SSD access time!

… so imagine how much slower disk is, compared to RAM.

And if that was slow, you should check out the internet …

… exceeded only by an operation on the “machine”; this is finally when we can feel seconds ticking by.

Op Latency 6
Obviously, the worst thing you can do is try to restart the “real” system.

 

Lispium?

What if you did the following:

  • Take a Chromebook
  • Modify the Chromium build to run SBCL within it
  • Create Lisp bindings to the internal surface, so that all UI elements can be created and manipulated from within the Lisp image
  • Allow downloading, compiling, and running arbitrary Lisp code
  • Make one of the tabs always a REPL
  • Add a caching filesystem that would persist part or all of the image

… might this create a modern-day Lisp machine? Maybe.

Did I miss anything obvious here? If not, this sounds doable in a few years.

I'm lazy, so if you like this idea (I'm sure there's a guaranteed niche market for these machines), go ahead and throw it onto Kickstarter. Or something.

What has speed brought?

Many good things, to be sure, but more has been omitted.

Perhaps Kent Pitman expressed it the best:

My only problem with this is that things are ALREADY sped up. What’s the point of running a zillion times faster than the machines of yesteryear, yet still not be willing to sacrifice a dime of it to anything other than doing the same kinds of boring computations that you did before? I want speedups not just to make my same old boring life faster, but to buy me the flexibility to do something I wasn’t willing to do at slower speeds.

Is JavaScript the new Lisp?

I seem to have been blissfully unaware of just how far JavaScript has come over the last decade or so.

I wanted to make a foray into mobile app programming (ah ok, nothing serious! Just a toy game or two) — and when I looked around, it looked like I had to deeply immerse myself into either the entire iOS ecosystem or the entire Android ecosystem.

Well, in fact, it’s worse — because I first have to make that choice!

There are platform-independent alternatives — Xamarin (C#? no thanks), Titanium (maybe), and PhoneGap (heard good things!). Eventually, though, I came across this nifty open-source framework, which seemed like it would fit my use case of “a toy game project” just fine.

It was super easy to get started (hey! a simulator in the browser! — contrast that with a completely non-working emulator for Android). But very soon I ran into (what seemed like) a huge problem — how the $#% was I supposed to debug anything here?

The running server (context: the app development environment is basically a node.js environment) just printed nonsensical error messages about “native view: undefined” or some such. This was horrible! How did anyone ever use this?

And then I encountered the obvious solution: the tooling for JavaScript is just really, really good, right out of the box. The simulator mentioned earlier is just another object in my web page — so I can drill into it, print the values of any running objects, and — isn't this live-debugging nirvana? — modify them in place to fix any error and resume.

Put that in your REPL and smoke it

The JavaScript console (sorry, my experience so far has only been with Chrome) is phenomenal. I can evaluate anything I like, getting not just text, but entire arbitrary-nested hierarchical objects dumped out for my inspection.

Yes, there is the whole “dynamic types => errors from typos” problem, and I ran into this pretty early on when a missing comma gave me a lot of grief. But this is somewhat made up by the source-level debugging at the console, where I can see the problem, and fix it right away!

WTF? Everything just works? And there are tons of libraries too!

And here I was, thinking that the solution to the Lisp GUI problem was to tie in Webkit bindings to an MVC framework, to create a modern version of CLIM — but there’s already a (non-lispy) version of that out there!

(I’m confused)