On metric time

HumanTime

About a decade ago, I thought briefly about how the day would feel if we had some sort of metric units instead of hours & minutes, then forgot about it.

About six years ago, I had an Android phone and wondered what it would take to make an app for this. At the time, I looked into OpenGL support and had a basic screen working, but the state of app development seemed really grungy to me, and I forgot about it again.

About a year ago, I had an iPhone, and wondered what it would take to make an app for this. I had still never made a mobile app, but SwiftUI had recently been released, and … the state of app development seemed way less grungy, and SwiftUI did live up (mostly) to its promise of being a (mostly) declarative framework … so I made a prototype (using centi-days and milli-centi-days), and then forgot about it again.
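
For a sense of the arithmetic behind those units, here’s a minimal sketch (hypothetical, and not the app’s actual code; the function name metricTime is mine) of converting wall-clock time into centi-days and milli-centi-days:

```swift
import Foundation

// Divide the day into 100 centi-days, and each centi-day into 1000
// milli-centi-days. One centi-day is 14.4 minutes; one milli-centi-day
// is 0.864 seconds.
func metricTime(for date: Date = Date(),
                calendar: Calendar = .current) -> (centiDays: Int, milliCentiDays: Int) {
    let midnight = calendar.startOfDay(for: date)
    let elapsed = date.timeIntervalSince(midnight)   // seconds since midnight
    let dayFraction = elapsed / 86_400               // fraction of the day elapsed
    let units = Int(dayFraction * 100_000)           // 100 * 1000 units per day
    return (units / 1_000, units % 1_000)
}

let now = metricTime()
print("\(now.centiDays).\(String(format: "%03d", now.milliCentiDays))")
```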

Last month, I finally pushed it “over the line”. I had to fill out some forms, make some placeholder icons, and come up with a name.

I’ve always thought of the metric system as being based on “tens”, and “ten” feels very “human”, so I called it HumanTime. (I don’t have the necessary free time these days, but it would be good to eventually round this off with a Watch app too.)

Anyway, there’s no moral to this story, other than “I built a (small) thing”, “it’s good to make things”, and “I wish I’d done this earlier”.

If you want to take a look, it’s on the App Store: here.

#making
#apps

Comparative Latencies

It is usually hard to get an idea of how much the time taken by various fundamental operations varies. It does matter, but it’s hard to feel it viscerally: intervals of nanoseconds, microseconds, and milliseconds just aren’t perceived in the same way that seconds, days, and years are.

I came across the idea of representing the smallest number as a single second, and everything else in terms of it, so that the relationships between the numbers are rendered at a more human scale. This results in the following table:

[Table: operation latencies, rescaled so that the fastest operation takes one second]
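
To make the rescaling concrete, here’s a minimal sketch of the computation (the operation names and nanosecond figures below are my order-of-magnitude placeholders, not measurements from any particular machine):

```swift
import Foundation

// Order-of-magnitude latencies in nanoseconds; illustrative assumptions only.
let latencies: [(name: String, ns: Double)] = [
    ("L1 cache reference", 1),
    ("Main memory reference", 100),
    ("SSD random read", 100_000),
    ("Disk seek", 10_000_000),
    ("Network round trip", 100_000_000),
]

// Rescale so the fastest operation takes one "human" second,
// then print everything else in those stretched units.
let base = latencies.map { $0.ns }.min()!

let formatter = DateComponentsFormatter()
formatter.allowedUnits = [.year, .day, .hour, .minute, .second]
formatter.unitsStyle = .full

for op in latencies {
    let humanSeconds = op.ns / base
    print("\(op.name): \(formatter.string(from: humanSeconds) ?? "\(humanSeconds) s")")
}
```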

I wanted to show this in a chart, but the range is so vast that a single chart never shows more than the largest two values, so I had to break it down into a series of smaller charts. (I could have used a log scale to fit them all in one, but that would have lessened the impact of seeing these numbers side by side.)

Mostly similar, as long as it’s on the chip


This is a big jump!

Tremendous jump! Main memory latency is like 0.1% of the SSD access time!

… so imagine how much slower disk is, compared to RAM.

And if that was slow, you should check out the internet …

… exceeded only by an operation on the “machine”; this is finally when we can feel seconds ticking by.

Obviously, the worst thing you can do is try to restart the “real” system.


But then I had a whimsical thought: the sort of thing that seems at once not-impossible and yet such a long shot that one can just relax and enjoy exploring it, without feeling under pressure to produce a result in any particular timeframe (and yet I have moved my thinking forward on this over the years, which keeps it interesting).

What if we could find a way to take advantage of the fact that our logic is embedded in a computational system, by somehow bleeding off the paradoxes into mere nontermination? So that they produce the anticlimax of functions that don’t terminate, instead of the existential angst of inconsistent mathematical foundations?
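
(As a tiny, concrete glimpse of the idea: in a language with general recursion, the classic Russell-style construction already behaves this way. The names below, Bad and refute, are just illustrative; this is the standard negative-recursive-type trick, sketched in Swift.)

```swift
// A "negative" recursive type: a Bad wraps a function from Bad to Never.
// Reading Bad -> Never as "not Bad", a value of Bad asserts its own
// refutation, which is a small Russell-style paradox.
enum Bad {
    case wrap((Bad) -> Never)
}

// From any Bad we can derive Never ("false") by feeding the wrapped
// refutation its own wrapper ...
func refute(_ b: Bad) -> Never {
    switch b {
    case .wrap(let f):
        return f(b)
    }
}

// ... and the refutation itself gives us a Bad. "Proving false" is then
// just an expression that never terminates (don't actually run this):
// refute(.wrap(refute))
```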