The Entropy of Online Intimacy

Virtual Intimate Objects

Dmitry Zinoviev
3 min read · Oct 30, 2021


A Virtual Intimate Object (VIO) was a low-bandwidth application from 2005, now defunct, that allowed a user to “send one bit of message to the other through Internet, … opening up a valuable and novel channel for communicating intimacy among couples” (http://intimateobj.sourceforge.net/). It looked like a round button in the taskbar. When a user clicked the button, their partner’s button blinked, telling them that someone was thinking about them.

A cool, though somewhat useless, app. But I do not mind its uselessness; I mind its description. In my opinion, a VIO does not send one bit of information.

“In information theory, one bit is the information entropy of a binary random variable that is 0 or 1 with equal probability” (Wikipedia). Yet, in the case of a VIO, the variable is NOT binary and the probability of the outcomes is NOT equal.
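To make the definition concrete, here is a minimal sketch (the function name `binary_entropy` is my own) computing the entropy of a binary variable. For a fair coin, where both outcomes have probability 0.5, it yields exactly one bit:

```python
import math

# Entropy, in bits, of a binary variable with outcome
# probabilities p and 1 - p.
def binary_entropy(p):
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(binary_entropy(0.5))  # 1.0 — a fair coin carries exactly one bit
```

Skew the probabilities away from 0.5 in either direction, and the entropy drops below one bit, which is the whole point of what follows.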

First, the communicated variable has at least three states: “Thinking about you” (transmittable), “Thinking about you but not being able or not willing to use the VIO” (not transmittable), and “Not thinking about you” (not transmittable). If these states are equally probable, the entropy of a blinking VIO is log2(3)≈1.58 bits.
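The log2(3) figure is easy to check numerically: with three equally likely states, each with probability 1/3, the entropy sum collapses to log2(3).

```python
import math

# Entropy of three equiprobable states: each term is (1/3) * log2(1/3),
# and the total simplifies to log2(3) ≈ 1.58 bits.
states = 3
entropy = -sum((1 / states) * math.log2(1 / states) for _ in range(states))
print(round(entropy, 2))  # 1.58
```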

However, the states are, in general, not equally probable. Unless you are a young passionate Romeo deeply obsessed with Juliet, or an old (30–40 y/o) passionate Othello deeply obsessed with Desdemona, most of the time you are probably NOT thinking about the object of your passion. The probability of the third state, p3, is likely to be close to 1. Let’s assume that, realistically, p3=0.9.

In those rare moments when you are not asleep, not eating, not committing your edits to Git, not attending a corporate Friday meeting, and not watching Dune or James Bond, you are unlikely to have access to your communication device (remember, the VIO was at its peak in 2005). So p2 is probably around 0.08. Finally, p1 is 0.02 to make the set of states complete.

By the definition of entropy, H=-sum(px*log2(px)), or -(sum(x*math.log2(x) for x in (0.9, 0.08, 0.02))) if you are a Python programmer. Substituting the numbers from above, we get H≈0.54 bits. That’s only half of what had been promised!
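The one-liner above can be wrapped into a small reusable function (a sketch; the name `entropy` and the zero-probability guard are my additions) to confirm the 0.54-bit figure:

```python
import math

# Shannon entropy of a discrete distribution, in bits.
# Terms with p = 0 contribute nothing, so they are skipped.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# p1 = thinking and pinging, p2 = thinking but unable to ping,
# p3 = not thinking about the partner at all.
print(round(entropy([0.02, 0.08, 0.9]), 2))  # 0.54
```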

But what if you are a Romeo or an Othello, sleeplessly spending days and nights by your workstation and pinging your significant other? Then the probabilities flip: p1=0.9, p2=0.08, and p3=0.02. Yet, according to the entropy equation, H≈0.54 bits, again, because entropy depends only on the set of probabilities, not on which state carries which probability. Surprise!
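The “surprise” is just the permutation invariance of entropy, which a two-line check makes explicit (again a sketch, reusing a hand-rolled `entropy` helper):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The indifferent user and the obsessed Romeo have the same
# multiset of probabilities, just attached to different states,
# so their entropies coincide.
indifferent = [0.02, 0.08, 0.9]  # rarely thinking of the partner
obsessed = [0.9, 0.08, 0.02]     # almost always thinking of them
print(math.isclose(entropy(indifferent), entropy(obsessed)))  # True
```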

The bottom line: if a variable is much more likely to take one value than any other, then learning its actual state conveys, on average, very little information. If someone loves you, you most likely already know that. Naturally, this does not mean that seeing a blinking VIO light is not pleasing.

My books on Python, Data Science, Network Analysis, and Software Engineering, published by Pragmatic Bookshelf.


Dmitry Zinoviev

Dmitry is a prof of Computer Science at Suffolk U. He loves C and Python programming, complex networks, computational soc science, and digital humanities.