September 2020. I feel the wet hardwood floor beneath my fingers. Summer air and water droplets touch my skin. My boots are strapped in and I am ready.

The guy gives me the signal. I tense up. He pulls a lever, and one second later I’m flying through the air.

I’m at a wakeboard park in Vienna, gliding through the cold water.

“This is life,” I think. A split second of slack on the line. I lose my balance. My head crashes against the water. Again.

These other wakeboarders make it look so effortless. And yet I can’t seem to complete a full lap.

Is there some key piece of information I’m missing?

We know what it takes to become a good wakeboarder: trial and error, a few simple rules, maybe some basic physics.

Wakeboarding is the easy part, though. It gets worse from there. Way worse.

Our limited understanding of reality is like a fog of war. We grope around in the dark trying to make sense of it all, but we bump into barriers everywhere we go.

Imagine a 3D ball moving up and down through a 2D plane. The 2D creatures living in this plane see a line appear out of nowhere, expand, contract, and disappear again. They do not understand 3D space; they have no concept of a “ball”, of “up” or “down”.
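The geometry behind the thought experiment is simple enough to compute. A minimal sketch (my own illustration, not part of the original scenario): a ball of radius R whose center sits at height z relative to the plane intersects it in a circle of radius √(R² − z²). From inside the plane, that circle is seen edge-on as a line whose width is the circle's diameter.

```python
import math

def cross_section_length(R, z):
    """Width of the 'line' a 2D observer sees when a ball of radius R
    has its center at height z relative to the plane (0 = center on plane)."""
    if abs(z) >= R:
        return 0.0  # the ball doesn't touch the plane: nothing to see
    return 2 * math.sqrt(R**2 - z**2)

# As the ball descends through the plane, the line appears,
# grows to the ball's full diameter, shrinks, and vanishes.
for z in [1.2, 1.0, 0.5, 0.0, -0.5, -1.0, -1.2]:
    print(f"z={z:+.1f}  line width = {cross_section_length(1.0, z):.2f}")
```

From inside the plane there is no hint of R or z as separate quantities, only the changing width of the line: exactly the creatures' predicament.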

Much like these 2D creatures, we do not truly grasp what’s happening around us. We are limited by our senses and mental capabilities, which were forged by evolution to survive, not to understand reality.

The 2D creatures form a model of the world around them. They know exactly when the line appears. They take it into account while traveling, so as not to bump into it. But they don’t know *why* the line appears. They don’t know the underlying principle.

Our models grow more and more complex to account for ever more precise measurements. But we still don’t know why things happen the way they do. In fact, we may never know.

Upgrading our mental models through science is slow and error-prone, but it does work. The limit seems to be evolutionary: our brain simply can’t grasp certain things. It’s limited by the hardware. AI might save us. Artificial intelligence might be qualitatively different, impossible for us to imagine. And we might be closer than we think.

By the way, I’m not talking about literally having extra spatial dimensions. Even though we are too limited to visualize higher-dimensional objects, we can at least grasp them logically. I’m talking about other kinds of dimensions or concepts that we do not even begin to grasp.

I’m also not talking about scale. Ants don’t know that we have cars and cities and electrical power lines, but it’s easy to imagine a scaled-up creature with access to technology we’re not aware of.

I’m talking about unknown unknowns, which is why I’m talking in negatives. I can’t express what I mean, because I don’t know what I mean.

The lower layers of a convolutional neural network learn about lines and colors and shades and eyes. The network learns to recognize the world around us. Except that it’s trained on a very limited subset of reality: photographs of visible light. What would these neurons learn from other types of data? Data that is not generated, directly or indirectly, by our senses.
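To make the “lines” part concrete, here is a toy sketch (my own example, not taken from any particular network): a single hand-written filter that responds to vertical edges, the kind of pattern trained first-layer convolutional filters tend to rediscover on their own.

```python
import numpy as np

# A hand-written vertical-edge filter: positive on the left,
# negative on the right, so flat regions cancel out to zero.
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]])

def convolve2d(image, kernel):
    """Minimal 'valid' 2D convolution (really cross-correlation,
    as in most deep-learning frameworks)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y+kh, x:x+kw] * kernel)
    return out

# A tiny image: dark on the left, bright on the right.
image = np.array([[0, 0, 0, 9, 9, 9]] * 5)
response = convolve2d(image, kernel)
# The response is zero in the flat regions and strong (in magnitude)
# exactly at the dark-to-bright boundary.
```

A full network stacks thousands of such filters and learns their weights from data; the point is that what they can learn is bounded by what the training data, ultimately our senses, contains.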

Maybe the Internet will save us. Massive, chaotic whirls of thought disperse through the cloud. Gigantic thought battles are being won and lost every second. Ideas spread and infect like viruses through the network. The individual neuron doesn’t know it’s part of a gigantic brain.

It feels like we are at the very beginning of our journey.