Putting Sensors To Work For Us

Like I said: Optimistic.

We’re highly-evolved sensors

From emotions to intentions, our five senses take in far more information than we consciously register. While amazing, we’re not infallible. We might pay attention to noise or wrongly interpret what we observe. Sometimes, our neural pathways even become permanently crossed or inadvertently enhanced.

As humans, we problem-solve and, over time, evolve to deal with it. When a person loses a sense, other senses can step up, helping the individual cope with or compensate for the missing information. What we’re building in the technology community should evolve as well.

Technology + Connection - Privacy = ?

In an effort not to miss out, we connect, integrate, and register. We share — a lot. We’re so good at sense-making that we can’t seem to get enough of it. One study says we check our mobile phones on the order of 150 times per day. And our technology even “helps” us, monitoring and notifying us in the background on “our behalf.” It also allows us to pay attention to signals far outside of our immediate purview. Teaching ourselves to ignore more and share less in some ways goes against our nature.

In Dave Eggers’s newest novel, The Circle, he paints a picture of what happens when we give ourselves over completely to a technologically-intertwined, privacy-less, cyborg-like, public existence. It’s scary. And it doesn’t feel that far off.

Our desire for sense-making becomes signal-making for companies. The institutions with the capabilities to do so are able to take advantage of the information and us. In turn, we lose touch with the real world. We’ve tuned in, and in the process, we fail to turn on and drop out as Timothy Leary and Marshall McLuhan suggested we do.

Can we or should we put a stop to it? Doubtful. As Paul Virilio wrote, “The invention of the ship was also the invention of the shipwreck.” Like us, technology evolves.

But why should “they” get the power from our data and not “us”?

Turn it around - Individual Data Platforms

I think we should be building services — “Individual Data Platforms” — that help people do more with their own data. These platforms should be built to serve us, not observe us, with humans at the center of the technology.

These platforms would allow us to collect information that accumulates over time, even when we’re not interested or available to process it. They’d be powered by sensors, or “listeners,” that we deploy anywhere at our behest to collect information on our behalf. We could also deploy “feelers” that expose data at our direction in exchange for something of value.

For example:

I’m selling some dining room chairs: “Listener, tell me what the right price is given the activity on the web over the last 3 weeks.”

I’m considering entertaining offers for a new job: “Feeler, tell me what the market looks like right now for somebody like me.”
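The listener/feeler split above could be sketched in code. This is purely illustrative: the `Listener` and `Feeler` classes, their methods, and the sample data are all hypothetical, invented here to show the two roles — one that passively collects on your behalf, one that discloses only what you direct.

```python
# Hypothetical sketch of an "Individual Data Platform" primitive.
# All class and method names are invented for illustration.
from dataclasses import dataclass, field
from statistics import median

@dataclass
class Listener:
    """Collects observations over time on the owner's behalf."""
    topic: str
    observations: list = field(default_factory=list)

    def record(self, value):
        self.observations.append(value)

    def suggest_price(self):
        # e.g. "what's the right price given recent activity?"
        return median(self.observations) if self.observations else None

@dataclass
class Feeler:
    """Exposes a slice of the owner's data, only at their direction."""
    data: dict
    shared_fields: set = field(default_factory=set)

    def share(self, *fields):
        self.shared_fields.update(fields)

    def view(self):
        # Outsiders see only the fields the owner explicitly shared.
        return {k: v for k, v in self.data.items() if k in self.shared_fields}

# Selling dining room chairs: the listener watches asking prices.
chairs = Listener(topic="dining room chairs")
for price in [120, 95, 140, 110]:
    chairs.record(price)
print(chairs.suggest_price())  # median of observed prices: 115.0

# Entertaining job offers: the feeler shares role and city, not salary.
me = Feeler(data={"role": "engineer", "salary": 90000, "city": "NYC"})
me.share("role", "city")
print(me.view())  # {'role': 'engineer', 'city': 'NYC'}
```

The key design point is the default: the feeler discloses nothing until the owner calls `share`, inverting the usual model where platforms collect first and ask later.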

Such platforms might allow me to plug in less — and in turn, be more present in my daily life — because I won’t need to compulsively check on or keep track of the latest update.

In my prior post, I expressed skepticism of products that seek to explicitly transform my behavior (versus enhancing the outputs of my existing behaviors). That doesn’t mean I wouldn’t love to eat better, exercise more, and call Mom more often.

Maybe these platforms would help us overcome the habits and the tendencies that hold us back and make us less happy, such as overindulging, overexerting, under-experiencing, and even over-connecting.

Building Ethical Platforms

How do we keep these services above board and encourage them to continue to serve us? Brewster and IFTTT seem to be built in this spirit. And iBeacon is a technology that could be used this way.

We might adhere to a set of “principles” and key features for data platforms. Principles like:

Doesn’t this just mean we’re using technology to reverse some of the ill effects of technology? Probably. But given the benefits, I’ll take that trade.