Last week, Google released Gemini, its new AI model, with an (apparently) impressive and well-produced demonstration of the system's capabilities. In the video, shot from an overhead view, the model seems to capture, understand, and react to every action the user's hands perform.
It's impressive to see software at the point where it can understand our actions as well as produce the outcomes we ask for. We are definitely entering a new age of possibilities; only time will tell how much it will disrupt different industries.
However, while watching the video, I couldn't help thinking about the dark side of this. A system that tracks and understands everything you do, built by a company that has spent decades tracking everything we do with our devices, gathering all our data almost without our permission, to keep us in front of its apps as long as possible so it can serve as many ads as possible and make as much money as possible.
A system that can now do that not just with everything we do on our devices, but with everything we do in our lives.
After years of learning how technology produces hundreds of undesired side effects (addiction, social anxiety, and polarization, among many others), it's impossible not to see the dark implications of this new technology.
So I decided to share my thoughts on LinkedIn.
A few minutes later, I got a message from a connection who asked: “What if they use it for people with visual disabilities? So they can recognize objects, and paper money, among other things. They’ll benefit by reducing their dependency on others to be assisted and do many other activities”.
That’s fair. It’s an amazing perspective on the potential of this tool.
The point is that in recent weeks I have had this conversation about the dark sides of technology several times. Often the reaction is a positional debate, a perspective that sets itself in opposition to the very idea of the damage. Be optimistic, be positive, don’t be negative. It seems you need to decide between good and evil, the light or the shadows, the Jedi or the Sith: you have to choose.
In my opinion, that’s the wrong way to understand technology. It’s not that binary.
Responsibility isn’t binary either. There is no choosing. You cannot choose only the intended, good consequences of your work; you are also responsible for the unintended or ugly ones. If your work produces, or can produce, any damage, you are as responsible for it as for the good impact it might have. Choosing only one side is avoiding responsibility; it’s intentional blindness.
It’s both; it’s a duality. And that’s why I believe we need to design and build with both in mind, taking care of both: focusing on creating new, unimaginable value for humanity while diminishing the potential damage.
Understanding that “duality” of technology is the start of taking responsibility. And in times when people often seek to avoid responsibility, taking it also seems to require courage.
And by the way, as a note: it turns out the Gemini magic… was not that magical.
Aitor González, founder at bettter.