In their race for global dominion, many technology companies are ignoring this point of equilibrium between privacy and revenue. Facebook has long since passed it. They keep wallowing in the mud, failing to grasp that the issue isn’t one of product design, but of their business model.
Other companies are following suit. The need for data in advertising models is but one example. With the rise of Deep Learning and new AI methods, the need for datasets is greater than ever.
All of these models have a point of equilibrium; let’s call it the Functionality-Privacy point (FP). If companies push past this point, users will suffer. The loss of privacy isn’t just a moral matter, but a mental health one. Coercing users into submitting to engagement will break people’s mental balance and their balance in life.
Go Deeper: If the use of data for advertising models is terrible, the aggressive use of data in AI models is exponentially worse. When companies rest their product functionality on user data, they should think long and hard about the limitations of such a model. When every system we engage with is requesting, measuring, or siphoning behavioral data without consent, users will rebel. We need to keep the Functionality-Privacy balance in every company we build.
Data shouldn’t be the economic linchpin of a company. Most AI systems require data, and that’s acceptable to a certain extent. Nonetheless, the primary revenue model of a company shouldn’t rest on data gathering. If data is at the heart of the model, the company will almost surely invert its priorities, from serving users to serving the capital markets.
In a world where most organizations are heading that way, running a balanced operation will become an asset, not a drawback. It’s not far-fetched to predict that a whole new crop of companies will make that balance their mighty sword. Apple is already capitalizing on it, but time will tell whether its underlying model for the next generation of products is pushed beyond the FP limit.