🥶 Some more cautionary notes on Artificial Intelligence

Previous threads have dealt with some of the issues surrounding technology and its (un)intended consequences here and here, amongst others. I thought this latest topic noteworthy as a Roomba owner myself. While not dream-related, it does raise some questions about our technological environment.

From the article:
While the images shared with us did not come from iRobot customers, consumers regularly consent to having our data monitored to varying degrees on devices ranging from iPhones to washing machines. It’s a practice that has only grown more common over the past decade, as data-hungry artificial intelligence has been increasingly integrated into a whole new array of products and services. Much of this technology is based on machine learning, a technique that uses large troves of data—including our voices, faces, homes, and other personal information—to train algorithms to recognize patterns. The most useful data sets are the most realistic, making data sourced from real environments, like homes, especially valuable. Often, we opt in simply by using the product, as noted in privacy policies with vague language that gives companies broad discretion in how they disseminate and analyze consumer information.


Glad you shared that. Sad to say, I think that is just the tip of the iceberg of what these devices do to people. And I think that if people knew what these companies really do with their data, no one would be using their products.

Said it before, but it's worth saying again: we are in a new age of slavery.
