AI abstinence won't work
Briefly

"There's the latent ickiness of its manufacturing process, given that the task of sorting and labeling this data has been outsourced and underappreciated. Lest we forget, there's also the risk of an AI oopsie, including all those accidental acts of plagiarism and hallucinated citations. Relying on these platforms seems to inch toward NPC status, and that's, to put it lightly, a bad vibe."
"Without our consent, the internet was mined and our collective online lives were transformed into the inputs for a gargantuan machine. Then the companies that did it told us to pay them for the output: a talking information bank spring-loaded with accrued human knowledge but devoid of human specificity. The social media age warped our self-perception, and now the AI era stands to subsume it."
AI training requires immense energy and contributes to runaway carbon emissions. Widespread data collection has produced numerous privacy failures, including exposed ChatGPT conversations and prolonged retention of user chat history during legal disputes. Data labeling and curation work has been outsourced and undervalued, creating ethical and labor concerns. Large language models produce errors such as plagiarism and hallucinated citations. Reliance on AI tools promotes cognitive offloading that erodes skills like navigation and knowledge retention. Internet content was mined without consent to build these models, then monetized as homogenized outputs lacking human specificity. Some people avoid AI as a form of cognitive and ethical self-preservation.
Read at Fast Company