"Instead of being confined to phones and computers, our digital lives would extend to a network of screens all around us, from connected TVs and smart fridges to kitchen countertop displays and car dashboards. The tech companies called this "ambient computing" or "ubiquitous computing" and extolled how it would get technology out of the way so we could focus on the real world. Here's what we've got instead: They got the ambient part right, in that we've now surrounded ourselves with screens we don't control. But instead of blending into the background, the screens are now doing the opposite, distracting us with ads in hopes of padding their makers' bottom lines."
"Ambient computing got its start in a more idealistic setting, in the late 1980s at Xerox Palo Alto Research Center. Mark Weiser, then the head of PARC's computer science lab (and later its chief technology officer) used the term "ubiquitous computing" to describe how an array of screens in various sizes-" tabs, pads and boards "-would all work in tandem to help people accomplish everyday tasks. "Machines that fit the human environment, instead of forcing humans to enter theirs, will make using a computer as refreshing as taking a walk in the woods," he wrote."
Ambient computing originated at Xerox PARC in the late 1980s with Mark Weiser's vision of coordinated screens of varying sizes working together to assist with everyday tasks. Advances in processors, displays, and connectivity later made the concept technically feasible, prompting major tech companies to experiment with screens for kitchens, walls, and shared spaces. The ambient part succeeded: people are now surroundedded by networked screens. But rather than receding into the background to support everyday activity, those screens now prioritize advertising and distraction, as the commercial drive for monetization displaced the original user-centered intent.
Read at Fast Company