For April Fools' Day I'm offering a short story about a future world
that has moved entirely to cloud computing: Hardware
The cloud still scares as many IT managers as it attracts. But the
advantages of cloud computing for maintenance, power consumption, and
other concerns suggest it will dominate computing in a decade or so.
Meanwhile, other changes are affecting the way we use data every
day. Movements such as NoSQL, big data, and the Semantic Web all
come at data from different angles, but they indicate a shift from
retrieving the individual facts we want to examining relationships among
huge conglomerations of data. I've explored all these things in blogs
on this site, along with other trends such as shrinking computing
devices, so I decided to combine them in a bit of a wacky tale.