
The next, next big thing

The web had its day. Mobile is already peaking. So what's next?

In my old age, at least by the standards of the computing industry, I’m getting more irritated by smart young things who preach today’s big thing, or tomorrow’s next big thing, as the best and only solution to my computing problems.

Those who fail to learn from history are doomed to repeat it, and the smart young things need to pay more attention, because the trends underlying today’s computing should be evident to anyone with a sufficiently good grasp of computing history.

Depending on the state of technology, the computer industry oscillates between thin- and thick-client architectures. Either the bulk of our compute power and storage is hidden away in racks of (sometimes distant) servers, or it is spread across a mass of distributed systems closer to home. This year’s reinvention of the mainframe is called cloud computing. While I’m a big supporter of cloud architectures, at least at the moment, I’ll be interested to see those preaching it as the last and final solution to all our problems proved wrong, yet again, when computing power catches up to demand once more and you can fit today’s data center inside a box not much bigger than a cell phone.

Thinking that just couldn’t happen? Think again, because it already has. The iPad 2 beats most supercomputers from the early ’90s in raw compute power, and it would have been on the worldwide TOP500 list of supercomputers well into 1994. There’s no reason to suspect, at least for now, that that sort of trend won’t continue.

Yesterday’s next big thing

Yesterday’s “next big thing” was the World Wide Web. I still vividly remember standing in a draughty computing lab, almost 20 years ago now, looking over the shoulder of someone who had just downloaded the first public build of NCSA Mosaic via some torturous method. I shook my head and said, “It’ll never catch on. Why would you want images?” That shows what I know. Although, to be fair, I was a lot younger back then. I was failing to grasp history because I was neither well read enough, nor old enough, to have seen it all before. And since I still don’t claim to be either well read or old enough this time around, perhaps you should take everything I’m saying with a pinch of salt. That’s the thing with the next big thing: it’s always open to interpretation.

The next big thing?

The machines we grew up with are yesterday’s news. They’re quickly being replaced by consumption devices, with most of the rest of day-to-day computing moving into the environment and becoming embedded in people’s lives. This will almost certainly happen without people noticing.

While it’s pretty obvious that mobile is the current “next” big thing, it’s arguable whether mobile itself has already peaked. The sleek lines of the iPhone in your pocket are already almost as dated as the beige tower that used to sit next to the CRT on your desk.

Technology has not quite caught up to that vision of computing embedded in the environment, and neither have we: we’ve been trying to reinvent the desktop computer in a smaller form factor. That’s why the mobile platforms we see today are just stepping stones.

Most people just want gadgets that work, and that do the things they want them to do. People never really wanted computers; they wanted what computers could do for them. The general-purpose machines we think of today as “computers” will naturally dissipate into the environment as our technology gets better.

The next, next big thing

To those preaching cloud computing and web applications as the next big thing: they’ve already had their day, and the web as we know it is a dead man walking. One look at the job board at O’Reilly’s Strata conference earlier in the year makes the next big thing obvious. It’s data. Heck, it’s not even the next big thing anymore. It’s pulling into the station, and to data scientists the web and its architecture are just a commodity. Bought and sold in bulk.

Photo: The overflowing job board at February’s Strata conference.

As for the next, next big thing? Ubiquitous computing is the thing after the next big thing, and almost inevitably the thirst for more data will drive it. But then eventually, inevitably, the data will become secondary — a commodity. Yesterday’s hot job was developer; today, with the arrival of Big Data, it’s mathematician. Tomorrow it could well be hardware hacker.

Count on it. History goes in cycles and only the names change.

  • Canuck

    I agree with a lot of what you wrote — seen it, done it — but it’s not true that things are strictly cyclical. Behind the rapid industrial cycles over the past 200 years, for example, there has been a real trend away from people being workers (mindless cogs in a big production machine) towards people being autonomous agents (sitting at home or in an office working on a computer, etc.). In a sense, that’s a return to the craftsman/artisan culture from before the Industrial Revolution (substitute cobbler’s tools, a spinning wheel, etc. for the computer), but not really, because it’s open to far more people than the life of an artisan was.

    In tech itself, while fads like mobile apps, AJAX, social media, etc. come and go, there has been a general upward trend from closed to open. People expect to be able to make their own choices about what they do with information and devices. While there are temporary backwards steps (like Apple’s locked-down iOS), the general trend is towards open: get whatever information you want, and use it any way you want. It’s not just a matter of a closed device working, like a toaster or VCR — kids (like my teenagers) don’t accept the kinds of technological boundaries and fences we did 20-30 years ago.

    This trend isn’t a quick fad, and it’s happening slowly — people aren’t going to throw out their iPads tomorrow, or even next year — but we can probably both agree that any strategy based on overly strictly controlled platforms and information will eventually end up on the wrong side of history.

  • http://web.ncf.ca/shawnhcorey/ Shawn

    “It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years.”

    — John von Neumann, circa 1960

  • http://www.sidlr.com han

    It is a narrow view to think of the Web as already tried and tired.

    In fact, we have only just scratched the surface, unless you take the narrow, narrow definition of the World Wide Web as HTML documents that live at a URL.

    The Web is the Web of Things, encompassing mobile devices (including the fancy iPhones), big data, geolocation check-ins, micro-commerce and all.

    As we trudge ahead, we need to continuously measure and, at times, retrofit our latest gadgets and whims into the grander scheme of things. Don’t get tunnel vision pursuing the next bling-bling and coining it the next Web 3.0.

  • rickhap

    My comment, with a supporting example of how people are making money from packaging and presenting data, was deleted. The comment was appropriate, and it supported your statement that data is the next big thing.

  • http://radar.oreilly.com/reichental/ Jonathan

    I don’t think we’ve even started to see the magnitude of change ahead of us as a result of an always-on, socially-connected, mobile-enabled, information-at-the-fingertips global culture.

  • wcoenen

    There isn’t any reason to suspect that, at least for now, that sort of trend isn’t going to continue.

    Except for the fact that transistors cannot continue to shrink at the usual rate: they would be atom-sized sometime around 2020.

  • http://radar.oreilly.com/aallan/index.html Alasdair Allan

    Except for the fact that transistors cannot continue to shrink at the usual rate: they would be atom-sized sometime around 2020.

    Not necessarily; that’s not really all that relevant anymore. Have you heard about Intel’s new Tri-gate transistor (http://arstechnica.com/business/news/2011/05/intel-re-invents-the-microchip.ars)? Take this, other advances like silicon photonics (http://arstechnica.com/gadgets/news/2010/07/the-future-of-electronics-is-frickin-hybrid-silicon-lasers.ars), and a general trend away from single- to multiple-core architectures, and the days when we relied on transistor size shrinking to provide better performance are long gone.

  • wcoenen

    …, and the days when we relied on transistor size shrinking to provide better performance are long gone.

    I don’t think so. The commercialization of multi-gate transistors (like Tri-gate) demonstrates that the industry is determined to keep shrinking silicon transistors to the bitter end. They do this in spite of gigantic R&D costs precisely because performance improvements are still driven by shrinking transistors. But by 2015 we’ll have 11nm processes with mono-layer gate dielectrics, and you can’t shrink atoms.

    Multi-core architectures are just a way to spend the increasing transistor budget in an energy efficient manner. If we started doubling the number of cores each generation without shrinking them, chips would soon be room-sized again. Multiple cores are not an alternative to shrinking transistors, but rather a consequence.

    Neither is silicon photonics: the need for fast interconnects on a chip is a consequence of the multi-core architecture, not an alternative to shrinking transistors.

  • Hugo

    It is not about gadgets. It is all about information. Gadgets just move the information faster.

    Information equals power equals money equals information.

    In particular, watch the screws tighten on information ownership as the speed of exchange goes up. There is already a booming black market in information.

    Look at it from this viewpoint and you can imagine where the gadgets are all going.

    Historical records show it all, back to the creation of the campfire!

  • Robert Liley

    I share your opinion, Alasdair. In fact, I published an article about ‘The Cloud’ back in March. I am older than you and I’ve lived through a lot of the changes to which you refer. You can read my piece at blog.gssocx.com/2011/03/30/cloud-computing-the-more-things-change-the-more-they-remain-the-same/. While there have been occasional flurries of demand-driven computing, many of the changes we see have been vendor-driven, as vendors exploit opportunities to increase sales and revenues. The key question is, what increased value do we really expect to derive from ‘The Cloud’?

  • Robert Liley

    Alasdair, I also agree with much of what you say. Check out my post ‘Cloud Computing-the more things change, the more they remain the same’.