5 cloud computing conundrums

CIOs will need to unravel some challenging near-term puzzles to succeed in the cloud.

With all the attention being paid to both public and private cloud computing these days, it would be easy to believe that it offers a panacea for the woes of every CIO. If only! The reality of designing and implementing a cloud strategy, particularly the public component, is far more complex than any technology vendor or analyst paper would have you believe. Faced with an array of trade-offs, CIOs and their teams find that public cloud computing creates considerable challenges.

Like every new technology paradigm that has come before, cloud computing presents both clear advantages and near-term limitations that need to be addressed. (I deliberately say near-term, as IT innovation has a neat way of figuring things out eventually. Sadly, not always when you need it, and certainly not for the benefit of early adopters.)

With the C-suite continuing to apply pressure to get more value from IT and reduce cost, moving technology services into an externally hosted environment or subscribing to an online business solution can be a quick and convenient win. But can a strategy like this be applied successfully in a repeatable fashion without significant trade-offs?

While every business needs to consider public cloud computing in the context of its own needs and risk profile, I’ve identified a sample of puzzles that most CIOs will likely need to address. There are many others of course, but these should be sufficiently provocative.

Puzzle #1: Create flexibility by being less flexible

Moving capability to the cloud can provide clear advantages such as storage elasticity (the ability to increase or decrease needs as necessary and only pay for the amount used) and pay-per-feature options. But these flexibilities may come at the price of vendor lock-in and limiting feature sets. Will this compromise be acceptable? Difficulty level: Medium.

Puzzle #2: Determine the cost of an existing IT solution

Deciding whether an IT service should remain internal or be hosted in the cloud requires a level of cost accounting (the true costs of labor, utilities, backups, disaster recovery, etc.) that is seldom applied to running a technology service. This puzzle requires the CIO to understand and allocate the appropriate costs for each service being considered for the cloud. Hint: Don’t forget to include opportunity costs. Difficulty level: High.
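This puzzle is ultimately a back-of-the-envelope accounting exercise. A minimal sketch of the comparison involved might look like the following — every cost figure and category here is an illustrative placeholder, not real pricing data, and the right line items will vary by organization:

```python
# Hypothetical annual-cost comparison: in-house service vs. public cloud.
# All figures are made-up placeholders for illustration only.

def in_house_annual_cost(hardware_amortized, labor, power_cooling,
                         backups, disaster_recovery, opportunity_cost):
    """Sum the often-overlooked components of running a service internally."""
    return (hardware_amortized + labor + power_cooling
            + backups + disaster_recovery + opportunity_cost)

def cloud_annual_cost(monthly_subscription, migration_amortized,
                      integration_support):
    """Cloud costs: the subscription plus one-off and ongoing overheads."""
    return monthly_subscription * 12 + migration_amortized + integration_support

internal = in_house_annual_cost(
    hardware_amortized=12_000,   # servers amortized over three years
    labor=30_000,                # fraction of sysadmin time
    power_cooling=4_000,
    backups=3_000,
    disaster_recovery=6_000,
    opportunity_cost=5_000,      # staff time not spent on new projects
)
cloud = cloud_annual_cost(
    monthly_subscription=2_500,
    migration_amortized=8_000,   # one-time migration spread over year one
    integration_support=10_000,  # glue work with remaining in-house systems
)

print(f"in-house: ${internal:,}/yr  cloud: ${cloud:,}/yr")
```

The point is less the arithmetic than the line items: hidden costs such as disaster recovery and opportunity cost are exactly the ones that rarely make it into the internal figure, which is why the comparison is so often done wrong.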

Puzzle #3: Simplify the environment by introducing more complexity

Move a complex business process to a software-as-a-service (SaaS) provider and you immediately eliminate the complexity of developing, managing, and hosting the solution internally. However, move lots of processes to a variety of providers and you may introduce challenges in getting these applications to interface with each other. You also provide a considerably less unified experience to the user. While standard APIs ease the flow of data, supporting disparate vendor solutions adds a new level of complexity. Difficulty level: Medium.

Puzzle #4: Provide assurances of sustainability in a domain of uncertainty

Public cloud solutions remain largely nascent and unproven over the long term. With the benefits so compelling, it can be hard to resist moving forward with what may appear to be a great fit. With little ability to ensure that the solution will be available over the long term, the challenge is to obtain assurances from providers and pass them on to already skeptical stakeholders. Difficulty level: High.

Puzzle #5: Maintain security while reducing it

Providing a secure computing environment is a top priority for every CIO. With threats increasing and becoming ever more elaborate, this is a space with little room for error or oversight. By moving services to the cloud, you are essentially outsourcing part of your security, trading direct control for trust in the provider’s practices. Difficulty level: High.

One could assume from this posting that I’m not supportive of the movement to the public cloud. But nothing could be further from the truth. The opportunities for us — such as lower cost, increased agility, and new business possibilities — are obvious and compelling. We embrace innovation earlier than most, but we want to be smart about it. In fact, it is our own efforts at O’Reilly IT that have sparked these discussion points. Given what’s at stake, a deliberate and diligent approach is absolutely essential. It’s clearly not all or nothing. We’ll migrate only what makes sense. Since moving services to the public cloud is often a unidirectional process (they’re unlikely to move back in-house without significant cost and serious disruption) it’s important to avoid buyer’s remorse.

If you’ve solved some of these puzzles we’d love to hear how you did it and any trade-offs you had to make. We’re also interested in other conundrums that cloud computing presents.


  • For #1, we’re trying to make this a non-issue by building OpenStack, an open source, open design platform. We have a growing community and hope other vendors will begin to deploy it, giving public cloud customers vendor options without having to change how they interact with their services. Much like how you can get a LAMP stack from anyone, we hope OpenStack can help provide the same standard for cloud services.

  • First of all, great article. Being in this business, I think you’re right on with your insights.

    Cloud computing has a lot of pros and some challenges that the industry hasn’t fully addressed yet. Looking only at the infrastructure component of cloud (memory, storage, raw compute power), the pros can be pretty compelling when compared to either deploying servers in your own enterprise or utilizing dedicated hosting services. We (I work for Qwest Business) did a simple calculation that showed the cost of deploying ~10 virtual servers was less than the cost of just power and space for a rack in a dedicated environment. With the proper management portal, an enterprise can quickly deploy and manage servers as needed with as much control as an in-house dedicated server, and without a lot of the hassles.

    The biggest challenge is that not every application will work in a virtualized environment, and it is important to test thoroughly to verify that everything performs correctly.

    When a company is ready to move to a virtual private cloud, we feel the biggest challenge is to look at the physical infrastructure and then translate that into the virtual architecture. While many factors influence this decision, some key parameters are: the utilization rate of the physical environment and, ultimately, how the virtual environment will be used; application dependencies; and dynamic provisioning and dynamic resource allocation.

    I’d love to hear what others have to say about this… hint hint, more comments :)

    Beth – Qwest Business


  • I’ve not worked in many organisations, but even so I doubt that there are many places where the CIO – when such a post exists – has ever had total control of technology choices. OK, if there are such places, they’re still using mainframes and the work is called data processing.

    Developers will use cloud computing. In part they’ll use it because they can experiment on their own, or share, but it’s their choice. The cost of trying new things is low, the cost of abandoning and starting again is low.

    In some organisations the choice of developers to use “unsupported” technology will force them to move on. In others it will cause the organisation to change. We’ll soon know which is which, by the level of cloud support for the applications these organisations deliver.

    As for the “puzzles” — is this a test? Sure, technical problems are only ever “medium” whereas things that drive board-level discussion are “high” — I’m a civil servant, we’re familiar with this “pay grade” problem. There’s a simple solution, as always: seek improvement, not utopia.

  • Thank you for your comments, which I read with interest. I appreciate the time you took to write them, and they do influence my thinking.