It’s been debated within organizations for as long as I can remember: whether it’s possible to support a workforce that has the choice to use their own computers to perform their work. Recently the discussion has reached new levels of excitement as some big-name organizations have initiated pilot programs. For IT leaders it’s a prospect that’s both compelling and daunting.
Technology developments over the years, such as the introduction of the web browser and Java, have made software more hardware agnostic. Personal computers have largely become commodity items and their reliability has significantly improved. Yet despite these advances, bringing your own computer (BYOC) to work has remained an elusive goal.
Why bring your own computer to work?
From an IT leader’s perspective, the reasons for supporting BYOC are pretty clear. In an environment where CEOs want more of the organization’s dollars assigned to value-creating investments and innovation, the ongoing cost of asset management continues to be an unfortunate overhead. From procurement and assignment to repairs and disposal, managing large numbers of personal computers represents a significant dollar amount on a CIO’s budget.
The second driver is the desire of employees to use the equipment they are most comfortable with to do their jobs. We know that for most, a personal computer is not simply a black box. From wallpaper to icon positions, a computer often represents an extension of the individual. If anyone needs more convincing, just try to pry an Apple computer away from its user and replace it with a Windows machine (and vice versa). People have preferences. Enterprise-provided computers are a reluctantly accepted reality.
Why can’t we bring our own computers to work?
With these compelling reasons and more supporting BYOC, why has it not happened? The first reason that comes to mind for most IT leaders is the nightmare of trying to support hardware from a myriad of vendors. It flies in the face of standardization, which largely helps to keep costs and complexity down. In addition, organizations have continued to build solutions that rely on specific software and hardware requirements and configurations. Finally, there is both a real and perceived loss of control that makes most security and risk professionals shudder.
With all that said, there are now some substantive reasons to believe BYOC may soon become a reality for many organizations.
The times they are a-changing
[Many of you can skip this brief history recap] When the web browser emerged in the 1990s, there was some optimism that it would herald the beginning of a world where software would largely become hardware agnostic. Many believed it would make the operating system (OS) largely irrelevant. Of course, we know this didn’t happen: software vendors continued to build OS-dependent solutions, and organizations recommitted to large-scale, in-house ERP implementations that created vendor lock-in. At the time, browser technology was inadequate, hosted enterprise applications were weak and often absent for many business functions, and broadband was expensive, inconsistent, and often unreliable across the U.S.
Skip forward and the situation is markedly different. Today we have robust browsers and supporting languages, reliable broadband, and enterprise-class applications delivered by hosted providers. It’s also no longer uncommon for staff to use non-business-provided, cloud-based consumer applications to perform their work.
Oh to be a start-up! If we could all redo our businesses today, we’d likely avoid building our own data centers and most of our applications. This is one of the promises of cloud computing. And while there will be considerable switching costs for existing organizations, the trend suggests a future where major business functions that are provided by technology will largely be non-competitive, on-demand utilities. In this future state it’s entirely possible that hardware independence will become a viable reality. With the application, data, business logic, and security all provisioned in the cloud, the computer really does simply become a portal to information and utility.
Smartphones are already “bring your own computer” devices
The smartphone demonstrates all the characteristics of the cloud-provisioned services I’ve discussed. In many organizations bringing your own smartphone to work is standard practice. Often the employee purchases the device, gets vendor support, and pays for the service themselves (a large number of organizations reimburse the service cost). It’s a model that may be emulated with personal computers. (That is, if smartphones don’t evolve to become the personal computer. That’s another possible outcome.)
I believe fully embraced cloud computing makes BYOC entirely possible. There will continue to be resistance, and indeed there will be industries where security and control are so inflexible that BYOC will be difficult to attain. There will also be cultural issues. We’ll need to overcome the notion that providing a computer is an organizational responsibility. There was a time when most organizations provided salespeople with cars (some still do). Today we expect employees to provide and maintain their own cars, but we do provide mileage reimbursement when they’re used for business purposes. Could there be a similar model for employees who use their own computers? Today, for BYOC, some enterprises simply provide a stipend. What works and what doesn’t will need to be figured out.
So what now?
So what are the takeaways from all of this? First, BYOC is a real likelihood for many organizations, and it’s time for IT leadership to grapple with the implications. Second, the emergence of cloud computing will have unanticipated downstream impacts in organizations, and strategies to address those issues will need to be created. Lastly, we’ve already entered into a slow and painful convergence between smartphones, personal computers, consumer applications and devices, and cloud computing. This needs to be reconciled appropriately for each industry and organization. And it has to happen sooner rather than later.
When the dust settles, the provision of computing services in the enterprise will be entirely different. IT leadership had better be prepared.