Cloud computing saves L.A. millions in IT costs

Los Angeles CTO Randi Levin on why her city moved into Google's cloud.

Why did the City of Angels move to the cloud? Given that the Los Angeles city budget was constrained by the Great Recession, the driver was simple: cost savings. When the reality of a legacy IT infrastructure that couldn’t meet the needs of an increasingly mobile workforce was added to that driver, LA’s chief technology officer Randi Levin put out a request for proposal for the cloud.

“We’ve been cutting services,” she said. “We are now completely in what I would call ‘survival mode.’ Two or three years ago, we had a $116 million expense budget. Next year, it’ll be $80 million. It might even be less.”

After due diligence, Levin said her team recommended that the City of Los Angeles implement Google. “We predicted — and are on track — to save about $5 million over the next three years,” said Levin. “Those are hard dollar savings.”

Levin is looking ahead to where she can best use her staff. What are the important tasks? “It’s not running servers,” said Levin. “The city will get much more bang for their buck if we’re developing applications and websites, or making processes more efficient. As a general rule, we’re going to start moving out of that business and letting somebody else do it who can do it more efficiently.”

Tim O’Reilly talked with Levin at the recent Gov 2.0 Expo in Washington, D.C. Their discussion is embedded below:

After the jump, read more about Los Angeles’ decision to move into Google’s cloud.

Google’s contract with the City of Los Angeles for cloud computing services is notable in a number of ways, not least in that it’s available to the public for perusal. For excellent, exhaustive analysis of what’s in Google’s SaaS contract with the City of Los Angeles, make sure to read the Information Law Group’s series. Part two of their analysis of Google’s SaaS contract with LA is now online, in addition to the contract itself, which is embedded below. After the embed, you’ll find my interview with Levin.


Interview with Randi Levin, Los Angeles CTO

How has Los Angeles approached the process of IT transformation with yawning deficits and pressure to innovate?

I’ve been here three years now. When I got here, I made the rounds and interviewed everyone. We really started looking at our whole portfolio of work and saying if we’re trending downward, what techniques can we use to survive here? When you go from closer to $120 million down to $80 million in budget, how do we survive?

How did you approach the Request for Proposal (RFP) for cloud computing services?

The unanimous opinion of everyone was that they hated the email system, Novell GroupWise Version 7, which we hadn’t upgraded. That was partially the city’s fault. The mailbox size was small, and people had to archive or delete on a regular basis, which was a complaint. The calendar function wasn’t very good. When the iPhone came out, it didn’t work on the iPhone.

We decided that we would release an RFP and ask for either hosted on-site, hosted off-site or SaaS. We didn’t want to preclude putting it on-site and having somebody else run it, versus our own staff. We knew that the same old traditional model wasn’t going to work: “We buy it; we install it; we run it.” We didn’t have the money and we didn’t have the resources. On top of that, when you have a heavily unionized workforce where you can’t pick and choose your employees, you can’t go out to the open market and hire people at the levels that you need them. It’s very difficult to move forward, in any area of technology.

Our RFP got back 15 responses. We established a committee of folks from the different departments to help look at the responses and narrow it down. We had everybody bid, including Lotus Notes, Yahoo Mail, Hotmail, Google and Microsoft. A couple of them were partnered with different implementation partners. It was narrowed down to one Microsoft hosted option and two Google options, with two different implementation partners.

We started diving into the products and we started diving into the economics. The economics of the Google cloud solution were so much better than the economics of a hosted version that we said, “We can’t afford this.” We knew that our budget would be less the next year. After doing our due diligence, we decided as a cross-team to recommend that we implement Google.

How were those economic drivers measured?

We compared the cost of the products that were involved in running our current email, including maintenance costs, the cost of the machines, and the staff. They were all hard dollar savings. There was nothing about productivity. We weren’t comparing an apple to an apple here. We were getting a lot more functionality with Google than we were with the old system.

We did two analyses: one that was cash-based, and one that was ROI-based. The city wouldn’t accept the ROI analysis because when you’re so cash constrained, you only want to look at cash. It doesn’t mean that we won’t achieve some of the ROI that we stated, but in terms of what people have seen publicly, we’re going to save $5 million.

The challenge for government, as with heavily regulated industries, is when data moves out of your hands. How do you assess whether it’s being stored properly, or securely enough? Who audits that? Who certifies it? How often does it get checked?

I think part of what people need to realize is that anybody can put anything in any email system. People lose sight of that. If you’re not supposed to put a certain piece of data in an email, for whatever reason, whether it’s public safety or HIPAA or whatever, there’s nothing to prevent anybody from doing that.

When people get that argument mixed up with SAS 70, it’s not the right argument. The argument is: does your company, does your government, does your organization have a policy? Do people adhere to the policy? What are the ramifications if you don’t? Because somebody could put something personal in email and send it to Taiwan.

That goes in my “lessons learned” category from doing the implementation here. There’s a feeling that if your data is stored on-site, your data is under your desk. I call it the “hug a server concept.” It’s like the people that used to put the money under their mattress.

How are you addressing security concerns associated with cloud computing?

When you look at all of the various ways to encrypt and store with a lot of these companies, it’s getting much more sophisticated than it used to be. Just storing it on your own server in your own data center? Please don’t tell me that that’s very secure. There’s a myth about that, that you have more control. Some organizations do it a lot better than others. If you’re looking at a whole cross-section of industries and governments, in general, people don’t do it that well.

When you look at the hard numbers of cost, you enter into a useful discussion around risk avoidance versus risk management. What are the opportunity costs here? What is the potential to be able to do a lot more, versus the potential for a significant data breach?

The hard thing about this subject is that most people can’t go on the record to discuss this. The problem is, because of that, you have a lot more myths about the security of people’s existing environments. The discussion’s always about the security of the SaaS provider, or Google or Microsoft. You can’t have a candid conversation about what you had or have, versus what you’re going to have. Any entity that does that actually opens the door, in that period of time, for somebody to come in and hack them more. It’s a tricky subject.

What improvements do you expect to realize through this move to the cloud?

I want us to manage somebody else managing servers because we’re in such a severe financial crisis that I don’t want my staff having to worry about any infrastructure. I don’t want to spend any time worrying about virtualizing servers. I want to make sure I have good performance and enough disk storage.

Nobody will end up with a completely SaaS model, at least not in the next couple of years, but what I do see is that most organizations are going to end up in a hybrid world where you have some on-site infrastructure, you have some hosted infrastructure, and you have some SaaS, simply because of the fact that everybody’s trying to do more with less.

There may be certain applications that you feel you need to keep on-site and that you need to run yourselves. I’m sure there are good reasons for that, but there’s a lot that you don’t have to. I think that many CIOs are starting to say: “If I had 100 people, great. But I only have 80. I want to do the most value-add for the 80 and not have 20 of them running machines.”

Based upon this experience, what advice would you give to other IT staff and municipalities?

Do your own due diligence, first of all. You can talk to people like me, but do your own due diligence. Make sure you know your requirements. We thought we knew the requirements and relied on people that we thought knew them, but in some cases they didn’t know what they were. Make sure that you have a contract that has service levels. SAS 70 contracts are new, but they are out there. Make sure that you are also comparing against your current service levels.

Gov 2.0 Expo: Cloud Computing in Los Angeles

At last month’s Gov 2.0 Expo, Tim O’Reilly talked with Levin and Dave Girouard, Google Enterprise president, about Los Angeles’ adoption of cloud computing. Video of their conversation is embedded below:

