FEATURED STORY

How the DevOps revolution informs software architecture

The O'Reilly Radar Podcast: Neal Ford on the changing role of software architects and the rise of microservices.

Image: hans christian hansen, architect, by seier+seier, on Flickr.

In this episode of the Radar Podcast, O’Reilly’s Mac Slocum sits down with Neal Ford, a software architect and meme wrangler at ThoughtWorks, to talk about the changing role of software architects. They met up at our recent Software Architecture Conference in Boston — if you missed the event, you can sign up to be notified when the Complete Video Compilation of all sessions and talks is available.

Slocum started the conversation with the basics: what, exactly, does a software architect do? Ford noted that there’s no straightforward answer; the role is really a “pastiche” of development, soft skills and negotiation, and solving business domain problems. He acknowledged that the role has historically been perceived negatively, as a non-coding, post-useful, ivory-tower deep thinker, but noted that this has been changing over the past five to 10 years as the role has evolved into real-world problem solving, as opposed to operating in abstractions:

“One of the problems in software, I think, is that you build everything on towers of abstractions, and so it’s very easy to get to the point where all you’re doing is playing with abstractions, and you don’t reify that back to the real world, and I think that’s the danger of this kind of ivory-tower architect. When you start looking at things like continuous delivery and continuous deployment, you have to take those operational concerns into account, and I think that is making the role of architect a lot more relevant now, because they are becoming much more involved in the entire software development ecosystem, not just the front edge of it.”
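
Ford’s point about operational concerns can be made concrete. Here is a minimal sketch, mine rather than Ford’s, of one hook an architect designing for continuous delivery now has to specify: a health-check endpoint that a deployment pipeline can probe before routing traffic to a freshly deployed instance. The port, path, and response shape are all illustrative assumptions.

    # Hypothetical health-check endpoint for a deployable service.
    # A deployment pipeline can poll GET /health and only shift traffic
    # to this instance once it reports "ok".
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class HealthHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/health":
                body = json.dumps({"status": "ok", "version": "1.0.0"}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()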

Read more…

Design’s return to artisan, at scale

The O'Reilly Radar Podcast: Matt Nish-Lapidus on design's circular evolution, and designing in the post-Industrial era.

Image: Architectura Ceiling, by Natesh Ramasamy, on Flickr.

In this week’s Radar Podcast episode, Jon Follett, editor of Designing for Emerging Technologies, chats with Matt Nish-Lapidus, partner and design director at Normative. Their discussion circles around the evolution of design, the characteristics of post-Industrial design, and the aesthetic intricacies of designing for networked systems. Note, too, that Nish-Lapidus will present a free webcast on these topics on March 24, 2015.

Post-Industrial design relationships

Nish-Lapidus shares an interesting take on the evolution of design, from pre-Industrial to post-Industrial times, through the lens of eyeglasses. He uses eyeglasses as a case study, he says, because they’re a piece of technology that has been in use across a longer span of history than most of the things we still use today. Nish-Lapidus walks us through the pre-Industrial era (medieval times through about the 1800s), when a single craftsperson designed one product for a single individual; through the Industrial era, when mass production took center stage; to our modern post-Industrial era, where embedded personalization capabilities are bringing design almost full circle, back to a focus on the individual user:

“Once we move into this post-Industrial era, which we’re kind of entering now, the relationship’s starting to shift again, and glasses are a really interesting example. We go from having a single pair of glasses made for a single person, hand-made usually, to a pair of glasses designed and then mass-manufactured for a countless number of people, to having a pair of glasses that expresses a lot of different things. On one hand, you have something like Google Glass, which is still mass-produced, but the glasses actually contain embedded functionality. Then we also have, with the emergence of 3D printing and small-scale manufacturing, a return to a little bit of that artisan, one-to-one relationship, where you could get something that someone’s made just for you.

“These post-Industrial objects are more of an expression of the networked world in which we now live. We [again] have a way of building relationships with individual craftspeople. We also have objects that exist in the network themselves, as a physical instantiation of the networked environment that we live in.”

Read more…

Big data’s impact on global agriculture

The O'Reilly Radar Podcast: Stewart Collis talks about making precision farming accessible and affordable for all farmers.

Image: inside, by Martin Fisc, on Flickr.

Stewart Collis, CTO and co-founder of AWhere, recently tweeted a link to a video by the University of Minnesota’s Institute on the Environment, Big Question: Feast or Famine? The video highlights the increasing complexity of feeding our rapidly growing population, and Collis noted its relation to his work at AWhere. I recently caught up with Collis to talk about our current global agriculture situation, the impact of big data on agriculture, and the work his company is doing to help address global agriculture problems.

The challenge, explained Collis, is two-fold: a growing population, expected to increase by another 2.4 billion people by 2050, and increasing weather variability, which affects growing seasons and farmers’ ability to produce food at the scale that population will require. “In the face of weather variability, climate change, and increasing temperatures … farmers no longer know when it’s going to rain,” he said, adding: “There’s only 34 growing seasons between now and [2050], so this is a problem we need to solve now.”
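
To see why variability, and not just averages, is the problem, here is a toy calculation; the rainfall figures are invented, and the method is illustrative rather than AWhere’s.

    # Toy illustration, not AWhere's model: how erratic have the rains
    # become at one location? Compare season-total rainfall across years.
    from statistics import mean, stdev

    # Hypothetical season-total rainfall (mm) for 2005-2014.
    season_rainfall = [612, 540, 700, 380, 655, 430, 720, 350, 590, 410]

    cv = stdev(season_rainfall) / mean(season_rainfall)  # coefficient of variation
    print(f"Mean season rainfall: {mean(season_rainfall):.0f} mm")
    print(f"Year-to-year variability (CV): {cv:.0%}")

    # A high CV means there is no longer a "typical" season to plan
    # planting around, and only ~34 seasons remain before 2050.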

Read more…

Public vs. private cloud: Price isn’t enough

The savings, relative to the risk, aren’t enough to justify a shift to the public cloud.

Image: Servers, by Paul Hammond, on Flickr.

This post was originally published on Limn This. The lightly edited version that follows is republished with permission.

Last October, Simon Wardley and I stood on a rainy sidewalk at 28th St. in New York City arguing politely (he’s British) about the future of cloud adoption. He argued, rightly, that the cost advantages from scale would be overwhelming compared to home-brew private clouds. He went on to argue, less certainly in my view, that this would lead inevitably to their wholesale and deep adoption across the enterprise market.

I think Simon bases his argument on something like the rational economic man theory of the enterprise. Or, more specifically, the rational economic chief financial officer (CFO). If the costs of a service provider are destined to be lower than the costs of internally operated alternatives, and your CFO is rational (most tend to be), then the conclusion is foregone.

And, of course, costs are going down just as predicted. Look at this post by Avi Deitcher: Does Amazon’s Web Services Pricing Follow Moore’s Law? I think the question posed in the title has a fairly obvious answer: no. Services aren’t just silicon; they include all manner of linear cost terms, like labor, so price decreases will almost certainly be slower than Moore’s Law. But his analysis of the costs of a modestly sized AWS deployment versus the in-house competition is really useful.

Not only is AWS’ price dropping fast (56% in three years), but it’s significantly cheaper than building and operating a platform in house. Avi does the math for 600 instances over three years and finds that the cost for AWS would be $1.1 million (I don’t think this number considers out-year price decreases) versus $2.3 million for DIY. Your mileage might vary, but these numbers are a nice starting point for further discussion.
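
It’s worth redoing that arithmetic. The sketch below is a back-of-the-envelope reconstruction; the flat hourly rate is an assumption chosen to land near Avi’s $1.1 million figure, not AWS’s actual price list.

    # Back-of-the-envelope reconstruction of the comparison above.
    # The hourly rate is an assumption chosen to land near the $1.1M
    # figure; real AWS pricing varies by instance type and region.
    INSTANCES = 600
    YEARS = 3
    HOURS = YEARS * 365 * 24              # 26,280 hours

    aws_hourly_rate = 0.07                # assumed $/instance-hour, held flat
    aws_total = INSTANCES * HOURS * aws_hourly_rate
    diy_total = 2_300_000                 # Avi's do-it-yourself estimate

    print(f"AWS: ${aws_total:,.0f}")      # ~$1.1 million
    print(f"DIY: ${diy_total:,.0f}")      # roughly 2x the AWS cost

    # And the 56% drop over three years compounds to roughly 24% per
    # year: real, but slower than the ~30-40%/year a Moore's Law
    # halving every 18-24 months would imply.
    annual_decline = 1 - (1 - 0.56) ** (1 / 3)
    print(f"Implied annual price decline: {annual_decline:.0%}")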

These results raise an interesting question: if the numbers are so compelling, why did Walmart just reveal that they are building a ginormous private cloud? Why would anyone?

Read more…

Bridging the gap in big data silos

The O'Reilly Radar Podcast: John Carnahan on holistic data analysis, engagement channels, and data science as an art form.

Image: Grain Storage Silos, by Kool Cats Photography, on Flickr.

In this Radar Podcast episode, I sit down with John Carnahan, executive vice president of data science at Ticketmaster. At our recent Strata + Hadoop World Conference in San Jose, CA, Carnahan presented a session on using data science and machine learning to improve ticket sales and marketing at Ticketmaster.

I took the opportunity to chat with Carnahan about Ticketmaster’s evolving approach to data analysis, the avenues of user engagement they’re investigating, and how his genetics background is informing his work in the big data space.

When Carnahan took the job at Ticketmaster about three years ago, his strategy focused on small, concrete tasks aimed at solving distinct, nagging problems: how do you address the large number of tickets that go unsold at an event, how do you market those undersold events to fans and get them engaged, and how do you stem abuse of ticket sales? This strategy has evolved, Carnahan explained, into a more holistic approach aimed at bridging the data silos within the company:

“We still want those concrete things, but we want to build a bed of data science assets that’s built on top of a company that’s been around almost 40 years and has a lot of data assets. How do we build the platform that will leverage those things into the future, beyond just those small niche products that we really want to build. We’re trying to bridge the gap between a lot of those products, too. Rather than think of each of those things as a vertical or a silo that’s trying to accomplish something, it’s how do you use something that you’ve built over here, over there to make that better?”
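
A toy version of the first of those concrete tasks, flagging events that look undersold for their point in the sales cycle, might look like the sketch below; the 90-day sales window and the 1%-of-capacity-per-day baseline are invented thresholds, not Ticketmaster’s.

    # Toy sketch, not Ticketmaster's system: flag events selling below
    # an assumed baseline pace so marketing can target fans in time.
    from dataclasses import dataclass

    @dataclass
    class Event:
        name: str
        capacity: int
        tickets_sold: int
        days_until_event: int

    def is_undersold(event: Event, daily_rate: float = 0.01) -> bool:
        """Expect ~1% of capacity to sell per day already elapsed in an
        assumed 90-day sales window; flag anything lagging that pace."""
        days_on_sale = max(90 - event.days_until_event, 1)
        expected = min(daily_rate * days_on_sale, 1.0) * event.capacity
        return event.tickets_sold < expected

    events = [
        Event("Arena show", 18_000, 4_200, days_until_event=30),
        Event("Club night", 1_200, 1_100, days_until_event=10),
    ]
    for e in events:
        print(e.name, "is undersold" if is_undersold(e) else "is on track")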

Read more…

Year Zero: Our life timelines begin

In the next decade, Year Zero will be how big data reaches everyone and will fundamentally change how we live.

Image: sneaky peek, by Heart of Oak.

Editor’s note: this post originally appeared on the author’s blog, Solve for Interesting. This lightly edited version is reprinted here with permission.

In 10 years, every human connected to the Internet will have a timeline. It will contain everything we’ve done since we started recording, and it will be the primary tool with which we administer our lives. This will fundamentally change how we live, love, work, and play. And we’ll look back at the time before our feed started — before Year Zero — as a huge, unknowable black hole.

This timeline — beginning for newborns at Year Zero — will be so intrinsic to life that it will quickly be taken for granted. Those without a timeline will be at a huge disadvantage. Those with a good one will have the tricks of a modern mentalist: perfect recall, suggestions for how to curry favor, an easier time maintaining friendships and influencing strangers, and unthinkably higher Dunbar numbers, because now every interaction has a history.

This isn’t just about lifelogging health data, like your Fitbit or Jawbone collects. It isn’t just about financial data, like Mint’s. It isn’t just your social graph or photo feed. It isn’t just about commuting data, like Waze or Maps. It’s about all of these together, along with the tools, user interfaces, and agents that make sense of it all.
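
What might such a timeline look like as a data structure? Here is a minimal sketch, with the sources, fields, and API invented purely for illustration:

    # Minimal sketch of a unified life timeline: heterogeneous feeds
    # (health, money, travel) merged into one time-ordered, queryable log.
    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass(order=True)
    class TimelineEvent:
        timestamp: datetime
        source: str = field(compare=False)    # e.g. "fitbit", "mint", "waze"
        kind: str = field(compare=False)      # e.g. "steps", "purchase", "commute"
        payload: dict = field(compare=False)

    class Timeline:
        def __init__(self):
            self.events: list[TimelineEvent] = []

        def ingest(self, event: TimelineEvent) -> None:
            self.events.append(event)
            self.events.sort()                # keep the feed in time order

        def recall(self, start: datetime, end: datetime) -> list[TimelineEvent]:
            """'Perfect recall': everything from every source in a window."""
            return [e for e in self.events if start <= e.timestamp <= end]

    tl = Timeline()
    tl.ingest(TimelineEvent(datetime(2015, 3, 1, 8), "waze", "commute", {"minutes": 42}))
    tl.ingest(TimelineEvent(datetime(2015, 3, 1, 7), "fitbit", "steps", {"count": 900}))
    print([e.kind for e in tl.recall(datetime(2015, 3, 1), datetime(2015, 3, 2))])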

Every decade or so, something from military or enterprise technology finds its way, bent and twisted, into the mass market. The client-server computer gave us the PC; wide-area networks gave us the consumer web; pagers and cell phones gave us mobile devices. In the next decade, Year Zero will be how big data reaches everyone.

Read more…
