"O’Reilly Radar Podcast" entries

How the DevOps revolution informs software architecture

The O'Reilly Radar Podcast: Neal Ford on the changing role of software architects and the rise of microservices.

In this episode of the Radar Podcast, O’Reilly’s Mac Slocum sits down with Neal Ford, a software architect and meme wrangler at ThoughtWorks, to talk about the changing role of software architects. They met up at our recent Software Architecture Conference in Boston — if you missed the event, you can sign up to be notified when the Complete Video Compilation of all sessions and talks is available.

Slocum started the conversation with the basics: what, exactly, does a software architect do? Ford noted that there’s no straightforward answer, but that the role really is a “pastiche” of development, soft skills and negotiation, and solving business domain problems. He acknowledged that the role has historically been perceived, negatively, as a non-coding, post-useful, ivory-tower deep thinker, but noted that this has been changing over the past five to 10 years as the role has evolved toward real-world problem solving, as opposed to operating in abstractions:

“One of the problems in software, I think, is that you build everything on towers of abstractions, and so it’s very easy to get to the point where all you’re doing is playing with abstractions, and you don’t reify that back to the real world, and I think that’s the danger of this kind of ivory-tower architect. When you start looking at things like continuous delivery and continuous deployment, you have to take those operational concerns into account, and I think that is making the role of architect a lot more relevant now, because they are becoming much more involved in the entire software development ecosystem, not just the front edge of it.”

Read more…

Design’s return to artisan, at scale

The O'Reilly Radar Podcast: Matt Nish-Lapidus on design's circular evolution, and designing in the post-Industrial era.

In this week’s Radar Podcast episode, Jon Follett, editor of Designing for Emerging Technologies, chats with Matt Nish-Lapidus, partner and design director at Normative. Their discussion circles around the evolution of design, the characteristics of post-Industrial design, and the aesthetic intricacies of designing in networked systems. Also note that Nish-Lapidus will present a free webcast on these topics on March 24, 2015.

Post-Industrial design relationships

Nish-Lapidus shares an interesting take on design’s evolution, from pre-Industrial to post-Industrial times, through the lens of eyeglasses. He uses eyeglasses as a case study, he says, because they’re a piece of technology that has been in use across a broad span of history, longer than many of the things we still use today. Nish-Lapidus walks us through the pre-Industrial era — so, Medieval times through about the 1800s — when a single craftsperson designed one product for a single individual; through the Industrial era, when mass production took center stage; to our modern post-Industrial era, where embedded personalization capabilities are bringing design almost full circle, back to a focus on the individual user:

“Once we move into this post-Industrial era, which we’re kind of entering now, the relationship’s starting to shift again, and glasses are a really interesting example. We go from having a single pair of glasses made for a single person, hand-made usually, to a pair of glasses designed and then mass-manufactured for a countless number of people, to having a pair of glasses that expresses a lot of different things. On one hand, you have something like Google Glass, which is still mass-produced, but the glasses actually contain embedded functionality. Then we also have, with the emergence of 3D printing and small-scale manufacturing, a return to a little bit of that artisan, one-to-one relationship, where you could get something that someone’s made just for you.

“These post-Industrial objects are more of an expression of the networked world in which we now live. We [again] have a way of building relationships with individual crafts-people. We also have objects that exist in the network themselves, as a physical instantiation of the networked environment that we live in.”

Read more…

Big data’s impact on global agriculture

The O'Reilly Radar Podcast: Stewart Collis talks about making precision farming accessible and affordable for all farmers.

Stewart Collis, CTO and co-founder of AWhere, recently tweeted a link to a video by the University of Minnesota’s Institute on the Environment, Big Question: Feast or Famine? The video highlights the increasing complexity of feeding our rapidly growing population, and Collis noted its relation to his work at AWhere. I recently caught up with Collis to talk about our current global agriculture situation, the impact of big data on agriculture, and the work his company is doing to help address global agriculture problems.

The challenge, Collis explained, is twofold: our growing population, which is expected to increase by another 2.4 billion people by 2050, and increasing weather variability, which is affecting growing seasons and farmers’ ability to produce at the scale needed to feed that population. “In the face of weather variability, climate change, and increasing temperatures … farmers no longer know when it’s going to rain,” he said, adding: “There’s only 34 growing seasons between now and [2050], so this is a problem we need to solve now.”

Read more…

Bridging the gap in big data silos

The O'Reilly Radar Podcast: John Carnahan on holistic data analysis, engagement channels, and data science as an art form.

In this Radar Podcast episode, I sit down with John Carnahan, executive vice president of data science at Ticketmaster. At our recent Strata + Hadoop World Conference in San Jose, CA, Carnahan presented a session on using data science and machine learning to improve ticket sales and marketing at Ticketmaster.

I took the opportunity to chat with Carnahan about Ticketmaster’s evolving approach to data analysis, the avenues of user engagement they’re investigating, and how his genetics background is informing his work in the big data space.

When Carnahan took the job at Ticketmaster about three years ago, his strategy focused on small, concrete tasks aimed at solving distinct, nagging problems: how to deal with the large numbers of tickets that go unsold at an event, how to engage fans and market those undersold events to them, and how to stem abuse of ticket sales. This strategy has evolved, Carnahan explained, into a more holistic approach aimed at bridging the data silos within the company:

“We still want those concrete things, but we want to build a bed of data science assets that’s built on top of a company that’s been around almost 40 years and has a lot of data assets. How do we build the platform that will leverage those things into the future, beyond just those small niche products that we really want to build. We’re trying to bridge the gap between a lot of those products, too. Rather than think of each of those things as a vertical or a silo that’s trying to accomplish something, it’s how do you use something that you’ve built over here, over there to make that better?”

Read more…

Design to reflect human values

The O'Reilly Radar Podcast: Martin Charlier on industrial and interaction design, reflecting societal values, and unified visions.

Editor’s note: Martin Charlier will present a session, Prototyping User Experiences for Connected Products, at the O’Reilly Solid Conference, June 23 to 25, 2015, in San Francisco. For more on the program and information on registration, visit the Solid website.

Designing for the Internet of Things is requiring designers and engineers to expand the boundaries of their traditionally defined roles. In this Radar Podcast episode, O’Reilly’s Mary Treseler sat down with Martin Charlier, an independent design consultant and co-founder at Rain Cloud, to discuss the future of interfaces and the increasing need to merge industrial and interaction design in the era of the Internet of Things.

Charlier stressed the importance of embracing the symbiotic nature of interaction design and service design:

“How I got into Internet of Things is interesting. My degree from Ravensbourne was in a very progressive design course that looked at product, interaction, and service design as one course. For us, it was pretty natural to think of products or services in a very open way. Whether they are connected or not connected didn’t really matter too much because it was basically understanding that technology is there to build almost anything. It’s really about how you design with that in mind.

“When I was working in industrial design, it became really clear for me how important that is. Specifically, I remember one project working on a built-in oven … In this project, we specifically couldn’t change how you would interact with it. The user interface was already defined, and our task was to define how it looked. It became clear to me that I don’t want to exclude any one area, and it feels really unnatural to design a product but only worry about what it looks like and let somebody else worry about how it’s operated, or vice versa. Products in today’s world, especially, need to be thought about from all of these angles. You can’t really design a coffee maker anymore without thinking about the service that it might plug into or the systems that it connects to. You have to think about all of these things at the same time.”

Read more…

More than a currency, bitcoin is an enabling technology

The O'Reilly Radar Podcast: Balaji Srinivasan on the bigger picture of bitcoin, liquid markets, and the future of regulation.

The promise of bitcoin and blockchain extends well beyond its potential disruption as a currency. In this Radar Podcast episode, Balaji Srinivasan, a general partner at Andreessen Horowitz, explains how bitcoin is an enabling technology and why it’s like the Internet, in that “bitcoin will do for value transfer what the Internet did for communication — make it programmable.” I met up with Srinivasan at our recent O’Reilly Radar Summit: Bitcoin & the Blockchain, where he was speaking — you can see his talk, and all the others from the event, in the complete video compilation now available.

The bigger picture of bitcoin

More than just a digital currency, bitcoin can serve as an instigator for new markets. Srinivasan explained the potential for everything to become a liquid market:

“Bitcoin is a platform for programmable money, programmable interchange, or anything of value. That’s very general. People have probably heard at this point about how you can use a blockchain to trade — in theory — stocks, or houses, or other kinds of things, but programmable value transfer is even bigger than just trading things which we know already exist.

“One analogy I would give is in 1988, it was not possible to find information on anything instantly. Today, most of the time it is. From your iPhone or your Android phone, you can google pretty much anything. In the same way, I think what bitcoin is going to mean, is markets in everything. That is, everything will have a price on it — everything will be a liquid market. You’ll be able to buy and sell almost anything. Where today the fixed costs of setting up such a market is too high for anything other than things that are fairly valuable, tomorrow it’ll be possible for even images or things you would not even think of normally buying and selling.”

Read more…

Democratizing biotech research

The O'Reilly Radar Podcast: DJ Kleinbaum on lab automation, virtual lab services, and tackling the challenges of reproducibility.

The convergence of software and hardware and the growing ubiquity of the Internet of Things are affecting industries across the board, and biotech labs are no exception. For this Radar Podcast episode, I chatted with DJ Kleinbaum, co-founder of Emerald Therapeutics, about lab automation, the launch of Emerald Cloud Laboratory, and the problem of reproducibility.

Kleinbaum and his co-founder Brian Frezza started Emerald Therapeutics to research cures for persistent viral infections. They didn’t set out to spin up a second company, but their efforts to automate their own lab processes proved so fruitful, they decided to launch a virtual lab-as-a-service business, Emerald Cloud Laboratory. Kleinbaum explained:

“When Brian and I started the company right out of graduate school, we had this platform anti-viral technology, which the company is still working on, but because we were two freshly minted nobody Ph.D.s, we were not going to be able to raise the traditional $20 or $30 million that platform plays raise in the biotech space.

“We knew that we had to be much more efficient with the money we were able to raise. Brian and I both have backgrounds in computer science. So, from the beginning, we were trying to automate every experiment that our scientists ran, such that every experiment was just push a button, walk away. It was all done with process automation and robotics. That way, our scientists would be able to be much more efficient than your average bench chemist or biologist at a biotech company.

“After building that system internally for three years, we looked at it and realized that every aspect of a life sciences laboratory had been encapsulated in both hardware and software, and that that was too valuable a tool to just keep internally at Emerald for our own research efforts. Around this time last year, we decided that we wanted to offer that as a service, that other scientists, companies, and researchers could use to run their experiments as well.”

Read more…
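
As a purely hypothetical sketch of what “every experiment was just push a button, walk away” might look like in software: below, an experiment is captured as a declarative protocol and dispatched step by step. None of these names come from Emerald’s actual system, which the episode doesn’t detail; the point is simply that a protocol-as-data runs identically on every submission, which is also the reproducibility argument in miniature.

```python
from dataclasses import dataclass

@dataclass
class Step:
    action: str   # e.g., "transfer", "incubate", "measure_absorbance"
    params: dict

@dataclass
class Protocol:
    name: str
    steps: list[Step]

def run(protocol: Protocol) -> None:
    # Stand-in for handing each step to instrument-control software/robotics.
    for i, step in enumerate(protocol.steps, start=1):
        print(f"[{protocol.name}] step {i}: {step.action} {step.params}")

# A toy assay: the same Protocol object executes identically every time
# it is submitted -- no hand-pipetting variability between runs.
assay = Protocol("elisa_v1", [
    Step("transfer", {"source": "sample_plate", "dest": "assay_plate", "ul": 50}),
    Step("incubate", {"minutes": 60, "celsius": 37}),
    Step("measure_absorbance", {"wavelength_nm": 450}),
])
run(assay)
```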

A human-centered approach to data-driven design

The O'Reilly Radar Podcast: Arianna McClain on humanizing data-driven design, and Dirk Knemeyer on design in emerging tech.

This week on the O’Reilly Radar Podcast, O’Reilly’s Roger Magoulas talks with Arianna McClain, a senior hybrid design researcher at IDEO, about storytelling through data; the interdependent nature of qualitative and quantitative data; and the human-centered, data-driven design approach at IDEO.

In their interview, Magoulas noted that in our research at O’Reilly, we’ve been talking a lot about the importance of the social science design element in getting the most out of data. McClain emphasized the importance of storytelling through data at IDEO and described IDEO’s human-centered approach to data-driven design:

“IDEO really believes in staying and remaining human-centered throughout the data journey. Starting off with, how might we measure something, how might we measure a behavior. We don’t sit in a room and come up with an algorithm or come up with a question. We start by talking to people. … We’re trying to build measures and survey questions to understand at scale how people make decisions. … IDEO remains data-driven to how we analyze and synthesize our findings. When we’re given a large data set, we don’t analyze it and write a report and give it to people and say, ‘This is the direction we think you should go.’

“Instead, we look at segmentations in the data, and stories in the data, and how the data clusters. Then we go back, and we try to find people who are representative of that cluster or that segmentation. The segmentations, again, are not based on demographic variables. They are based on needs and insights that we heard in our qualitative research. … What we’ve recognized is that something that seems so clear in the analysis is often very nuanced, and it can inform our design.”
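
As a rough illustration of the workflow McClain describes, the sketch below clusters invented survey responses on needs-based measures, then pulls out the respondent nearest each cluster center as a candidate for qualitative follow-up. The data, the feature count, and the choice of k-means are all assumptions; the episode doesn’t specify IDEO’s tooling.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=0)
# Rows = respondents; columns = hypothetical needs-based measures
# (answers to "how do you decide?" questions scaled to [0, 1]),
# deliberately not demographic variables.
responses = rng.random((200, 4))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(responses)

# For each segment, find the person closest to its center: someone to
# go back and re-interview so the numbers stay attached to a real story.
for k, center in enumerate(kmeans.cluster_centers_):
    members = np.where(labels == k)[0]
    rep = members[np.argmin(np.linalg.norm(responses[members] - center, axis=1))]
    print(f"segment {k}: {len(members)} respondents, representative #{rep}")
```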

Read more…

Bringing an end to synthetic biology’s semantic debate

The O'Reilly Radar Podcast: Tim Gardner on the synthetic biology landscape, lab automation, and the problem of reproducibility.

Editor’s note: this podcast is part of our investigation into synthetic biology and bioengineering. For more on these topics, download a free copy of the new edition of BioCoder, our quarterly publication covering the biological revolution. Free downloads for all past editions are also available.

Tim Gardner, founder of Riffyn, has recently been working with the Synthetic Biology Working Group of the European Commission Scientific Committees to define synthetic biology, assess risk-assessment methodologies, and describe research areas. I caught up with Gardner for this Radar Podcast episode to talk about the synthetic biology landscape and the issues in research and experimentation he’s addressing at Riffyn.

Defining synthetic biology

Among the areas of investigation discussed at the EU’s Synthetic Biology Working Group was defining synthetic biology. The official definition reads: “SynBio is the application of science, technology and engineering to facilitate and accelerate the design, manufacture and/or modification of genetic materials in living organisms.” Gardner talked about the significance of the definition:

“The operative part there is the ‘design, manufacture, modification of genetic materials in living organisms.’ Biotechnologies that don’t involve genetic manipulation would not be considered synthetic biology, and more or less anything else that is manipulating genetic materials in living organisms is included. That’s important because it gets rid of this semantic debate of, ‘this is synthetic biology, that’s synthetic biology, this isn’t, that’s not,’ that often crops up when you have, say, a protein engineer talking to someone else who is working on gene circuits, and someone will claim the protein engineer is not a synthetic biologist because they’re not working with parts libraries or modularity or whatnot, and the boundaries between the two are almost indistinguishable from a practical standpoint. We’ve wrapped it all together and said, ‘It’s basically advances in the capabilities of genetic engineering. That’s what synthetic biology is.’”

Read more…

Security comes from evolution, not revolution

The O'Reilly Radar Podcast: Mike Belshe on making bitcoin secure and easy enough for the mainstream.

Editor’s note: you can subscribe to the O’Reilly Radar Podcast through iTunes, SoundCloud, or directly through our podcast’s RSS feed.

In this week’s O’Reilly Radar Podcast episode, I caught up with Mike Belshe, CTO and co-founder of BitGo, a company that has developed a multi-signature wallet that works with bitcoin. Belshe talks about the security issues addressed by multi-signature wallets, how the technology works, and the challenges in bringing cryptocurrencies mainstream. We also talk about his journey into the bitcoin world, and he chimes in on what money will look like in the future. Belshe will address the topics of security and multi-signature technology at our upcoming Bitcoin & the Blockchain Radar Summit on January 27, 2015, in San Francisco — for more on the program and registration information, visit our Bitcoin & the Blockchain website.

Multi-signature technology is exactly what it sounds like: instead of authorizing bitcoin transactions with a single signature from a single key (the traditional method), it requires signatures from multiple keys, which can be held on separate machines, in any combination. The concept was initially developed as a defense against malware. Belshe explains:

“I’m fully convinced that the folks who have been writing various types of malware that steal fairly trivial identity information — logins and passwords that they sell super cheap — they are retooling their viruses, their scanners, their key loggers for bitcoin. We’ve seen evidence of that over the last 12 months, for sure. Without multi-signature, if you do a bitcoin transaction on a machine that’s got any of this bad stuff on it, you’re pretty much toast. Multi-signature was my hope to fix that. What we do is make one signature happen on the server machine, one signature happen on the client machine, your home machine. That way the attacker has to actually compromise two totally different systems in order to steal your bitcoin. That’s what multi-signature is about.”
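
To make the mechanics concrete, here is a minimal sketch of an M-of-N signature check. It is not BitGo’s or Bitcoin’s implementation: real multisig uses ECDSA keys and Bitcoin script, while this toy simulates signatures with HMAC so it runs on the Python standard library alone, and all names and secrets are invented.

```python
import hashlib
import hmac

def sign(secret: bytes, tx: bytes) -> bytes:
    # Stand-in for an ECDSA signature over the transaction bytes.
    return hmac.new(secret, tx, hashlib.sha256).digest()

def verify(secret: bytes, tx: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(secret, tx), signature)

def multisig_valid(tx: bytes, signatures: dict[str, bytes],
                   keys: dict[str, bytes], threshold: int) -> bool:
    # The transaction clears only if `threshold` distinct keys signed it.
    valid = sum(1 for name, sig in signatures.items()
                if name in keys and verify(keys[name], tx, sig))
    return valid >= threshold

# The 2-of-2 setup from the quote: one key on the server, one on the
# customer's home machine.
keys = {"server": b"server-secret", "client": b"client-secret"}
tx = b"pay 1.5 BTC to <address>"

# Malware that owns the client machine still can't forge the server's
# signature, so a client-only attempt is rejected.
client_only = {"client": sign(keys["client"], tx)}
print(multisig_valid(tx, client_only, keys, threshold=2))   # False

both = {name: sign(secret, tx) for name, secret in keys.items()}
print(multisig_valid(tx, both, keys, threshold=2))          # True
```

The security property is exactly the one Belshe describes: an attacker has to compromise two independent systems, not one, before a transaction can be authorized.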

Read more…
