FEATURED STORY

A human-centered approach to data-driven design

The O’Reilly Radar Podcast: Arianna McClain on humanizing data-driven design, and Dirk Knemeyer on design in emerging tech.

This week on the O’Reilly Radar Podcast, O’Reilly’s Roger Magoulas talks with Arianna McClain, a senior hybrid design researcher at IDEO, about storytelling through data; the interdependent nature of qualitative and quantitative data; and the human-centered, data-driven design approach at IDEO.

Subscribe to the O’Reilly Radar Podcast

iTunes, SoundCloud, RSS

In their interview, Magoulas noted that in our research at O’Reilly, we’ve been talking a lot about the importance of the social science design element in getting the most out of data. McClain emphasized the importance of storytelling through data at IDEO and described IDEO’s human-centered approach to data-driven design:

“IDEO really believes in staying and remaining human-centered throughout the data journey. Starting off with, how might we measure something, how might we measure a behavior. We don’t sit in a room and come up with an algorithm or come up with a question. We start by talking to people. … We’re trying to build measures and survey questions to understand at scale how people make decisions. … IDEO remains data-driven in how we analyze and synthesize our findings. When we’re given a large data set, we don’t analyze it and write a report and give it to people and say, ‘This is the direction we think you should go.’

“Instead, we look at segmentations in the data, and stories in the data, and how the data clusters. Then we go back, and we try to find people who are representative of that cluster or that segmentation. The segmentations, again, are not based on demographic variables. They are based on needs and insights that we heard in our qualitative research. … What we’ve recognized is that something that seems so clear in the analysis is often very nuanced, and it can inform our design.”
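McClain’s cluster-then-interview loop is easy to picture in code. Below is a hypothetical sketch, not IDEO’s actual method: it clusters respondents on need-related survey answers (no demographic fields) with k-means, then pulls the respondents nearest each cluster center as candidates for follow-up conversations. All data and parameters are invented for illustration.

```python
# Hypothetical sketch of need-based segmentation, not IDEO's actual pipeline.
# Respondents are clustered on need-related survey answers (no demographics),
# then the people nearest each cluster center become interview candidates.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows: 200 respondents; columns: 1-5 agreement scores on 8 need questions.
# Invented data purely for illustration.
responses = rng.integers(1, 6, size=(200, 8)).astype(float)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(responses)

for cluster_id in range(4):
    members = np.flatnonzero(kmeans.labels_ == cluster_id)
    # Distance to the cluster center picks the most "representative" people.
    dists = np.linalg.norm(
        responses[members] - kmeans.cluster_centers_[cluster_id], axis=1
    )
    interviewees = members[np.argsort(dists)[:3]]
    print(f"cluster {cluster_id}: {members.size} people; interview {interviewees}")
```

The detail that matches the quote is clustering on needs-style questions rather than demographic variables; which algorithm does the clustering matters less than going back to talk to the people each cluster represents.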

Read more…

Bitcoin is just the first app to use blockchain technology

Understanding the value of the blockchain above and beyond bitcoin.

Editor’s note: Lorne Lantz is a program co-chair for our O’Reilly Radar Summit: Bitcoin & the Blockchain on January 27, 2015, in San Francisco. For more on the program and for registration information, visit the Bitcoin & the Blockchain event website.

I remember the first time I heard about bitcoin. It was June 2012, and I was invited to a bitcoin meetup. The whole time I was sitting there, I thought these people were just a bunch of computer geeks playing around with nerd money.

At the same time, I felt excited about the possibilities. If what the bitcoin believers were saying was true, it could become something very big. When I took a closer look, I realized why it could be so groundbreaking: decentralization.

Unlike other currencies and payment networks, bitcoin is not controlled by a bank, government, or financial institution. Instead, thousands of computers around the world verify transactions and manage a global decentralized ledger. This innovative technology is called the blockchain, and it provides a unique pathway that allows — for the first time — many computers that don’t trust each other to achieve consensus. In bitcoin’s case, they are achieving consensus on updates to the global ledger. Read more…
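To unpack what a “global decentralized ledger” is made of at the data-structure level, here is a toy sketch in Python. It is a hedged illustration only: it shows hash-linked blocks, and leaves out proof-of-work, networking, and everything that actually produces consensus among untrusting machines. All names are invented.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash a block's contents deterministically (stable JSON, sorted keys)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(transactions, prev_hash):
    """Create a block that commits to its transactions and its predecessor."""
    return {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }

def verify_chain(chain):
    """Valid only if each block references the hash of the one before it."""
    return all(
        curr["prev_hash"] == block_hash(prev)
        for prev, curr in zip(chain, chain[1:])
    )

genesis = make_block(["coinbase -> alice: 50"], prev_hash="0" * 64)
chain = [genesis, make_block(["alice -> bob: 10"], prev_hash=block_hash(genesis))]

print(verify_chain(chain))                # True
chain[0]["transactions"][0] = "coinbase -> mallory: 50"
print(verify_chain(chain))                # False: tampering breaks the links
```

The property the sketch demonstrates is the one the consensus machinery protects: each block commits to the hash of its predecessor, so rewriting any historical transaction invalidates every later link and is immediately detectable.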

Blockchain scalability

A look at the stumbling blocks to blockchain scalability and some high-level technical solutions.

Author note: Vitalik Buterin contributed to this article.

Editor’s note: Kieren James-Lubin is a program co-chair for our O’Reilly Radar Summit: Bitcoin & the Blockchain on January 27, 2015, in San Francisco. For more on the program and for registration information, visit the Bitcoin & the Blockchain event website.

In a talk at CoinJar last fall, well-known bitcoin expert Andreas Antonopoulos made the following comment:

“I have no worries that bitcoin can scale, and the simple reason for that is that I know that IPv4 can’t, and yet I use it every day.”

The phrase “blockchain scalability” comes up often in technical discussions of the bitcoin protocol. Will the requirement of recording every bitcoin transaction in the blockchain compromise its security (because fewer users will keep a copy of the whole blockchain) or its ability to handle a great number of transactions (because new blocks, on which transactions are recorded, are only produced at limited intervals)? In this article, we’ll explore several meanings of “blockchain scalability” and some high-level technical solutions to the issue.

The three main stumbling blocks to blockchain scalability are:

  1. The tendency toward centralization as the blockchain grows: the larger the blockchain, the greater the storage, bandwidth, and computational power required of “full nodes” in the network, raising the risk of severe centralization if the blockchain becomes so large that only a few nodes can process a block.
  2. The bitcoin-specific issue that the blockchain has a built-in hard limit of 1 megabyte per block (with one block produced roughly every 10 minutes), and removing this limit requires a “hard fork” (i.e., a backward-incompatible change) to the bitcoin protocol. As the back-of-the-envelope sketch after this list shows, this caps throughput at only a handful of transactions per second.
  3. The high processing fees currently paid for bitcoin transactions, and the potential for those fees to increase as the network grows. We won’t discuss this too much, but see here for more detail.
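To make the second point concrete, here is a back-of-the-envelope throughput calculation. The 1 MB block limit and the roughly 10-minute block interval are from the article; the ~250-byte average transaction size is an assumed figure used only for illustration.

```python
# Back-of-the-envelope bitcoin throughput under the 1 MB block limit.
# Block size and interval are from the article; the average transaction
# size of ~250 bytes is an assumption for illustration only.
BLOCK_SIZE_BYTES = 1_000_000       # built-in hard limit per block
BLOCK_INTERVAL_SECONDS = 600       # one block roughly every 10 minutes
AVG_TX_SIZE_BYTES = 250            # assumed typical transaction size

tx_per_block = BLOCK_SIZE_BYTES / AVG_TX_SIZE_BYTES
tx_per_second = tx_per_block / BLOCK_INTERVAL_SECONDS

print(f"{tx_per_block:.0f} transactions per block")    # ~4000
print(f"{tx_per_second:.1f} transactions per second")  # ~6.7
```

Under those assumptions the network tops out at roughly 7 transactions per second, the figure usually quoted in these discussions; raising the 1 MB limit raises that ceiling, but at the cost of the centralization pressure described in the first point.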

We’ll consider the first two of these issues in detail. Read more…

Bringing an end to synthetic biology’s semantic debate

The O’Reilly Radar Podcast: Tim Gardner on the synthetic biology landscape, lab automation, and the problem of reproducibility.

Editor’s note: this podcast is part of our investigation into synthetic biology and bioengineering. For more on these topics, download a free copy of the new edition of BioCoder, our quarterly publication covering the biological revolution. Free downloads for all past editions are also available.

Tim Gardner, founder of Riffyn, has recently been working with the Synthetic Biology Working Group of the European Commission Scientific Committees to define synthetic biology, assess risk-assessment methodologies, and describe research areas. I caught up with Gardner for this Radar Podcast episode to talk about the synthetic biology landscape and the issues in research and experimentation he’s addressing at Riffyn.

Defining synthetic biology

Among the areas of investigation discussed at the EU’s Synthetic Biology Working Group was defining synthetic biology. The official definition reads: “SynBio is the application of science, technology and engineering to facilitate and accelerate the design, manufacture and/or modification of genetic materials in living organisms.” Gardner talked about the significance of the definition:

“The operative part there is the ‘design, manufacture, modification of genetic materials in living organisms.’ Biotechnologies that don’t involve genetic manipulation would not be considered synthetic biology, and more or less anything else that is manipulating genetic materials in living organisms is included. That’s important because it gets rid of this semantic debate of, ‘this is synthetic biology, that’s synthetic biology, this isn’t, that’s not,’ that often crops up when you have, say, a protein engineer talking to someone else who is working on gene circuits, and someone will claim the protein engineer is not a synthetic biologist because they’re not working with parts libraries or modularity or whatnot, and the boundaries between the two are almost indistinguishable from a practical standpoint. We’ve wrapped it all together and said, ‘It’s basically advances in the capabilities of genetic engineering. That’s what synthetic biology is.'”

Read more…

The 3Ps of the blockchain: platforms, programs and protocols

There is a burgeoning landscape around the blockchain’s decentralized consensus protocol technologies.

Although it may be early to baptize new buzz lingo like “Blockchain as a Service” (BaaS) or “Blockchain as a Platform” (BaaP), there is a burgeoning landscape of various implementations and activity in and around the blockchain’s decentralized consensus protocol technologies.

I’ve already covered the blockchain’s sweet spot as a development platform in “Understanding the blockchain,” so it is no surprise that its landscape will be made up of platforms, protocols, and (smart) programs.

Breaking up the bitcoin-blockchain paradigm

In a perfect world, we would have a single blockchain and a single cryptocurrency. But that doesn’t seem to be in the cards, whether it is technically feasible or not. Although wide-scale adoption and a critical mass of users aren’t there yet, the market is signaling for a diversification of choices, some based on the bitcoin currency and its blockchain protocol, and others not. Read more…

Beyond lab folklore and mythology

What the future of science will look like if we’re bold enough to look beyond centuries-old models.

Editor’s note: this post is part of our ongoing investigation into synthetic biology and bioengineering. For more on these areas, download the latest free edition of BioCoder.

Over the last six months, I’ve had a number of conversations about lab practice. In one, Tim Gardner of Riffyn told me about a gene transformation experiment he did in grad school. As he was new to the lab, he asked two more experienced scientists for their protocol: one said it must be done at exactly 42°C for 45 seconds; the other said exactly 37°C for 90 seconds. When he ran the experiment, Tim discovered that the temperature actually didn’t matter much. A broad range of temperatures and times would work.

In an unrelated conversation, DJ Kleinbaum of Emerald Cloud Lab told me about students who would only use their “lucky machine” in their work. Why, given a choice of lab equipment, did one of two apparently identical machines give “good” results for some experiment, while the other one didn’t? Nobody knew. Perhaps it is the tubing that connects the machine to the rest of the experiment; perhaps it is some valve somewhere; perhaps it is some quirk of the machine’s calibration.

The more people I talked to, the more stories I heard: labs where the experimental protocols weren’t written down but were handed down from mentor to student. Labs where there was a shared common knowledge of how to do things, but where that shared culture never made it outside, not even to the lab down the hall. Nobody writes down or publishes stuff that’s “obvious” or that “everyone knows.” To someone more familiar with literature than with biology labs, this behavior was immediately recognizable: we’re in the land of mythology, not science. Each lab has its own ritualized behavior that “works.” Whether it’s protocols, lucky machines, or common knowledge that’s picked up by every student in the lab (but which might not be the same from lab to lab), the process of doing science is an odd mixture of rigor and folklore. Everybody knows that you use 42°C for 45 seconds, but nobody really knows why. It’s just what you do.

Despite all of this, we’ve gotten fairly good at doing science. But to get even better, we have to go beyond mythology and folklore. And getting beyond folklore requires change: changes in how we record data, changes in how we describe experiments, and perhaps most importantly, changes in how we publish results. Read more…
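As a concrete (and hypothetical) example of what “changes in how we record data” and “changes in how we describe experiments” might mean, here is a sketch of the heat-shock protocol from the opening anecdote captured as structured, machine-readable data instead of folklore. The schema and every field name in it are invented for illustration; this is not Riffyn’s or any lab’s actual format.

```python
# A hypothetical sketch of recording a protocol as structured data rather
# than folklore, based on the heat-shock example above. Every field name
# and value here is invented for illustration, not any lab's real schema.
import json

protocol = {
    "name": "heat-shock transformation",
    "version": 3,
    "steps": [
        {"action": "incubate on ice", "duration_s": 1800},
        {"action": "heat shock", "temperature_c": 42, "duration_s": 45,
         "note": "a broad range of temperatures and times reportedly works"},
        {"action": "return to ice", "duration_s": 120},
    ],
    # Record which machine was used, so "lucky machine" effects become
    # visible in the data instead of living on as lab superstition.
    "equipment": {"water_bath_id": "bath-2"},
}

print(json.dumps(protocol, indent=2))
```

Once a protocol is data, it can be versioned, diffed, audited, and shared with the lab down the hall, rather than handed down from mentor to student.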
