Jenn Webb

Jenn Webb is a veteran of the newspaper industry turned freelance scribe, editor, and researcher. She is a nerd with a passion for technology and cultural disruption. She currently serves as O'Reilly Radar's managing editor and helps to investigate topics in the Design, IoT+, Data, and Emerging Tech spaces.

Design to reflect human values

The O'Reilly Radar Podcast: Martin Charlier on industrial and interaction design, reflecting societal values, and unified visions.

Source: image by Francisco Antunes on Flickr

Designing for the Internet of Things requires designers and engineers to expand the boundaries of their traditionally defined roles. In this Radar Podcast episode, O’Reilly’s Mary Treseler sat down with Martin Charlier, an independent design consultant and co-founder at raincloud.eu, to discuss the future of interfaces and the increasing need to merge industrial and interaction design in the era of the Internet of Things.

Charlier stressed the importance of embracing the symbiotic nature of interaction design and service design:

“How I got into the Internet of Things is interesting. My degree from Ravensbourne was in a very progressive design course that looked at product, interaction, and service design as one course. For us, it was pretty natural to think of products or services in a very open way. Whether they are connected or not connected didn’t really matter too much because it was basically understanding that technology is there to build almost anything. It’s really about how you design with that in mind.

“When I was working in industrial design, it became really clear for me how important that is. Specifically, I remember one project working on a built-in oven … In this project, we specifically couldn’t change how you would interact with it. The user interface was already defined, and our task was to define how it looked. It became clear to me that I don’t want to exclude any one area, and it feels really unnatural to design a product but only worry about what it looks like and let somebody else worry about how it’s operated, or vice versa. Products in today’s world, especially, need to be thought about from all of these angles. You can’t really design a coffee maker anymore without thinking about the service that it might plug into or the systems that it connects to. You have to think about all of these things at the same time.”

Read more…

Comment

An alternate perspective on data-driven decision making

The O'Reilly Radar Podcast: Tricia Wang on "thick data," purpose-driven problem solving, and building the ideal team.

In this week’s Radar Podcast episode, O’Reilly’s Roger Magoulas chatted with Tricia Wang, a global tech ethnographer and co-founder of PL Data, about how qualitative and quantitative data need to work together, reframing “data-driven decision making,” and building the ideal team.

Subscribe to the O’Reilly Radar Podcast

TuneIn, iTunes, SoundCloud, RSS

Purpose-driven problem solving

Wang stressed that quantitative and qualitative data need to work together. Rather than focusing on data-driven decision making, we need to focus on the best way to identify and solve the problem at hand; the data alone won’t provide the answers:

“It’s been kind of a detriment to our field that there’s this phrase ‘data-driven decision making.’ I think oftentimes people expect that the data’s going to give you answers. Data does not give you answers; it gives you inputs. You still have to figure out how to do the translation work and figure out what the data is trying to explain, right? I think data-driven decision making does not accurately describe what data can do. Really what we should be talking about is purpose-driven problem solving with data. Read more…

Comment: 1

Signals from Strata + Hadoop World in San Jose, CA, 2015

From data-driven government to our age of intelligence, here are key insights from Strata + Hadoop World in San Jose, CA, 2015.

Experts from across the big data world came together for Strata + Hadoop World in San Jose, CA, 2015. We’ve gathered insights from the event below.

U.S. chief data scientist

With a special recorded introduction from President Barack Obama, DJ Patil talks about his new role as the U.S. government’s first-ever chief data scientist and the nature of the country’s emerging data-driven government, and defines his mission in leading the data-driven initiative:

“Responsibly unleash the power of data for the benefit of the American public and maximize the nation’s return on its investment in data.”

Read more…

Comment: 1

Experience Design Links and Fodder: February 13, 2015

Mentor relationships; "learninghomes"; and building cross-disciplinary, collaborative teams.

Each week our design editors curate the most notable, interesting, and important material they come across. Below you’ll find their recent selections. You can get these and more in our weekly Design Newsletter.

Are you my mentor?

Digital product designer Lane Halley shares "7 Tips for Finding Your Perfect UX Mentor," describing how to establish a productive relationship once you've found one. Here's how to bring that special someone into your life.

Source: Cropped image by Atilla Kefeli on Flickr Read more…

Comment

More than a currency, bitcoin is an enabling technology

The O'Reilly Radar Podcast: Balaji Srinivasan on the bigger picture of bitcoin, liquid markets, and the future of regulation.

The promise of bitcoin and blockchain extends well beyond its potential disruption as a currency. In this Radar Podcast episode, Balaji Srinivasan, a general partner at Andreessen Horowitz, explains how bitcoin is an enabling technology and why it’s like the Internet, in that “bitcoin will do for value transfer what the Internet did for communication — make it programmable.” I met up with Srinivasan at our recent O’Reilly Radar Summit: Bitcoin & the Blockchain, where he was speaking — you can see his talk, and all the others from the event, in the complete video compilation now available.

Subscribe to the O’Reilly Radar Podcast

TuneIn, iTunes, SoundCloud, RSS

The bigger picture of bitcoin

More than just a digital currency, bitcoin can serve as an instigator for new markets. Srinivasan explained the potential for everything to become a liquid market:

“Bitcoin is a platform for programmable money, programmable interchange, or anything of value. That’s very general. People have probably heard at this point about how you can use a blockchain to trade — in theory — stocks, or houses, or other kinds of things, but programmable value transfer is even bigger than just trading things which we know already exist.

“One analogy I would give is in 1988, it was not possible to find information on anything instantly. Today, most of the time it is. From your iPhone or your Android phone, you can google pretty much anything. In the same way, I think what bitcoin is going to mean is markets in everything. That is, everything will have a price on it — everything will be a liquid market. You’ll be able to buy and sell almost anything. Where today the fixed costs of setting up such a market are too high for anything other than things that are fairly valuable, tomorrow it’ll be possible for even images or things you would not even think of normally buying and selling.”

Read more…

Comments: 5

Experience Design Links and Fodder: February 6, 2015

On-demand focus groups; the bond between UX, data, and design

Each week our design editors curate the most notable, interesting, and important material they come across. Below you’ll find their recent selections. You can get these and more in our weekly Design Newsletter.

Focus groups on demand

User experience start-up UserTesting has modernized the old-fashioned focus group: “it runs an online panel of more than one million testers…who can test products and other company materials on demand.” Current clients include the likes of Google, Facebook, Home Depot, Verizon Wireless, and Amazon — UserTesting just might be onto something.

Source: Cropped image by MCLK Travel on Flickr Read more…

Comment

Democratizing biotech research

The O'Reilly Radar Podcast: DJ Kleinbaum on lab automation, virtual lab services, and tackling the challenges of reproducibility.

The convergence of software and hardware and the growing ubiquity of the Internet of Things are affecting industries across the board, and biotech labs are no exception. For this Radar Podcast episode, I chatted with DJ Kleinbaum, co-founder of Emerald Therapeutics, about lab automation, the launch of Emerald Cloud Laboratory, and the problem of reproducibility.

Subscribe to the O’Reilly Radar Podcast

TuneIn, iTunes, SoundCloud, RSS

Kleinbaum and his co-founder Brian Frezza started Emerald Therapeutics to research cures for persistent viral infections. They didn’t set out to spin up a second company, but their efforts to automate their own lab processes proved so fruitful, they decided to launch a virtual lab-as-a-service business, Emerald Cloud Laboratory. Kleinbaum explained:

“When Brian and I started the company right out of graduate school, we had this platform anti-viral technology, which the company is still working on, but because we were two freshly minted nobody Ph.D.s, we were not going to be able to raise the traditional $20 or $30 million that platform plays raise in the biotech space.

“We knew that we had to be much more efficient with the money we were able to raise. Brian and I both have backgrounds in computer science. So, from the beginning, we were trying to automate every experiment that our scientists ran, such that every experiment was just push a button, walk away. It was all done with process automation and robotics. That way, our scientists would be able to be much more efficient than your average bench chemist or biologist at a biotech company.

“After building that system internally for three years, we looked at it and realized that every aspect of a life sciences laboratory had been encapsulated in both hardware and software, and that that was too valuable a tool to just keep internally at Emerald for our own research efforts. Around this time last year, we decided that we wanted to offer that as a service, that other scientists, companies, and researchers could use to run their experiments as well.” Read more…
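To make the "push a button, walk away" model concrete, here is a purely hypothetical sketch of how an experiment might be expressed as data and handed to an automation layer. None of these names or structures come from Emerald Cloud Laboratory's actual service; the sketch only illustrates the general idea of encapsulating a lab protocol in software.

```python
# Purely hypothetical sketch: an experiment is described as data (a protocol),
# and submit() stands in for the automation layer of robots and scheduling
# software that would actually execute it in the lab.
from dataclasses import dataclass, field

@dataclass
class Step:
    operation: str                 # e.g., "transfer", "incubate", "measure_absorbance"
    parameters: dict = field(default_factory=dict)

@dataclass
class Protocol:
    name: str
    steps: list[Step]

def submit(protocol: Protocol) -> None:
    """Stand-in for handing a protocol to lab automation; here it is only logged."""
    print(f"Queued '{protocol.name}' with {len(protocol.steps)} steps")
    for i, step in enumerate(protocol.steps, start=1):
        print(f"  {i}. {step.operation} {step.parameters}")

assay = Protocol(
    name="plate-reader assay",
    steps=[
        Step("transfer", {"volume_uL": 50, "source": "sample_plate", "dest": "assay_plate"}),
        Step("incubate", {"temperature_C": 37, "minutes": 30}),
        Step("measure_absorbance", {"wavelength_nm": 600}),
    ],
)
submit(assay)
```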

Comment

A human-centered approach to data-driven design

The O'Reilly Radar Podcast: Arianna McClain on humanizing data-driven design, and Dirk Knemeyer on design in emerging tech.

This week on the O’Reilly Radar Podcast, O’Reilly’s Roger Magoulas talks with Arianna McClain, a senior hybrid design researcher at IDEO, about storytelling through data; the interdependent nature of qualitative and quantitative data; and the human-centered, data-driven design approach at IDEO.

Subscribe to the O’Reilly Radar Podcast

iTunes, SoundCloud, RSS

In their interview, Magoulas noted that in our research at O’Reilly, we’ve been talking a lot about the importance of the social science design element in getting the most out of data. McClain emphasized the importance of storytelling through data at IDEO and described IDEO’s human-centered approach to data-driven design:

“IDEO really believes in staying and remaining human-centered throughout the data journey. Starting off with, how might we measure something, how might we measure a behavior. We don’t sit in a room and come up with an algorithm or come up with a question. We start by talking to people. … We’re trying to build measures and survey questions to understand at scale how people make decisions. … IDEO remains data-driven to how we analyze and synthesize our findings. When we’re given a large data set, we don’t analyze it and write a report and give it to people and say, ‘This is the direction we think you should go.’

“Instead, we look at segmentations in the data, and stories in the data, and how the data clusters. Then we go back, and we try to find people who are representative of that cluster or that segmentation. The segmentations, again, are not based on demographic variables. They are based on needs and insights that we heard in our qualitative research. … What we’ve recognized is that something that seems so clear in the analysis is often very nuanced, and it can inform our design.”
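As a rough illustration of the workflow McClain describes, the sketch below clusters survey respondents on hypothetical needs-based features and pulls the respondent nearest each cluster center as a candidate for follow-up interviews. It is a generic clustering example, not IDEO's actual method or tooling.

```python
# Generic sketch: cluster respondents on needs-based measures, then find the
# respondent closest to each cluster center to invite for follow-up qualitative
# research. Data and feature meanings are invented for the example.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows = respondents; columns = hypothetical needs-based survey measures
# (e.g., desire for convenience, privacy concern, price sensitivity).
responses = rng.random((200, 3))

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(responses)

for cluster_id, center in enumerate(kmeans.cluster_centers_):
    members = np.where(kmeans.labels_ == cluster_id)[0]
    # The member nearest the centroid is a candidate "representative" to interview.
    nearest = members[np.argmin(np.linalg.norm(responses[members] - center, axis=1))]
    print(f"cluster {cluster_id}: {len(members)} respondents, representative #{nearest}")
```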

Read more…

Comment

Bringing an end to synthetic biology’s semantic debate

The O'Reilly Radar Podcast: Tim Gardner on the synthetic biology landscape, lab automation, and the problem of reproducibility.

Editor’s note: this podcast is part of our investigation into synthetic biology and bioengineering. For more on these topics, download a free copy of the new edition of BioCoder, our quarterly publication covering the biological revolution. Free downloads for all past editions are also available.

Tim Gardner, founder of Riffyn, has recently been working with the Synthetic Biology Working Group of the European Commission Scientific Committees to define synthetic biology, assess the risk assessment methodologies, and then describe research areas. I caught up with Gardner for this Radar Podcast episode to talk about the synthetic biology landscape and issues in research and experimentation that he’s addressing at Riffyn.

Defining synthetic biology

Among the areas of investigation discussed at the EU’s Synthetic Biology Working Group was defining synthetic biology. The official definition reads: “SynBio is the application of science, technology and engineering to facilitate and accelerate the design, manufacture and/or modification of genetic materials in living organisms.” Gardner talked about the significance of the definition:

“The operative part there is the ‘design, manufacture, modification of genetic materials in living organisms.’ Biotechnologies that don’t involve genetic manipulation would not be considered synthetic biology, and more or less anything else that is manipulating genetic materials in living organisms is included. That’s important because it gets rid of this semantic debate of, ‘this is synthetic biology, that’s synthetic biology, this isn’t, that’s not,’ that often crops up when you have, say, a protein engineer talking to someone else who is working on gene circuits, and someone will claim the protein engineer is not a synthetic biologist because they’re not working with parts libraries or modularity or whatnot, and the boundaries between the two are almost indistinguishable from a practical standpoint. We’ve wrapped it all together and said, ‘It’s basically advances in the capabilities of genetic engineering. That’s what synthetic biology is.’”

Read more…

Comment

Security comes from evolution, not revolution

The O'Reilly Radar Podcast: Mike Belshe on making bitcoin secure and easy enough for the mainstream.

Source: image by Steven Tom on Flickr

Editor’s note: you can subscribe to the O’Reilly Radar Podcast through iTunes, SoundCloud, or directly through our podcast’s RSS feed.

In this week’s O’Reilly Radar Podcast episode, I caught up with Mike Belshe, CTO and co-founder of BitGo, a company that has developed a multi-signature wallet that works with bitcoin. Belshe talks about the security issues addressed by multi-signature wallets, how the technology works, and the challenges in bringing cryptocurrencies mainstream. We also talk about his journey into the bitcoin world, and he chimes in on what money will look like in the future. Belshe will address the topics of security and multi-signature technology at our upcoming Bitcoin & the Blockchain Radar Summit on January 27, 2015, in San Francisco — for more on the program and registration information, visit our Bitcoin & the Blockchain website.

Multi-signature technology is exactly what it sounds like: instead of authorizing bitcoin transactions with a single signature and a single key (the traditional method), it requires multiple signatures and/or multiple machines — and any combination thereof. The concept was initially developed as a defense against malware. Belshe explains:

“I’m fully convinced that the folks who have been writing various types of malware that steal fairly trivial identity information — logins and passwords that they sell super cheap — they are retooling their viruses, their scanners, their key loggers for bitcoin. We’ve seen evidence of that over the last 12 months, for sure. Without multi-signature, if you do a bitcoin transaction on a machine that’s got any of this bad stuff on it, you’re pretty much toast. Multi-signature was my hope to fix that. What we do is make one signature happen on the server machine, one signature happen on the client machine, your home machine. That way the attacker has to actually compromise two totally different systems in order to steal your bitcoin. That’s what multi-signature is about.”
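To illustrate the approval logic Belshe describes, here is a minimal sketch of an M-of-N policy check. The signatures are simulated with HMACs and the key names are invented for the example; real multi-signature wallets use ECDSA keys and Bitcoin script, but the rule has the same shape: a transaction is valid only when enough distinct keys have signed it.

```python
# Minimal, illustrative 2-of-3 "multi-signature" policy check.
# Signatures are simulated with HMAC-SHA256 over the transaction payload.
import hashlib
import hmac

def sign(key: bytes, tx: bytes) -> bytes:
    """Stand-in for a real signature: HMAC-SHA256 of the transaction."""
    return hmac.new(key, tx, hashlib.sha256).digest()

def verify(key: bytes, tx: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(key, tx), signature)

def approve(tx: bytes, keys: list[bytes], signatures: list[bytes], required: int = 2) -> bool:
    """Approve only if at least `required` distinct keys produced valid signatures."""
    signed = {i for i, key in enumerate(keys)
              for sig in signatures if verify(key, tx, sig)}
    return len(signed) >= required

# Hypothetical key placement: one on the user's machine, one on the server,
# one kept offline as a backup.
user_key, server_key, backup_key = b"user-key", b"server-key", b"backup-key"
tx = b"pay 0.5 BTC to <address>"

# An attacker who compromises only the user's machine gets one signature: not enough.
print(approve(tx, [user_key, server_key, backup_key], [sign(user_key, tx)]))          # False
print(approve(tx, [user_key, server_key, backup_key],
              [sign(user_key, tx), sign(server_key, tx)]))                             # True
```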

Read more…

Comment