- Matt Jones: Practical Design Fiction (Vimeo) — the log scale of experience! Fantastic hour-long recap of the BERG thinking that he’s continued at the Google Creative Lab in NYC. (via Matt Jones)
- 3dPL — public license for 3d objects. (via BoingBoing)
- Google Contributor — when the web’s biggest advertiser tries alternative ways to fund web content, I’m interested.
- Templar — an HTTP proxy that provides advanced features for taming HTTP APIs: timeouts, caching, metrics, request collapsing, …
Modern design products should be dynamic, adaptable systems built in code — and as they become dynamic, it makes less and less sense to separate design from implementation. Read more…
Like the Internet in 1994, virtual reality is about to cross the chasm from core technologists to the wider world.
When you’re an entrepreneur or investor struggling to bring a technology to market just a little before its time, being too early can feel exactly the same as being flat wrong. But with a bit more perspective, it’s clear that many of the hottest companies and products in today’s tech landscape are actually capitalizing on ideas that have been tried before — have, in some cases, been tackled repeatedly, and by very smart teams — but whose day has only now just arrived.
Virtual reality (VR) is one of those areas that has seduced many smart technologists in its long history, and its repeated commercial flameouts have left a lot of scar tissue in their wake. Despite its considerable ups and downs, though, the dream of VR has never died — far from it. The ultimate promise of the technology has been apparent for decades now, and many visionaries have devoted their careers to making it happen. But for almost 50 years, these dreams have outpaced the realities of price and performance.
To be fair, VR has come a long way in that time, though largely in specialized, under-the-radar domains that can support very high system costs and large installations; think military training and resource exploration. But the basic requirements for mass-market devices have never been met: low-power computing muscle; large, fast displays; and tiny, accurate sensors. Thanks to the smartphone supply chain, though, all of these components have evolved very rapidly in recent years — to the point where low-cost, high-quality, compact VR systems are now becoming available. Consumer VR really is coming on fast now, and things are getting very interesting. Read more…
Scott Stropkay and Bill Hartman on human-robot interaction, choice architecture, and developing degrees of trust.
Jonathan Follett, editor of Designing for Emerging Technologies, recently sat down with Scott Stropkay, founding partner at Essential Design, and Bill Hartman, the firm’s director of research, both of whom are also contributing authors for Designing for Emerging Technologies. Their conversation centers on the relationship dynamic between humans and robots, and on the ways designers are being stretched in an interesting new direction.
Accepting human-robot relationships
Stropkay and Hartman discussed their work with telepresence robots. They described the inherent challenges of introducing robots in a health care setting, but stressed that there’s tremendous opportunity to improve the health care experience:
“We think the challenges inherent in these kinds of scenarios are fascinating, how you get people to accept a robot in a relationship that you normally have with a person. Let’s say, a hospital setting — how do you develop acceptance from the team that’s not used to working with a robot as part of their functional team, how do you develop trust in those relationships, how do you engage people both practically and emotionally. How, as this scenario progresses, you bring robots into your home to monitor your recovery is one of the issues we’ve begun to address in our work.
“We’re pursuing other ideas in relation to using smart monitors, in the form of robots and robotic-enhanced devices that can help you advance your improvement in behavior change over time … Ultimately, we’re thinking about some of the interesting science that’s happening with robots that you ingest that can learn about you and monitor you. There’s a world of fascinating issues about what you want to know, and how you might want to learn that, who gets access to this information, and how that interface could be designed.”
The O'Reilly Radar Podcast: John Carnahan on holistic data analysis, engagement channels, and data science as an art form.
In this Radar Podcast episode, I sit down with John Carnahan, executive vice president of data science at Ticketmaster. At our recent Strata + Hadoop World Conference in San Jose, CA, Carnahan presented a session on using data science and machine learning to improve ticket sales and marketing at Ticketmaster.
I took the opportunity to chat with Carnahan about Ticketmaster’s evolving approach to data analysis, the avenues of user engagement they’re investigating, and how his genetics background is informing his work in the big data space.
When Carnahan took the job at Ticketmaster about three years ago, his strategy focused on small, concrete tasks aimed at solving distinct nagging problems: how do you address large numbers of tickets not sold at an event, how do you engage and market those undersold events to fans, and how do you stem abuse of ticket sales. This strategy has evolved, Carnahan explained, to a more holistic approach aimed at bridging the data silos within the company:
“We still want those concrete things, but we want to build a bed of data science assets that’s built on top of a company that’s been around almost 40 years and has a lot of data assets. How do we build the platform that will leverage those things into the future, beyond just those small niche products that we really want to build. We’re trying to bridge the gap between a lot of those products, too. Rather than think of each of those things as a vertical or a silo that’s trying to accomplish something, it’s how do you use something that you’ve built over here, over there to make that better?”
A design process paved with empathic observations will lead you, slowly and iteratively, to a better product.
Editor’s note: this post was originally published on the author’s blog, Exploring the world beyond mobile; this lightly edited version is republished here with permission.
If I’m ever asked what’s most important in UX design, I always reply “empathy.” It’s the core meta attribute, the driver that motivates everything else. Empathy encourages you to understand who uses your product, forces you to ask deeper questions, and motivates the many redesigns you go through to get a product right.
But empathy is a vague concept, one that others don’t always appreciate. There have been times, when talking to product managers, that my empathy-driven fix-it list has gotten a response like, “We appreciate that Scott, but we have so much to get done on the product, we don’t have time to tweak things like that right now.” Never do you feel so put in your place as when someone says your job is “tweaking.”
The paradox of empathy is that while it drives us at a very deep level, and ultimately leads us to big, important insights, it usually starts small. The empathic process typically notices simple things like ineffective error messages, observed user workarounds, or overly complicated dialog boxes. Empathy starts with very modest steps. However, these small observations are the wedge that splits the log; it’s these initial insights, if you follow them far enough, that open up your mind and lead you to great products.
Finding the holes in qualitative and quantitative testing.
I can’t tell you how often I hear things from engineers like, “Oh, we don’t have to do user testing. We’ve got metrics.” Of course, you can almost forgive them when the designers are busy saying things like, “Why would we A/B test this new design? We know it’s better!”
In the debate over whether to use qualitative or quantitative research methods, there is plenty of wrong to go around. So, let’s look at some of the myths surrounding qualitative and quantitative research, and the most common mistakes people make when trying to use them.
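When engineers say “we’ve got metrics,” they usually mean something like an A/B test. As a hedged illustration of what the quantitative side of that debate actually involves (my own sketch, not from the article), a two-proportion z-test is one common way to check whether a variant’s conversion rate genuinely differs from the control’s, rather than just looking bigger:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a, conv_b: conversion counts for control and variant
    n_a, n_b: number of users shown each version
    Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 20% vs. 26% conversion over 1,000 users each: a real difference, or noise?
z, p = two_proportion_z_test(200, 1000, 260, 1000)
```

The point of the exercise — and of the myths the article goes on to examine — is that the metrics only answer the question you actually asked; a significant p-value here says nothing about *why* users converted, which is where the qualitative work comes in.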