FEATURED STORY

Four short links: 30 January 2015

FAA Rules, Sports UAVs, Woodcut Data, and Concurrent Programming

  1. FAA to Regulate UAVs? (Forbes) — and the Executive Order will segment the privacy issues related to drones into two categories — public and private. For public drones (that is, drones purchased with federal dollars), the President’s order will establish a series of privacy and transparency guidelines. See also How ESPN is Shooting the X Games with Drones (Popular Mechanics) — it’s all fun and games until someone puts out an eye with a quadcopter. The tough part will be keeping within the tight restrictions the FAA gave them. Because drones can’t be flown above a crowd, Calcinari says, “We basically had to build a 500-foot radius around them, where the public can’t go.” The drones will fly over sections of the course that are away from the crowds, where only ESPN production employees will be. That rule is part of why we haven’t seen drones at college football games.
  2. Milestones for SaaS Companies — “Getting from $0-1m is impossible. Getting from $1-10m is unlikely. And getting from $10-100m is inevitable.” — Jason Lemkin, ex-CEO of Echosign. The article proposes some significant milestones, and they ring true. Making money is generally hard. The nature of the hard changes with the amount of money you have and the amount you’re trying to make, but if it were easy, we’d structure our society on something else.
  3. Woodcut Data Visualisation — Recently, I learned how to operate a laser cutter. It’s been a whole lot of fun, and I wanted to share my experiences creating woodcut data visualizations using just D3. I love it when data visualisations break out of the glass rectangle.
  4. Why is Concurrent Programming Hard? — On the one hand, there is not a single concurrency abstraction that fits all problems; on the other hand, the various abstractions are rarely designed to be used in combination with each other (see the sketch after this list). We are due for a revolution in programming, something to help us make sense of modern systems made of more moving parts than our feeble grey matter can model and intuit about.
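As a minimal, hedged illustration of that composition problem: mixing just two mainstream abstractions in one language, blocking threads and asyncio coroutines in Python, already requires explicit bridging glue in both directions (the names and timings below are made up).

```python
import asyncio
import time

def blocking_fetch(n: int) -> int:
    """Stand-in for legacy, thread-friendly blocking I/O."""
    time.sleep(0.1)  # pretend this is a network call
    return n * 2

async def main() -> None:
    loop = asyncio.get_running_loop()
    # Bridge 1: push blocking calls onto a thread pool so they
    # don't stall the event loop.
    results = await asyncio.gather(
        *(loop.run_in_executor(None, blocking_fetch, n) for n in range(5))
    )
    print(results)  # [0, 2, 4, 6, 8]

if __name__ == "__main__":
    asyncio.run(main())  # Bridge 2: enter the async world from sync code
```

Neither abstraction knows about the other; every crossing point is manual glue, and that glue is exactly where the deadlocks and subtle bugs tend to live.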

A human-centered approach to data-driven design

The O'Reilly Radar Podcast: Arianna McClain on humanizing data-driven design, and Dirk Knemeyer on design in emerging tech.

This week on the O’Reilly Radar Podcast, O’Reilly’s Roger Magoulas talks with Arianna McClain, a senior hybrid design researcher at IDEO, about storytelling through data; the interdependent nature of qualitative and quantitative data; and the human-centered, data-driven design approach at IDEO.

Subscribe to the O’Reilly Radar Podcast: iTunes, SoundCloud, RSS

In their interview, Magoulas noted that in our research at O’Reilly, we’ve been talking a lot about the importance of the social science design element in getting the most out of data. McClain emphasized the importance of storytelling through data at IDEO and described IDEO’s human-centered approach to data-driven design:

“IDEO really believes in staying and remaining human-centered throughout the data journey. Starting off with, how might we measure something, how might we measure a behavior. We don’t sit in a room and come up with an algorithm or come up with a question. We start by talking to people. … We’re trying to build measures and survey questions to understand at scale how people make decisions. … IDEO remains data-driven to how we analyze and synthesize our findings. When we’re given a large data set, we don’t analyze it and write a report and give it to people and say, ‘This is the direction we think you should go.’

“Instead, we look at segmentations in the data, and stories in the data, and how the data clusters. Then we go back, and we try to find people who are representative of that cluster or that segmentation. The segmentations, again, are not based on demographic variables. They are based on needs and insights that we heard in our qualitative research. … What we’ve recognized is that something that seems so clear in the analysis is often very nuanced, and it can inform our design.”
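McClain’s description maps loosely onto a familiar analysis pattern: cluster respondents on need-based features, then recruit the person nearest each cluster’s center for follow-up interviews. Here is a rough sketch of that pattern (scikit-learn, with made-up need scores; an illustration, not IDEO’s actual tooling):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-respondent scores on need-based dimensions,
# e.g. values-speed, values-privacy, price-sensitivity.
responses = np.array([
    [5, 1, 2], [4, 2, 1], [1, 5, 4],
    [2, 5, 5], [3, 3, 3], [5, 2, 1],
])

X = StandardScaler().fit_transform(responses)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

for c, center in enumerate(km.cluster_centers_):
    members = np.where(km.labels_ == c)[0]
    # The "representative" to go back and interview: the member
    # closest to the cluster center.
    rep = members[np.argmin(np.linalg.norm(X[members] - center, axis=1))]
    print(f"cluster {c}: respondents {members.tolist()}, representative {rep}")
```

The qualitative step then happens off-screen: the clusters suggest whom to talk to, and the interviews surface the nuance the numbers flatten out.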



The evolution of GraphLab

The O'Reilly Data Show Podcast: Carlos Guestrin on the early days of GraphLab and the evolution of GraphLab Create.

Editor’s note: Carlos Guestrin will be part of the team teaching Large-scale Machine Learning Day at Strata + Hadoop World in San Jose. Visit the Strata + Hadoop World website for more information on the program.

I only really started playing around with GraphLab when the companion project GraphChi came onto the scene. By then, I’d heard from many avid users and admired how their user conference instantly became a popular San Francisco Bay Area data science event. For this podcast episode, I sat down with Carlos Guestrin, co-founder and CEO of Dato, a start-up launched by the creators of GraphLab. We talked about the early days of GraphLab, the evolution of GraphLab Create, and what he’s learned from starting a company.

MATLAB for graphs

Guestrin remains a professor of computer science at the University of Washington, but GraphLab originated while he was still a faculty member at Carnegie Mellon. It was built by avid MATLAB users who needed to run large-scale graph computations to demonstrate their research results. Guestrin shared some of the backstory:

“I was a professor at Carnegie Mellon for about eight years before I moved to Seattle. A couple of my students, Joey Gonzales and Yucheng Low, were working on large-scale distributed machine learning algorithms, especially with things called graphical models. We tried to implement them to show off the theorems that we had proven. We tried to run those things on top of Hadoop, and it was really slow. We ended up writing those algorithms on top of MPI, which is a high-performance computing library, and it was just a pain. It took a long time, it was hard to reproduce the results, and the impact it had on us is that writing papers became a pain. We wanted a system for my lab that allowed us to write more papers more quickly. That was the goal. In other words, so they could implement these machine learning algorithms more easily, more quickly, specifically on graph data, which is what we focused on.”
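To make the motivation concrete, here is a toy, single-machine sketch of the vertex-centric style that GraphLab made fast and distributed: PageRank expressed as repeated gather/apply steps over a graph. This is plain Python for illustration, not the actual GraphLab or GraphLab Create API.

```python
def pagerank(edges, damping=0.85, iters=20):
    """edges: dict mapping each vertex to a list of its out-neighbors."""
    verts = set(edges) | {v for outs in edges.values() for v in outs}
    rank = {v: 1.0 / len(verts) for v in verts}
    for _ in range(iters):
        # "Gather": accumulate incoming rank contributions per vertex.
        incoming = {v: 0.0 for v in verts}
        for u, outs in edges.items():
            for v in outs:
                incoming[v] += rank[u] / len(outs)
        # "Apply": recompute every vertex's rank from what it gathered.
        rank = {v: (1 - damping) / len(verts) + damping * incoming[v]
                for v in verts}
    return rank

print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))
```

The appeal Guestrin describes is that the researcher writes only the per-vertex logic, while the framework handles the distribution and performance work that previously demanded hand-written MPI.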



It’s not just about Hadoop core anymore

For maximum business value, big data applications have to involve multiple Hadoop ecosystem components.

Data is flooding into today’s enterprise organizations from ever-expanding sources and in ever-expanding formats. To gain insight from this valuable resource, organizations have been adopting Apache Hadoop with increasing momentum. Now, the most successful enterprise players in big data are no longer only utilizing Hadoop “core” (i.e., batch processing with MapReduce); they are moving toward analyzing and solving real-world problems, often interactively, using the broader set of tools in an enterprise data hub, including components such as Impala, Apache Spark, Apache Kafka, and Search. With this new focus on workload diversity comes an increased demand for developers who are well-versed in a variety of components across the Hadoop ecosystem.

Given the size and variety of the data we’re dealing with today, a single use case or tool, no matter how robust, can obscure the full, game-changing potential of Hadoop in the enterprise. Rather, developing end-to-end applications that incorporate multiple tools from the Hadoop ecosystem, not just the Hadoop core, is the first step toward activating the disparate use cases and analytic capabilities an enterprise data hub makes possible. Whereas MapReduce code primarily leverages Java skills, developers who want to work on full-scale big data engineering projects need to be able to work with multiple tools, often simultaneously. An authentic big data applications developer can ingest and transform data using the Kite SDK, write SQL queries with Impala and Hive, and create an application GUI with Hue.
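As a small, hedged illustration of stepping beyond MapReduce-style Java jobs, the classic word count becomes a few lines with Spark’s Python API; the input path and local master setting below are hypothetical.

```python
from pyspark import SparkContext

sc = SparkContext("local[*]", "beyond-hadoop-core")

counts = (
    sc.textFile("hdfs:///data/logs/*.txt")  # hypothetical input path
      .flatMap(lambda line: line.split())   # lines -> words
      .map(lambda word: (word, 1))          # word -> (word, 1)
      .reduceByKey(lambda a, b: a + b)      # sum the counts per word
)

print(counts.take(10))  # nothing runs until this action triggers the job
sc.stop()
```

The same mix-and-match spirit applies across the stack: the ingest might come from Kafka, the ad hoc SQL from Impala or Hive, and the front end from Hue.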


Now available: Big Data Now, 2014 edition

Our wrap-up of important developments in the big data field.

In the four years we’ve been producing Big Data Now, our wrap-up of important developments in the big data field, we’ve seen tools and applications mature, multiply, and coalesce into new categories. This year’s free wrap-up of Radar coverage is organized around seven themes:

  • Cognitive augmentation: As data processing and data analytics become more accessible, jobs that can be automated will go away. But to be clear, there are still many tasks where the combination of humans and machines produces superior results.
  • Intelligence matters: Artificial intelligence is now playing a bigger and bigger role in everyone’s lives, from sorting our email to rerouting our morning commutes, from detecting fraud in financial markets to predicting dangerous chemical spills. The computing power and algorithmic building blocks to put AI to work have never been more accessible.


Designing on a system level

Andy Goodman on service design, embeddables, and predictive analytics.

Image: “connection” by jazbeck, via Flickr.

I recently sat down with Andy Goodman, designer and group director of Fjord’s US studios. Goodman has been designing and managing design teams around the globe for the past 20 years, and he is a contributor to Designing for Emerging Technologies. Our conversation covers embeddables, wearables, and predictive analytics. To kick off the conversation, I asked Goodman to define “service design”:

“It’s well-known that if you ask a service designer to define “service design,” you get 10 different answers. For me, it’s really about thinking on a system level about design … It’s thinking about how systems, and not just computer systems, but how human systems and computer systems and physical systems all interact with each other. You need to be thinking not about individual moments; you need to be thinking about journeys and flows, and thinking about how a human being will naturally, without even thinking about it, move from one context to another using different devices, using physical objects, being in physical spaces. For me, it was very appealing, this idea that you can design more than just interactions in a way, more than just interactions on a screen. You can actually design other things that are more about the way we live and work and play.”

