- Depthy — new Google Camera app lets you capture some depth information, stored in metadata in the image. Nifty effects become possible.
- Coping with Stress and Burnout: Explanatory Power of Different Coping Strategies (PLoS ONE) — interesting taxonomy of burnout (overload, lack of development, and neglect) found by clustering survey responses, which also surfaced the key signs of each type. (via Psychological Science)
- Why is StackOverflow So Negative of Late? — my current theory is that social activities (sites, events, etc.) are journeys for cohorts. Newcomers don’t get as much from them, and the original cohort doesn’t enjoy newcomers. Social sites tend to rock at first, until They arrive and ruin it for everyone. cf. Burning Man. Newcomers will have to start their own site/event, but if it never reaches a critical mass of the A-grade people who joined the first wave, that event may fail.
- Surpassing Human-Level Face Verification Performance on LFW with GaussianFace (arXiv) — For the first time, the human-level performance in face verification (97.53%) on LFW [the standard "hard" face recognition data set] is surpassed. (via Medium)
Solid's long view includes biology as part of the creator's toolkit.
Tim O’Reilly subjected himself to an engaging Ask Me Anything session on Reddit earlier this week. The focus of the exchange was the Internet of Things, in anticipation of our Solid conference taking place next month.
We’re always listening for faint signals from our community about what they’re getting interested in, and one area that’s stood out to us is biology, which is becoming easier to experiment with at home, as a hobbyist, and through hackerspaces like BioCurious and Genspace. You’ll find a few threads on biology at Solid this year, but we’ve tagged it to be a little more central at Solid 2015. Beyond the hobbyist and health-related applications, we see synthetic biology as another way to translate between virtual and physical, like 3D printers and stereoscopic cameras.
Here’s an exchange from Tuesday’s Reddit thread that sums it up nicely.
What prompted the start of BioCoder? Are people really doing biotech in their garages in the same way that many computer hardware and software innovations happened?
A melting pot of technologists, makers and product minds will lead to a new wave of robotics companies.
Editor’s note: this post originally published on Chen’s blog Beyond the Bell Curve; this edited version is republished here with permission.
A couple of years ago, I dug deep into the robotics space because I thought we were seeing the birth of exciting next-generation robotics companies that would reshape the way our society lives and thinks. Companies like Rethink Robotics, Industrial Perception, and Redwood Robotics emerged to tackle factory and warehouse logistics. Willow Garage was gaining notoriety for being a center of robotics talent and innovation that spawned many of these companies. Meanwhile, Amazon had just acquired Kiva for $775M, driving even more entrepreneurial excitement.
Where are these players now? Rethink had a well-publicized round of layoffs, and Willow Garage no longer exists. Industrial Perception and Redwood Robotics were part of Google’s robotics shopping spree, and while acquisitions can inspire activity like Kiva’s did, Google’s purchases may have had the opposite effect. In one fell swoop, many of the most entrepreneurial and talented roboticists were shut away from the world. I often worry that this has caused the entire field to take a step back, or at least is a major inhibitor of progress. No longer will the acquired talent build and support new technology for others to build upon, at least for now. What Google decides to do with the talent it purchased will have big ramifications for how the industry and field move forward. There’s potential for a positive outcome here. Perhaps these groups eventually will leave Google with an understanding of best practices in building and operating a business, something Google is quite good at.
Hacking lab equipment to make it programmable is a good first step toward lab automation.
In the new issue of BioCoder, Peter Sand writes about Hacking Lab Equipment. It’s well worth a read: it gives a number of hints about how standard equipment can be modified so that it can be controlled by a program. This is an important trend I’ve been watching on a number of levels, from fully robotic labs to much more modest proposals, like Sand’s, that extend programmability even to hacker spaces and home labs.
In talking to biologists, I’m surprised at how little automation there is in research labs. Automation in industrial labs, the sort that processes thousands of blood and urine samples per hour, yes: that exists. But in research labs, undergrads, grad students, and post-docs spend countless hours moving microscopic amounts of liquid from one place to another. Why? It’s not science; it’s just moving stuff around. What a waste of mental energy and creativity.
Lab automation, though, isn’t just about replacing countless hours of tedium with opportunities for creative thought. I once talked to a system administrator who wrote a script for everything, even for a simple one-liner. (Might have been @yesthattom, I don’t remember.) This practice is based on an important insight: writing a script documents exactly what you did. You don’t have to wonder, “did I add the -f option on that rm -r / command?”; you can just look. If you need to do the same thing on another system, you can reproduce what you did exactly.
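The same idea carries over to the bench. As a sketch only — the `transfer` helper and its parameters are invented for illustration, standing in for whatever hardware-control API a hacked instrument exposes — a protocol written as a script both runs the steps and serves as an exact record of them:

```python
# Hypothetical lab-automation sketch: each pipetting step is executed
# and logged, so the script itself documents exactly what was done.

def transfer(volume_ul, source, dest, log):
    """Move a volume of liquid and record the action."""
    # Real hardware control (e.g., driving a modified pipettor) would go here.
    log.append(f"transfer {volume_ul} uL: {source} -> {dest}")

def run_protocol():
    """A toy serial dilution: carry 100 uL down a row of wells."""
    log = []
    source = "stock"
    for well in ["A1", "A2", "A3"]:
        transfer(100, source, well, log)
        source = well
    return log

for step in run_protocol():
    print(step)
```

Re-running the script reproduces the protocol exactly on another day or another bench, which is the whole point of the “script everything” habit.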
The technology is at risk of dying off — and that would be a shame.
iBeacons and various BLE technologies have the potential to shake up many established ways of doing business by streamlining interactions. Although there are potentially many uses for iBeacons, much of the initial discussion has focused on retail. (I’ll follow up with some examples of iBeacon applications outside retail in a future post.)
As I described in my initial post in this series, all an iBeacon does is send out advertisement packets. iBeacon transmissions let a receiver perform two tasks: uniquely identify what things it is near and estimate the distance to them. With such a simple protocol, iBeacons cannot:
- Receive anything. (Many iBeacon devices will have two-way Bluetooth interfaces so they can receive configurations, but the iBeacon specification does not require reception.)
- Report on clients they have seen. Wi-Fi-based proximity systems use transmissions from mobile devices to uniquely identify visitors to a space. If you take a smartphone into an area covered by a Wi-Fi proximity system, you can be uniquely identified. Because an iBeacon is only a transmitter, it does not receive Bluetooth messages from mobile devices to uniquely identify visitors.
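On the distance-estimation side: the beacon’s advertisement includes a calibrated “measured power” value (its expected RSSI at 1 m), and a common way to turn that plus the observed RSSI into a rough range is the log-distance path-loss model. A minimal sketch — the function name and the default path-loss exponent are my own choices, and real-world estimates are noisy at best:

```python
def estimate_distance(rssi_dbm, measured_power_dbm, path_loss_exponent=2.0):
    """Rough range estimate (meters) from an iBeacon advertisement.

    measured_power_dbm: the calibrated RSSI at 1 m that the beacon broadcasts.
    path_loss_exponent: ~2.0 in free space; typically higher indoors.
    Uses the log-distance path-loss model, which is an approximation only.
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# At the calibration point, the estimate is 1 m by construction:
print(estimate_distance(-59, -59))   # -> 1.0
# A signal 20 dB weaker implies roughly 10x the distance (free space):
print(estimate_distance(-79, -59))   # -> 10.0
```

In practice, receivers smooth RSSI over many packets and report coarse buckets (immediate/near/far) rather than trusting any single reading.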
Mobile UX, Ideation Tools, Causal Consistency, and Intellectual Ventures Patent Fail
- Samsung UX (Scribd) — little shop of self-catalogued UX horrors, courtesy of discovery in a lawsuit. Dated (Android G1 as competition) but rewarding to see there are signs of self-awareness in the companies that inflict unusability on the world.
- Tools for Ideation and Problem Solving (Dan Lockton) — comprehensive and analytical take on different systems for ideas and solutions.
- Don’t Settle for Eventual Consistency (ACM) — proposes “causal consistency”, prototyped in COPS and Eiger from Princeton.
- Intellectual Ventures Loses Patent Case (Ars Technica) — The Capital One case ended last Wednesday, when a Virginia federal judge threw out the two IV patents that remained in the case. It’s the first IV patent case seen through to a judgment, and it ended in a total loss for the patent-holding giant: both patents were invalidated, one on multiple grounds.