Technology has changed, but humans haven't — what is it about mediating an experience through a frame that makes it seem better?
ImpactLab has posted a nice pair of photos contrasting 2005 and 2013 in St. Peter’s Square. 2005 looks pretty much as you’d expect: lots of people in a crowd. In 2013, though, everyone is holding up a tablet, either photographing or perhaps even watching the event through the tablet.
The ImpactLab post asks about the changes in our technology during these eight years. That’s interesting, but not what grabs me. What gets me is that this isn’t new. In the 18th century, one fad was to view nature through a portable picture frame. I wasn’t able to find this in a quick Google search, but screw the documentation. I’ve seen these things in a museum: they look like a miniature gilded picture frame, roughly the size of an iPad, with a stick coming from the corner so you can hold it before your eyes. So you’d sit in your carriage with the curtains open, look out the window through this frame, and see a moving picture. A slightly higher-tech variant of this is the Claude Glass (see, I can haz links), in which you viewed the natural scene through a slightly tinted mirror, to make it look even more like a painting. (This is arguably the origin of the term “picturesque.”) Read more…
Raise consciousness about the silliness of the Black Friday ritual — do anything but shop.
This time last year, Cathy O’Neil and I traded emails about the US’s annual orgy of consumerism. I promised her an article for her Mathbabe blog, which I still owe her, and we wondered how to raise consciousness about the silliness of the Black Friday ritual.
I remembered something a friend and I did back in grad school: we waited in line for movies. And that was it — we went to downtown Palo Alto, picked the movie theater with the longest line (this was when the homogeneous corporate 87-screen chainplexes were just getting started), and waited in the line. We had no intention of watching the movie, so if the person behind us was at all anxious, we’d let them cut in front. And we’d explain, “We’re just waiting in line; we’re not seeing the movie anyway, so go ahead — it’s cool.” People thought this was strange, or funny, or whatever. When we got to the ticket counter, we excused ourselves and went back to the end of the line, or maybe to another theater; I don’t remember.
I’m no longer going to get up at midnight to wait in line for the local Walmart to open its doors, but I’d like to know that someone did this, and in the process raised awareness of our addiction to consumerism, and of workers who aren’t paid adequately. Sing Christmas carols. Sing Chanukah songs. Sing old Beatles tunes. Sing atheist carols, if anyone has written any. Do anything but shop.
And I’d like to hear any other ideas about pranking this most ridiculous of national rituals.
USB could make power consumption more intelligent, but security concerns need to be addressed.
I’ve been reading about enhancements to the USB 3.0 standard that would allow a USB cable to provide up to 100 watts of power, nicely summarized in The Economist. 100 watts is more than enough to charge a laptop, and certainly enough to power other devices, such as LED lighting, televisions, and audio equipment. It could represent a significant shift in the way we distribute power in homes and offices: as low-voltage DC, rather than 110- or 220-volt AC. Granted, 100 watts won’t power a stove, a refrigerator, or a toaster, but in a USB world, high-voltage power distribution could be limited to a few rooms, just like plumbing; the rest of the building could be wired with relatively inexpensive USB cables and connectors, and the wiring could easily be done by amateurs rather than professional electricians.
It’s an interesting and exciting idea. As The Economist points out, the voltages required for USB are easily compatible with solar power. Because USB cables also carry data, power consumption can become more intelligent.
But I have one concern that I haven’t seen addressed in the press. Of course USB cables carry both data and power. So, when you plug your device into a USB distribution system, whether it’s a laptop or phone, you’re plugging it into a network. And there are many cases, most notoriously Stuxnet, of computers being infected with malware through their USB ports. Read more…
Feedback is an elegant and effective way to control complex, dynamic processes.
Everyone knows what feedback is. It’s when sound systems suddenly make loud, painful screeching sounds. And that answer is correct, at least partly.
Control theory, the study and application of feedback, is a discipline with a long history. If you’ve studied electrical or mechanical engineering, you’ve probably confronted it. Although there’s an impressive and daunting body of mathematics behind control theory, the basic idea is simple. Whenever you have a varying signal, you can use feedback to control the signal, giving you a consistent output. Screaming amps at a concert are nothing but a special case in which things have gone wrong.
We use control theory all the time, without even thinking about it. We couldn’t walk if it weren’t for our body’s instinctive use of feedback; upsetting that feedback system (for example, by spinning to become dizzy) makes you fall. When you’re driving a car, you ease off the accelerator when it’s going too fast. You press the accelerator when it’s going too slow. If you undercorrect, you’ll end up going too fast (or stopping); if you overcorrect, you’ll end up jerking forward, slamming on the brakes, then jerking forward again — possibly with disastrous consequences. Cruise control is nothing more than a robotic implementation of the same feedback loop. Read more…
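The cruise-control loop above can be sketched in a few lines. This is an illustrative toy, not a real controller: the “car” is just one speed update per tick, and the target, starting speed, and gain values are all made up for the example.

```python
def simulate(gain, target=65.0, speed=30.0, steps=25):
    """Drive toward a target speed, correcting in proportion to the error."""
    trace = [speed]
    for _ in range(steps):
        error = target - speed   # how far off we are right now
        speed += gain * error    # the feedback: correction proportional to the error
        trace.append(speed)
    return trace

smooth = simulate(gain=0.5)  # gentle correction: settles smoothly onto the target
jerky = simulate(gain=1.9)   # overcorrection: overshoots, then lurches back and forth
```

With a gain of 0.5, the error shrinks by half on every tick. With a gain of 1.9, the error changes sign on every tick — the “jerking forward, slamming on the brakes” pattern. Push the gain past 2.0 and the oscillation grows instead of decaying: the loop is unstable, which is essentially what a screaming amplifier at a concert is.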
Our readers are the largest group of DIY biologists ever assembled.
We’ve been having a great time — more than 6,000 downloads, almost 13,000 visits to the landing page, and we don’t know how many people have shared it. Ryan Bethencourt observed that our readers are the largest group of DIY biologists that has ever been assembled. This is big — and we still don’t know how big.
Thanks for a great start! We’re looking forward to a second issue in mid-January. And if you haven’t yet read the first issue of BioCoder, it’s time for you to check it out.
An O'Reilly newsletter covering the biology revolution and connecting the many people working in DIY bio.
We’re pleased to announce BioCoder, a newsletter on the rapidly expanding field of biology. We’re focusing on DIY bio and synthetic biology, but we’re open to anything that’s interesting.
Why biology? Why now? Biology is currently going through a revolution as radical as the personal computer revolution. Up until the mid-70s, computing was dominated by large, extremely expensive machines that were installed in special rooms and operated by people wearing white lab coats. Programming was the domain of professionals. That changed radically with the advent of microprocessors, the Homebrew Computer Club, and the first generation of personal computers. I put the beginning of the shift in 1975, when a friend of mine built a computer in his dorm room. But whenever it started, the phase transition was thorough and radical. We’ve built a new economy around computing: we’ve seen several startups become gigantic enterprises, and we’ve seen several giants collapse because they couldn’t compete with the more nimble startups.
We’re seeing the same patterns in biology today. You can build homebrew lab equipment for a fraction of the price of commercial equipment; we’re seeing amateurs do meaningful research and experimentation; and we’re seeing new tools that radically drop the cost of experimentation. We’re also seeing new startups that have the potential for changing the economy as radically as the advent of inexpensive computing.
BioCoder is the newsletter of the biology revolution. Read more…
Problems with GM foods lie not in genetics, but in the structure of industrial farming.
But that’s really not what the headline said. The GM crops didn’t kill the butterflies — abuse of a herbicide did. It’s very important to distinguish between first order and second order effects. The milkweed would be just as dead if the farmers applied the Roundup directly to the milkweed. And, assuming that the farmers are trying to kill weeds other than milkweed (which only grows at the edges of the field), the caterpillars would survive if farmers applied Roundup more precisely, just to the crops they were trying to protect. Is it safe to eat corn that’s been genetically modified so that it’s Roundup resistant? I have no problem with the genetics; but you might think twice about eating corn that has been doused with a potent herbicide. Do you wash your food carefully? Good.
However one defines "enterprise," what really matters is an organization's culture.
Bill Higgins of IBM and I have been working on an article about DevOps in the enterprise. DevOps is most closely associated with Internet giants and web startups, but increasingly we are observing companies we lump under the banner of “enterprises” trying — and often struggling — to adopt the sorts of DevOps culture and practices we see at places like Etsy. As we tried to catalog the success and failure patterns of DevOps adoption in the enterprise, we ran into an interesting problem: we couldn’t precisely define what makes a company an enterprise. Without a well-understood context, it was hard to diagnose inhibitors or to prescribe any particular advice.
So, we decided to pause our article and turn our minds to the question “What is an enterprise, anyway?” We first tried to define an enterprise based on its attributes, but as you’ll see, these are problematic:
- More than N employees
- Definitions like this don’t interest us. What changes magically when you cross the line between 999 and 1,000 employees? Or 9,999 and 10,000? Wherever you put the line, it’s arbitrary. I’ll grant that 30-person companies work differently from 10,000-person companies, and that 100-person companies have often adopted the overhead and bureaucracy of 10,000-person companies (not a pretty sight). But drawing an arbitrary line in the sand isn’t helpful.
What business leaders need to know about data and data analysis to drive their businesses forward.
A couple of years ago, Claudia Perlich introduced me to Foster Provost, her PhD adviser. Foster showed me the book he was writing with Tom Fawcett, and using in his teaching at NYU.
Foster and Tom have a long history of applying data to practical business problems. Their book, which evolved into Data Science for Business, was different from all the other data science books I’ve seen. It wasn’t about tools: Hadoop and R are scarcely mentioned, if at all. It wasn’t about coding: business students don’t need to learn how to implement machine learning algorithms in Python. It is about business: specifically, it’s about the data analytic thinking that business people need to work with data effectively.
Data analytic thinking means knowing what questions to ask, how to ask those questions, and whether the answers you get make sense. Business leaders don’t (and shouldn’t) do the data analysis themselves. But in this data-driven age, it’s critically important for business leaders to understand how to work with the data scientists on their teams. In today’s business world, it’s essential to understand which algorithms are used for different applications, how statistics are used to create models of human and economic behavior, overfitting and its symptoms, and much more. You might not need to know how to implement a machine learning algorithm, but you do need to understand the ideas the data scientists on your team are using.
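One of those symptoms can be shown in a few lines. The sketch below is a made-up illustration, not an example from the book: a model that memorizes its training data (a one-nearest-neighbor lookup) scores near-perfectly on the data it has seen and noticeably worse on data it hasn’t, while a simple one-parameter fit performs about the same on both. That train/test gap is the classic symptom of overfitting, and it’s exactly the kind of thing a business leader should know to ask about.

```python
import random

random.seed(0)

def noisy_point():
    """The (made-up) ground truth: y = 2x plus Gaussian noise."""
    x = random.uniform(0, 10)
    return x, 2 * x + random.gauss(0, 2)

train = [noisy_point() for _ in range(30)]
test = [noisy_point() for _ in range(30)]

def mse(model, data):
    """Mean squared error of a model over a dataset."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# Simple model: one parameter, a least-squares slope through the origin.
slope = sum(x * y for x, y in train) / sum(x * x for x, _ in train)

def simple(x):
    return slope * x

# Memorizing model: answer with the y of the closest training x (1-nearest-neighbor).
def memorize(x):
    return min(train, key=lambda p: abs(p[0] - x))[1]

print("simple:  ", mse(simple, train), mse(simple, test))      # similar on both
print("memorize:", mse(memorize, train), mse(memorize, test))  # ~0 on train, worse on test
```

The memorizing model looks perfect if you only ever score it on the data it was trained on — which is why asking “how was this evaluated?” matters more than knowing how to implement the algorithm.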
The goal of data science is putting data to work. That’s what Data Science for Business is all about, and the reason I’m excited to see us publishing it. There are many books about data science, and an increasing number of undergraduate and graduate programs in data science. But I haven’t seen anything that teaches data science for the leaders who will be using data to drive their businesses forward.