- It’s Getting Easier for Hackers to Spy on Your Computer When It’s Offline (Vice) — surprisingly readable coverage of determining computer activity from RF signals.
- An Old Fogey’s Analysis of a Teenager’s View on Social Media — Teens’ use of social media is significantly shaped by race and class, geography, and cultural background.
- Putting the Nuclear Option Front and Centre (Tom Armitage) — offering what feels like the nuclear option front and centre, reminding the user that it isn’t a nuclear option. I love this. “Undo” changes your experience profoundly.
- 3D-Printing Carbon Fibre (Makezine) — the machine doesn’t produce angular, stealth-fighter-esque pieces with the telltale CF pattern seen on racing bikes and souped-up Mustangs. Instead, it creates an FDM 3D print out of nylon filament (rather than ABS or PLA), and during the process it layers in a thin strip of carbon fiber, melted into place from carbon fiber fabric using a second extruder head. (It can also add in Kevlar or fiberglass.)
At what layer do we build privacy into the fabric of devices?
Sign up to attend Solid 2015 to explore the convergence of privacy, security, and the Internet of Things.
In 2011, Kashmir Hill, Gizmodo, and others alerted us to a privacy gaffe made by Fitbit, a company that makes small devices to help people keep track of their fitness activities. It turned out that Fitbit had broadcast the sexual activity of quite a few of its users. Realizing this might not sit well with those users, Fitbit took swift action to remove the search hits, the data, and the identities of those affected. Fitbit, like many other companies, believed that all the data it gathered should be public by default. Oops.
Does anyone think this is the last time such a thing will happen?
Fitness data qualifies as “personal,” but sexual data is clearly in the realm of the “intimate.” It might seem like semantics, but the difference is likely to be felt by people in varying degrees. The theory of contextual integrity says that we feel violations of our privacy when informational contexts are unexpectedly or undesirably crossed. Publicizing my latest workout: good. Publicizing when I’m in flagrante delicto: bad. This episode neatly exemplifies how devices are entering spaces where they’ve not trodden before, physically and informationally. Read more…
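The idea behind contextual integrity can be caricatured in a few lines of code: a flow is a violation when data crosses into a context the user did not expect. This is a toy sketch with hypothetical names, not the theory’s formalism and not anything Fitbit actually uses:

```python
# Toy model of contextual integrity: a privacy violation is flagged when
# data flows into a context other than the one the user expected.
# All data types and context names below are hypothetical.

# Norms: (data_type, destination_context) pairs the user would expect.
EXPECTED_FLOWS = {
    ("workout_summary", "public_profile"),     # publicizing a workout: fine
    ("sexual_activity", "private_dashboard"),  # intimate data stays private
}

def violates_contextual_integrity(data_type: str, destination: str) -> bool:
    """True when a data flow crosses into a context the user did not expect."""
    return (data_type, destination) not in EXPECTED_FLOWS

# A Fitbit-style gaffe: intimate data lands on a public, searchable page.
print(violates_contextual_integrity("workout_summary", "public_profile"))  # False
print(violates_contextual_integrity("sexual_activity", "public_profile"))  # True
```

The point of the sketch is that the sensitivity of the data alone doesn’t determine the violation; the pairing of data and context does.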
Security is at the heart of the web.
We want to share. We want to buy. We want help. We want to talk.
At the end of the day, though, we want to be able to go to sleep without worrying that all of those great conversations on the open web will endanger the rest of what we do.
Making the web work has always been a balancing act between enabling and forbidding, remembering and forgetting, and public and private. Managing identity, security, and privacy has always been complicated, both because of the challenges in each of those pieces and the tensions among them.
Complicating things further, the web has succeeded in large part because people — myself included — have been willing to lock their paranoias away so long as nothing too terrible happened.
I talked for years about expecting that the NSA was reading all my correspondence, but finding out that they were indeed filtering pretty much everything opened the door to a whole new set of conversations and concerns about what happens to my information. I made my home address readily available in an IETF RFC document years ago. In an age of doxxing and SWATting, I wonder whether I was smart to do that. As the costs move from my imagination to reality, it’s harder to keep the door to my paranoia closed. Read more…
The best of European and American data privacy initiatives can come together for the betterment of all.
Editor’s note: This is part of a series of posts exploring privacy and security issues in the Internet of Things. The series will culminate in a free webcast by series author Dr. Gilad Rosner, Privacy and Security Issues in the Internet of Things, on February 11, 2015 — reserve your spot today.
As devices become more intelligent and networked, the makers and vendors of those devices gain access to greater amounts of personal data. At one extreme, a washing machine collects data of little consequence — who uses cold versus warm water. But when a device collects biophysical information, location data, movement patterns, and other sensitive information, data collectors have both greater risk and greater responsibility in safeguarding it. The advantages of every company becoming a software company — enhanced customer analytics, streamlined processes, an improved view of resources and impact — will be accompanied by new privacy challenges.
A key question emerges from the increasing intelligence of and monitoring by devices: will the commercial practices that evolved on the web be transferred to the Internet of Things? The amount of control users have over data about them is limited. The ubiquitous end-user license agreement tells people what will and won’t happen to their data, but offers little choice: in most situations, you either consent to have your data used or you take a hike. We do not get to pick and choose how our data is used, except in some blunt cases where we can opt out of certain activities (often a condition forced by regulators). If you don’t like how your data will be used, you can simply elect not to use the service. But what of the emerging world of ubiquitous sensors and physical devices? Will such a take-it-or-leave-it attitude prevail? Read more…