- Rise of the Patent Troll: Everything is a Remix (YouTube) — primer on patent trolls, in language anyone can follow. Part of the fixpatents.org campaign. (via BoingBoing)
- Petabytes of Field Data (GigaOm) — Farm Intelligence using sensors and computer vision to generate data for better farm decision making.
- Bullish on Blockchain (Fred Wilson) — our 2014 fund will be built during the blockchain cycle. “The blockchain” is bitcoin’s distributed consensus system, interesting because it’s the return of p2p from the Chasm of Ridicule or whatever the Gartner Trite Cycle calls the time between first investment bubble and second investment bubble under another name.
- Hemingway — online writing tool to help you make your writing clear and direct. (via Nina Simon)
Government sensor networks can streamline processes, cut labor costs, and improve services.
It’s not news to anyone who works in government that we live in a time of ever-tighter budgets and ever-increasing needs. The 2013 federal shutdown only highlighted this precarious situation: government finds it increasingly difficult to summon the resources and manpower needed to meet its current responsibilities, yet faces new ones after each Congressional session.
Sensor networks are an important emerging technology that some areas of government are already implementing to bridge the widening gap between the demand to reduce costs and the demand to improve services. The Department of Defense, for instance, uses RFID chips to monitor its supply chain more accurately, while the U.S. Geological Survey employs sensors to remotely monitor the bacterial levels of rivers and lakes in real time. Additionally, the General Services Administration has begun using sensors to measure and verify the energy efficiency of “green” buildings (PDF), and the Department of Transportation relies on sensors to monitor traffic and control traffic signals and roadways. All of which is productive, but more needs to be done. Read more…
The IoT requires thinking about how humans and things cooperate differently when things get smarter.
Rod Smith of IBM and I had a call the other day to prepare for our onstage conversation at O’Reilly’s upcoming Solid Conference, and I was surprised to find how much we were in agreement about one idea: so many of the most interesting applications of the Internet of Things involve new ways of thinking about how humans and things cooperate differently when the things get smarter. It really ought to be called the Internet of Things and Humans — #IoTH, not just #IoT!
Let’s start by understanding the Internet of Things as the combination of sensors, a network, and actuators. The “wow” factor — the magic that makes us call it an Internet of Things application — can come from creatively amping up the power of any of the three elements.
For example, a traditional “dumb” thermostat consists of only a sensor and an actuator — when the temperature goes out of the desired range, the heat or air conditioning goes on. The addition of a network, the ability to control your thermostat from your smartphone, say, turns it into a simple #IoT device. But that’s the bare-bones case. Consider the Nest thermostat: where it stands out from the crowd of connected thermostats is that it uses a suite of sensors (temperature, moisture, light, and motion) as well as both onboard and cloud software to provide a rich and beautiful UI with a great deal more intelligence. Read more…
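The sensor–network–actuator framing can be sketched in a few lines of code. The sketch below is purely illustrative (the class and method names are my own, not Nest’s design): the control loop is the sensor/actuator pair of the “dumb” thermostat, and the command handler stands in for the network layer that a smartphone app would talk to.

```python
class Thermostat:
    """Toy model of an IoT device: sensor + network + actuator."""

    def __init__(self, target_f=68.0, band_f=1.0):
        self.target_f = target_f   # desired temperature (set point)
        self.band_f = band_f       # dead band, to avoid rapid on/off cycling
        self.heating = False       # actuator state: is the heat on?

    def read_sensor(self, temp_f):
        """Sensor + actuator: the classic 'dumb' thermostat control loop."""
        if temp_f < self.target_f - self.band_f:
            self.heating = True    # too cold: turn the heat on
        elif temp_f > self.target_f + self.band_f:
            self.heating = False   # warm enough: turn it off
        return self.heating

    def handle_network_command(self, command):
        """Network: a remote set-point change, e.g. from a phone app."""
        if "target_f" in command:
            self.target_f = float(command["target_f"])
```

The “wow” factor the article describes comes from amping up any one of these three parts: richer sensors feeding `read_sensor`, smarter logic deciding `heating`, or a cleverer protocol behind `handle_network_command`.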
Proximity is the ‘Hello World’ of mobility.
As any programmer knows, writing the “hello, world” program is the canonical elementary exercise in any new programming language. Getting devices to interact with the world is the foundation of the Internet of Things, and enabling devices to learn about their surroundings is the “hello world” of mobility.
On a recent trip to Washington, D.C., I attended the first DC iBeacon Meetup. iBeacons are exciting. Retailers are revolutionizing shopping by applying new indoor proximity technologies and developing the physical-world analog of the data that a web-based retailer like Amazon can routinely collect. A few days ago, I tweeted about an analysis of the beacon market, which noted that “[beacons] are poised to transform how retailers, event organizers, transit systems, enterprises, and educational institutions communicate with people indoors” — and could even be used in home automation systems.
I got to see the ground floor of the disruption in action at the meetup in DC, which featured presentations by a few notable local companies, including Radius Networks, the developer of the CES scavenger hunt app for iOS. When I first heard of the app, I almost bought a ticket to Las Vegas to experience it for myself, so it was something of a cool moment to hear about the technology from the developer of an application that I’d admired from afar.
After the presentations, I had a chance to talk with David Helms of Radius. Helms was drawn to work at Radius for the same reason I was compelled to attend the iBeacon meetup. As he put it, “The first step in extending the mobile computing experience beyond the confines of that slab of glass in your pocket is when it can recognize the world around it and interact with it, and proximity is the ‘Hello’ of the Internet of Things revolution.” Read more…
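The core trick behind beacon proximity is simple to sketch. The snippet below is a rough illustration of the log-distance path-loss model commonly used to turn received signal strength into a distance estimate; the constants and zone thresholds are my own illustrative assumptions, not values from Radius Networks or Apple’s iBeacon specification.

```python
def estimate_distance_m(rssi_dbm, measured_power_dbm=-59, path_loss_n=2.0):
    """Rough distance estimate from received signal strength (RSSI).

    measured_power_dbm: calibrated RSSI at 1 meter (an assumed value here).
    path_loss_n: environment-dependent path-loss exponent (~2 in free space).
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_n))

def proximity_zone(distance_m):
    """Bucket a distance into the coarse zones beacon APIs typically report."""
    if distance_m < 0.5:
        return "immediate"
    if distance_m < 4.0:
        return "near"
    return "far"
```

Real implementations smooth RSSI over time and report only these coarse zones, because raw readings fluctuate far too much for precise indoor positioning.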
Sensor Networks, Programming Silliness, Higher Order C, and Meeting Silliness
- Pete Warden on Sensors — We’re all carrying little networked laboratories in our pockets. You see a photo. I see millions of light-sensor readings at an exact coordinate on the earth’s surface with a time resolution down to the millisecond. The future is combining all these signals into new ways of understanding the world, like this real-time stream of atmospheric measurements.
- Quine Relay — This is a Ruby program that generates a Scala program that generates a Scheme program that generates …(through 50 languages)… a REXX program that generates the original Ruby code again.
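The relay chains the basic quine trick across 50 languages. For reference, here is a minimal single-language quine in Python — a program whose output is exactly its own source — just to illustrate the underlying idea (the relay itself is far more elaborate):

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

The `%r` conversion reproduces the string with its quotes and escapes intact, which is what lets the program print itself verbatim.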
- Cello — a GNU99 C library which brings higher level programming to C. Interfaces allow for structured design, Duck Typing allows for generic functions, Exceptions control error handling, Constructors/Destructors aid memory management, Syntactic Sugar increases readability.
- The Meeting (John Birmingham) — satirising the Wall Street Journal’s meeting checklist advice.
Street View Tiles Hacks, Policy Simulation, Map Tile Toolbox, and Connected Sensor Device HowTo
- HyperLapse — this won the Internet for April. Everyone else can go home. Check out this unbelievable video; the source is available, too.
- Housing Simulator — NZ’s largest city is consulting on its growth plan, and includes a simulator so you can decide where the growth to house the hundreds of thousands of predicted residents will come from. Reminds me of NPR’s Budget Hero. Notice that none of the levers control immigration or city taxes to make different cities attractive or unattractive. Growth is a given and you’re left trying to figure out which green fields to pave.
- Converting To and From Google Map Tile Coordinates in PostGIS (Pete Warden) — Google Maps’ system of power-of-two tiles has become a de facto standard, widely used by all sorts of web mapping software. I’ve found it handy to use as a caching scheme for our data, but the PostGIS calls to use it were getting pretty messy, so I wrapped them up in a few functions. Code is on GitHub.
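Warden’s functions are PostGIS-specific, but the underlying power-of-two (“slippy map”) tile math is standard and fits in a few lines. A sketch in Python (function names are mine), assuming the usual Web Mercator tiling where zoom level `z` divides the world into 2^z × 2^z tiles:

```python
import math

def lat_lon_to_tile(lat_deg, lon_deg, zoom):
    """Convert WGS84 lat/lon to power-of-two tile coordinates (x, y)."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

def tile_to_lat_lon(x, y, zoom):
    """Return the lat/lon of a tile's north-west corner."""
    n = 2 ** zoom
    lon_deg = x / n * 360.0 - 180.0
    lat_rad = math.atan(math.sinh(math.pi * (1.0 - 2.0 * y / n)))
    return math.degrees(lat_rad), lon_deg
```

For example, `lat_lon_to_tile(37.7749, -122.4194, 12)` (downtown San Francisco) lands in tile (655, 1583), so the pair makes a compact cache key for any dataset indexed the same way.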
- So You Want to Build A Connected Sensor Device? (Google Doc) — The purpose of this document is to provide an overview of infrastructure, options, and tradeoffs for the parts of the data ecosystem that deal with generating, storing, transmitting, and sharing data. In addition to providing an overview, the goal is to learn what the pain points are, so we can address them. This is a collaborative document drafted for the purpose of discussion and contribution at Sensored Meetup #10. (via Rachel Kalmar)
Why general purpose computing will diffuse into our environment.
I’ve put forward my opinion that desktop computing is dead on more than one occasion, and been soundly put in my place as a result almost every time. “Of course desktop computing isn’t dead — look at the analogy you’re drawing between the so called death of the mainframe and the death of the desktop. Mainframes aren’t dead, there are still plenty of them around!”
Well, yes, that’s arguable. But most people, everyday people, don’t know that. It doesn’t matter if the paradigm survives if it’s not culturally acknowledged. Mainframe computing lives on, buried behind the scenes, backstage. As a platform it performs well, in its own niche. No doubt desktop computing is destined to live on, but similarly behind the scenes, and it’s already fading into the background.
The desktop will increasingly belong to niche users. Developers need desktops, at least for now and for the foreseeable future. But despite the prevalent view in Silicon Valley, the world does not consist of developers. Designers need screen real estate, but buttons and the entire desktop paradigm are a hack; I can foresee the day when the computing designers use will not even vaguely resemble today’s desktop machines.
For the rest of the world? Computing will almost inevitably diffuse out into our environment. Today’s mobile devices are transition devices, artifacts of our stage of technology progress. They too will eventually fade into their own niche. Replacement technologies, or rather user interfaces, like Google’s Project Glass are already on the horizon, and that’s just the beginning.
People never wanted computers; they wanted what computers could do for them. Almost inevitably the amount computers can do for us on their own, behind our backs, is increasing. But to do that, they need data, and to get data they need sensors. So the diffusion of general purpose computing out into our environment is inevitable. Read more…