- Fastly’s S3 Latency Monitor — The graph represents real-time response latency for Amazon S3 as seen by Fastly’s Ashburn, VA edge server. I’ve been watching #sandy’s effect on the Internet in real-time, while listening to its effect on people in real-time. Amazing.
- Button Upgrade (Gizmodo) — elegant piece of button design, for sale on Shapeways.
- Inside a Dozen USB Chargers — amazing differences in such seemingly identical products. I love the comparison between genuine and counterfeit Apple chargers. (via Hacker News)
- Why Products Fail (Wired) — researcher scours the stock market filings of publicly-listed companies to extract information about warranties. Before, even information like the size of the market—how much gets paid out each year in warranty claims—was a mystery. Nobody, not analysts, not the government, not the companies themselves, knew what it was. Now Arnum can tell you. In 2011, for example, basic warranties cost US manufacturers $24.7 billion. Because of the slow economy, this is actually down, Arnum says; in 2007 it was around $28 billion. Extended warranties—warranties that customers purchase from a manufacturer or a retailer like Best Buy—account for an estimated $30.2 billion in additional claims payments. Before Arnum, this $60 billion-a-year industry was virtually invisible. Another hidden economy revealed. (via BoingBoing)
"real-time data" entries
A new report describes an imminent shift in real-time applications and the data architecture they require.
The era is here: we’re starting to see computers making decisions that people used to make, through a combination of historical and real-time data. These streams of data come together in applications that answer questions like:
- What news items or ads is this website visitor likely to be interested in?
- Is current network traffic part of a Distributed Denial of Service attack?
- Should our banking site offer a visitor a special deal on a mortgage, based on her credit history?
- What promotion will entice this gamer to stay on our site longer?
- Is a particular part of the assembly line overheating, and does it need to be shut down?
Such decisions require the real-time collection of data from the particular user or device, along with others in the environment, and often need to be made on a per-person or per-event basis. For instance, leaderboarding (determining who is the top candidate among a group of users, based on some criteria) requires a database that tracks all the relevant users. Such a database nowadays often resides in memory. Read more…
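To make the leaderboarding example concrete, here is a minimal sketch of an in-memory leaderboard in plain Python. The `Leaderboard` class and its method names are illustrative inventions, not any particular product's API; real deployments typically use an in-memory store such as Redis sorted sets for the same pattern.

```python
import heapq

class Leaderboard:
    """Minimal in-memory leaderboard: track a score per user, query the top N."""

    def __init__(self):
        self.scores = {}  # user -> latest score, held entirely in memory

    def update(self, user, score):
        self.scores[user] = score  # overwrite with the most recent score

    def top(self, n):
        # heapq.nlargest keeps top-N queries fast without fully sorting
        return heapq.nlargest(n, self.scores.items(), key=lambda kv: kv[1])

board = Leaderboard()
board.update("alice", 120)
board.update("bob", 95)
board.update("carol", 150)
print(board.top(2))  # → [('carol', 150), ('alice', 120)]
```

The per-event update is O(1) and the top-N query touches only the in-memory dictionary, which is why this workload suits memory-resident databases.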
The Internet of Things allows for real-time data monitoring, which is crucial to regulatory reform.
One under-appreciated aspect of the changing relationship between the material world and software is that material goods can and will fail — sometimes with terrible consequences.
What if government regulations were web-based and mandated inclusion of Internet-of-Things technology that could actually stop a material failure, such as a pipeline rupture or automotive failure, while it was in its earliest stages and hadn’t caused harm? Even more dramatically, what if regulations could even prevent failures from happening at all?
With such a system, we could avoid or minimize disasters — from Malaysia Airlines Flight 370’s disappearance to the auto-safety debacles at GM to a possible leak if the Keystone XL pipeline is built — while the companies using this technology could simultaneously benefit in a variety of profitable ways. Read more…
New tools make it easier for companies to process and mine streaming data sources
Stream processing was on the minds of several people I ran into over the past week. A combination of new systems, deployment tools, and enhancements to existing frameworks is behind the recent chatter. Through simpler deployment tools, programming interfaces, and libraries, recently released tools make it easier for companies to process and mine streaming data sources.
Of the distributed stream processing systems that are part of the Hadoop ecosystem, Storm is by far the most widely used (more on Storm below). I’ve written about Samza, a new framework from the team that developed Kafka (an extremely popular messaging system). Many companies that use Spark express interest in using Spark Streaming (many have already done so). Spark Streaming is distributed, fault-tolerant, stateful, and boosts programmer productivity (the same code used for batch processing can, with minor tweaks, be used for real-time computations). But it targets applications with second-scale latencies. Both Spark Streaming and Samza have their share of adherents, and I expect that they’ll both start gaining deployments in 2014.
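The micro-batch model behind Spark Streaming can be sketched in a few lines of plain Python: the stream is chopped into small batches, and batch-style logic runs on each one while state carries across intervals. This is an illustration of the model only, not the Spark API; the function name and sample stream are made up for the example.

```python
from collections import Counter

def microbatch_counts(batches):
    """Process a stream as a sequence of small batches, carrying state
    across intervals -- the stateful micro-batch model (second-scale
    latency) described above. Plain Python, not the Spark API."""
    running = Counter()           # state preserved between micro-batches
    for batch in batches:         # each batch: events from one interval
        running.update(batch)     # the same logic a batch job would use
        yield dict(running)       # emit updated totals after each interval

# A toy "stream" of event batches, one list per collection interval.
stream = [["click", "view"], ["click"], ["view", "view", "click"]]
for snapshot in microbatch_counts(stream):
    print(snapshot)
```

The point of the sketch is the reuse claim in the text: `running.update(batch)` is ordinary batch aggregation code, applied unchanged to each interval's worth of data.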
Remote monitoring appeals to management, but good applications create value for those being monitored as well.
The industrial Internet makes data available at levels of frequency, accuracy and breadth that managers have never seen before, and the great promise of this data is that it will enable improvements to the big networks from which it flows. Huge systems can be optimized by taking into account the status of every component in real time; failures can be preempted; deteriorating performance can be detected and corrected.
But some of this intelligence can be a hard sell to those being monitored and measured, who worry that hard-learned discretion might be overridden by an engineer’s idealistic notion of how things should work. In professional settings, workers worry about loss of agency, and they imagine that they’ll be micro-managed on any minor variation from normal. Consumers worry about loss of privacy.
The best applications of the industrial Internet handle this conflict by generating value for those being monitored as well as those doing the monitoring — whether by giving workers the information they need to improve the metrics that their managers see, or by giving consumers more data to make better decisions.
Fort Collins (Colo.) Utilities, for instance, expects to recoup its $36 million investment in advanced meters in 11 years through operational savings. The meters obviate the need for meter readers, and the massive amounts of data they produce are useful for detecting power failures and predicting when transformers will need to be replaced before a damaging explosion occurs. Read more…
AWS Redshift and BitYota launch, big data's problems could shift to real time, and NYPD may be crossing a line with cellphone records.
Here are a few stories from the data space that caught my attention this week.
Amazon, BitYota launch data warehousing services
Amazon announced the beta launch of its Amazon Web Services data warehouse service Amazon Redshift this week. Paul Sawers at The Next Web reports that Amazon hopes to democratize data warehousing services, offering affordable options to make such services viable for small businesses while enticing large companies with cheaper alternatives. Depending on the service plan, customers can launch Redshift clusters scaling to more than a petabyte for less than $1,000 per terabyte per year.
So far, the service has drawn in some big players — Sawers notes that the initial private beta has more than 20 customers, including NASA/JPL, Netflix, and Flipboard.
Brian Proffitt at ReadWrite took an in-depth look at the service, noting its potential speed capabilities and the importance of its architecture. Proffitt writes that Redshift’s massively parallel processing (MPP) architecture “means that unlike Hadoop, where data just sits cheaply waiting to be batch processed, data stored in Redshift can be worked on fast — fast enough for even transactional work.”
Proffitt also notes that Redshift isn’t without its red flags, pointing out that a public cloud service not only raises issues of data security, but of the cost of data access — the bandwidth costs of transferring data back and forth. He also raises concerns that this service may play into Amazon’s typical business model of luring customers into its ecosystem bits at a time. Proffitt writes:
“If you have been keeping your data and applications local, shifting to Redshift could also mean shifting your applications to some other part of the AWS ecosystem as well, just to keep the latency times and bandwidth costs reasonable. In some ways, Redshift may be the AWS equivalent of putting the milk in the back of the grocery store.”
In related news, startup BitYota also launched a data warehousing service this week. Larry Dignan reports at ZDNet that BitYota is built on a cloud infrastructure and uses SQL technology, and that service plans will start at $1,500 per month for 500GB of data. As to competition with AWS Redshift, BitYota co-founder and CEO Dev Patel told Dignan that it’s a non-issue: “[Redshift is] not a competitor to us. Amazon is taking the traditional data warehouse and making it available. We focus on a SaaS approach where the hardware layer is abstracted away,” he said.
Sandy's Latency, Better Buttons, Inside Chargers, and Hidden Warranties
Startups tap realtime marketing, NFC in the U.K.'s post office, and banks need to remain "top of wallet."
An ecommerce startup aims to understand real-time consumer intent, a 350-year-old post office embraces mobile, and mobile wallets could disrupt brick-and-mortar banks. (Commerce Weekly is produced as part of a partnership between O'Reilly and PayPal.)