Four short links: 3 March 2015

Wearable Warning, Time Series Data, App Cards, and Secure Comms

  1. You Guys Realize the Apple Watch is Going to Flop, Right? — leaving aside the “guys” assumption of its readers, you can take this either as a list of the challenges Apple will inevitably overcome or bypass when they release their watch, or (as intended) a list of the many reasons that it’s too damn soon for watches to be useful. The Apple Watch is Jonathan Ive’s new Newton. It’s a potentially promising form that’s being built about 10 years before Apple has the technology or infrastructure to pull it off in a meaningful way. As a result, the novel interactions that could have made the Apple Watch a must-have device aren’t in the company’s launch product, nor are they on the immediate horizon. And all Apple can sell the public on is a few tweets and emails on their wrists—an attempt at a fashion statement that needs to be charged once or more a day.
  2. InfluxDB, Now With Tags and More Unicorns — The combination of these new features [tagging, and the use of tags in queries] makes InfluxDB not just a time series database, but also a database for time series discovery. It’s our solution for making the problem of dealing with hundreds of thousands or millions of time series tractable. (A query sketch follows this list.)
  3. The End of Apps as We Know Them — It may be very likely that the primary interface for interacting with apps will not be the app itself. The app is primarily a publishing tool. The number one way people use your app is through this notification layer, or aggregated card stream. Not by opening the app itself. To which one grumpy O’Reilly editor replied, “cards are the new walled garden.”
  4. Signal 2.0 — Signal uses your existing phone number and address book. There are no separate logins, usernames, passwords, or PINs to manage or lose. We cannot hear your conversations or see your messages, and no one else can either. Everything in Signal is always end-to-end encrypted, and painstakingly engineered in order to keep your communication safe.
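To make item 2 concrete, here is a minimal sketch of writing and querying tagged series with the influxdb Python client against an InfluxDB 0.9 instance; the database name, measurement, tag values, and connection details are illustrative assumptions, not taken from the announcement.

```python
from influxdb import InfluxDBClient

# All names below are illustrative; point this at your own InfluxDB 0.9+ instance.
client = InfluxDBClient("localhost", 8086, "root", "root", "metrics")

# A point's identity now includes indexed tags, not just the measurement name.
client.write_points([{
    "measurement": "cpu_load",
    "tags": {"host": "server01", "region": "us-west"},
    "fields": {"value": 0.64},
}])

# Because tags are indexed, queries can filter and group by them;
# this is the "time series discovery" the post is describing.
result = client.query(
    "SELECT MEAN(value) FROM cpu_load WHERE region = 'us-west' GROUP BY host"
)
print(result)
```

Grouping and filtering on an indexed tag, rather than encoding that metadata into the series name, is what is meant to make sifting through hundreds of thousands of series tractable.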
Four short links: 2 March 2015

Onboarding UX, Productivity Vision, Bad ML, and Lifelong Learning

  1. User Onboarding Teardowns — the UX of new users. (via Andy Baio)
  2. Microsoft’s Productivity Vision — always-on thinged-up Internet everywhere, with predictions and magic by the dozen.
  3. Machine Learning Done Wrong — When dealing with small amounts of data, it’s reasonable to try as many algorithms as possible and to pick the best one since the cost of experimentation is low. But as we hit “big data,” it pays off to analyze the data upfront and then design the modeling pipeline (pre-processing, modeling, optimization algorithm, evaluation, productionization) accordingly. (A minimal pipeline sketch follows this list.)
  4. Ten Simple Rules for Lifelong Learning According to Richard Hamming (PLoScompBio) — Exponential growth of the amount of knowledge is a central feature of the modern era. As Hamming points out, since the time of Isaac Newton (1642/3-1726/7), the total amount of knowledge (including but not limited to technical fields) has doubled about every 17 years. At the same time, the half-life of technical knowledge has been estimated to be about 15 years. If the total amount of knowledge available today is x, then in 15 years the total amount of knowledge can be expected to be nearly 2x, while the amount of knowledge that has become obsolete will be about 0.5x. This means that the total amount of knowledge thought to be valid has increased from x to nearly 1.5x. Taken together, this means that if your daughter or son was born when you were 34 years old, the amount of knowledge she or he will be faced with on entering university at age 17 will be more than twice the amount you faced when you started college. (The arithmetic is worked through after this list.)
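For item 3, a minimal sketch of what “design the modeling pipeline accordingly” can look like in practice, using scikit-learn; the dataset and estimator choices are illustrative assumptions rather than anything from the linked post.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in dataset; the point is committing to one deliberately designed
# pipeline rather than trying every algorithm and picking the best score.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("preprocess", StandardScaler()),               # pre-processing
    ("model", LogisticRegression(max_iter=1000)),   # modeling / optimization algorithm
])
pipeline.fit(X_train, y_train)
print("held-out accuracy:", pipeline.score(X_test, y_test))  # evaluation
```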
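And for item 4, the quoted doubling and half-life figures worked through as a small script; the 17- and 15-year constants come straight from the excerpt, and the exact exponentials land a little below the article’s rounded “nearly 2x” and “nearly 1.5x.”

```python
# Figures quoted in the article: total knowledge doubles about every 17 years,
# and technical knowledge has a half-life of about 15 years.
doubling_period = 17.0
half_life = 15.0
horizon = 15.0  # years from now, matching the article's worked example

x = 1.0  # normalize today's total knowledge to x

total_later = x * 2 ** (horizon / doubling_period)   # ~1.84x (the article rounds to "nearly 2x")
surviving   = x * 0.5 ** (horizon / half_life)       # 0.50x of today's stock still valid
obsolete    = x - surviving                          # ~0.50x ("about 0.5x")
valid_later = total_later - obsolete                 # ~1.34x (rounded up to "nearly 1.5x")

print(f"total knowledge in {horizon:.0f} years:  {total_later:.2f}x")
print(f"of today's knowledge, obsolete: {obsolete:.2f}x")
print(f"knowledge thought valid then:   {valid_later:.2f}x")
```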
Four short links: 27 February 2015

No Estimates, Brand Advertising, Artificial Intelligence, and GPG BeGone

  1. #NoEstimates — Allspaw also points out that the yearning to break the bonds of estimation is nothing new — he’s fond of quoting a passage from The Unwritten Laws of Engineering, a 1944 manual which says that engineers “habitually try to dodge the irksome responsibility for making commitments.” All of Allspaw’s segment is genius.
  2. Old Fashioned Snapchat — get a few drinks in any brand advertiser and they’ll admit that the number one reason they know that brand advertising works is that, if they stop, sales inevitably drop.
  3. Q&A With Bruce Sterling on Artificial Intelligence — in which Sterling sounds intelligent, and the questioner sounds Artificial.
  4. GPG and Me (Moxie Marlinspike) — Even though GPG has been around for almost 20 years, there are only ~50,000 keys in the “strong set,” and less than 4 million keys have ever been published to the SKS keyserver pool. By today’s standards, that’s a shockingly small user base for a month of activity, much less 20 years. This was a great talk at Webstock this year.
Four short links: 26 February 2015

Autocompletion, Colliding Trends, Microservices, and Writing Useful Code

  1. awesomplete — MIT-licensed, ultra-lightweight, usable, beautiful autocomplete with zero dependencies, written in JavaScript.
  2. How to Seize the Opportunities when Megatrends Collide — excuse the cheesy title; the PwC chart showing pairwise combinations of trends is interesting.
  3. Adopting Microservices at Netflix: Lessons for Architectural Design — you want to think of servers like cattle, not pets. If you have a machine in production that performs a specialized function, and you know it by name, and everyone gets sad when it goes down, it’s a pet. Instead you should think of your servers like a herd of cows. What you care about is how many gallons of milk you get. If one day you notice you’re getting less milk than usual, you find out which cows aren’t producing well and replace them. People for the Ethical Treatment of Iron, your time has come!
  4. Your Job is Not to Write Code (Laura Klein) — I know what you’re thinking. This will all take so long! I’ll be so much less effective! This isn’t true. You’ll be far more effective because you will actually be doing your job. Amen to it all.
Four short links: 25 February 2015

Bricking Cars, Mapping Epigenome, Machine Learning from Encrypted Data, and Phone Privacy

  1. Remotely Bricking Cars (BoingBoing) — story from 2010 where an intruder illegally accessed Texas Auto Center’s Web-based remote vehicle immobilization system and one by one began turning off their customers’ cars throughout the city.
  2. Beginning to Map the Human Epigenome (MIT) — Kellis and his colleagues report 111 reference human epigenomes and study their regulatory circuitry, in a bid to understand their role in human traits and diseases. (The paper itself.)
  3. Machine Learning Classification over Encrypted Data (PDF) — It is worth mentioning that our work on privacy-preserving classification is complementary to work on differential privacy in the machine learning community. Our work aims to hide each user’s input data to the classification phase, whereas differential privacy seeks to construct classifiers/models from sensitive user training data that leak a bounded amount of information about each individual in the training data set. See also The Morning Paper’s unpacking of it.
  4. Privacy of Phone Audio (Reddit) — unconfirmed report from a Redditor: I started a new job today with Walk N’Talk Technologies. I get to listen to sound bites and rate how the text matches up with what is said in an audio clip, and give feedback on what should be improved. At first, I thought these sound bites were completely random. Then I began to notice a pattern. Soon, I realized that I was hearing people’s commands given to their mobile devices. Guys, I’m telling you, if you’ve said it to your phone, it’s been recorded…and there’s a damn good chance a 3rd party is going to hear it.
Four short links: 24 February 2015

Open Data, Packet Dumping, GPU Deep Learning, and Genetic Approval

  1. Wiki New Zealand — open data site, and check out the chart builder behind the scenes for importing the data. It’s magic.
  2. stenographer (Google) — open source packet dumper for capturing data during intrusions.
  3. Which GPU for Deep Learning? — a lot of numbers. Overall, I think memory size is overrated. You can nicely gain some speedups if you have very large memory, but these speedups are rather small. I would say that GPU clusters are nice to have, but that they cause more overhead than they accelerate progress; a single 12GB GPU will last you for 3-6 years; a 6GB GPU is plenty for now; a 4GB GPU is good but might be limiting on some problems; and a 3GB GPU will be fine for most research that looks into new architectures.
  4. 23andMe Wins FDA Approval for First Genetic Test — as they re-enter the market after an FDA power play around approval (yes, I know: one company’s power play is another company’s flouting of safeguards designed to protect a vulnerable public).