We need to build APIs for Things that are interoperable — we need an application layer for the IoT.
Register for our free webcast “Building IoT Systems with Web Standards,” which will be hosted by Vlad Trifa and Dominique Guinard on December 8, 2015, at 10 a.m. PT.
When the term “IoT” was first coined, the idea was to move from a model in which humans generate the data, manually bridging the gaps between the physical and virtual worlds, to a model in which the Things themselves gather the data.
Fifteen years later, we’re moving in the right direction to make this a reality, but several challenges remain. One major challenge is interoperability: many Things do talk over the Internet, but they don’t speak the same language. Having been involved in the IoT for about as long as it’s been around, I’m pretty sure of one thing: a universal networking protocol for the IoT will never exist, and for good reason! The IoT is a vast world in which the needs of one field (e.g. Industry 4.0) are fundamentally different from those of another (e.g. the smart home). As a consequence, the list of automation protocols is actually growing, not shrinking.
A consequence of these different needs is the focus on the connectivity aspect of the IoT. This is not unusual, but as we ascend the pyramid of IoT needs, we must think about the data interoperability of Things. We need to build APIs for Things that are interoperable; in short, we need an application layer for the IoT.
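To make the idea of an application layer concrete, here is a minimal sketch of a Thing exposing its state as plain JSON over web-friendly conventions. Every name and the payload shape here are invented for illustration, not taken from any particular standard; the point is only that once a Thing speaks HTTP and JSON, any client can consume it without knowing the radio protocol underneath.

```python
import json

def thing_description(name, properties):
    """Build a JSON description of a Thing and its current readings,
    the kind of payload a web-friendly Thing might serve at a
    well-known URL. Field names are illustrative, not from any
    specific specification."""
    return json.dumps({
        "name": name,
        "properties": {
            key: {"value": value, "href": f"/properties/{key}"}
            for key, value in properties.items()
        },
    })

# Any HTTP client, or another Thing, can parse this payload directly.
payload = thing_description("kitchen-sensor",
                            {"temperature": 21.5, "humidity": 0.48})
print(payload)
```

The per-property `href` hints at the other half of the web approach: each piece of a Thing’s state gets its own addressable resource, so standard web tooling (caching, links, access control) applies for free.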
What biotech can learn from the open source work of Tesla, Google, and Red Hat.
When building a biotech start-up, there is a certain inevitability to every conversation you will have. Whether you’re talking to investors, accelerators, academics, friends, or baristas, the first two questions will be: “What do you want to do?” and “Have you got a patent yet?”
Almost everything revolves around getting IP protection in place, and patent lawyer meetings are usually the first sign that your spin-off is on the way. But what if there was a way to avoid the patent dance, relying instead on implementation? It seems somewhat utopian, but there is a precedent in the technology world: open source.
What is open source? Essentially, any software whose source code (the underlying program) is available for anyone to inspect, modify, and distribute. This means that, unlike typical proprietary development processes, it lends itself to collaborative development among large groups, often spread across great distances. From humble beginnings, the open source movement has grown to the point of providing operating systems (e.g. Linux), Internet browsers (Firefox), 3D modelling software (Blender), monetary alternatives (Bitcoin), and even home automation platforms (openHAB).
Money, money, money…
The obvious question, then, is: “OK, but how do they make money?” The answer lies not in attempting to profit from the software code itself, but from its implementation and from the applications built on top of it. On the implementation side, take Red Hat Inc., a multinational software company in the S&P 500 with a market cap of $14.2 billion, which produces the extremely popular Red Hat Enterprise Linux distribution. Although the distribution is open source and freely available, Red Hat makes its money by selling a thoroughly bug-tested operating system and then contracting to provide support for 10 years. Thus, businesses are not buying the code; they are buying a rapid response to any problems.
The O'Reilly Radar Podcast: Zoe Keating on retaining control, making money, and remaining optimistic in the music industry.
Subscribe to the O’Reilly Radar Podcast to track the technologies and people that will shape our world in the years to come.
Keating talks about why she chooses to retain total control over her music rather than signing with a label, shares her thoughts on the changing landscape of how artists get paid, and explains why she’s optimistic about the future of music.
Here are a few highlights from their conversation:
The digital tracking of music isn’t necessarily fully formed yet. I think we’re actually still in the infancy of what the industry is going to be like. We’ve had a great decade of disruption that has benefited artists like me and Amanda [Palmer]. I’m not sure that she or I could have had the kinds of careers we’ve had without this thing that has been bad for some other artists. I don’t think that things are fully realized yet. It’s still the case that the majority of the money is going to people other than the content creators.
I’m the only artist on the face of the earth who’s never made a video.
It’s almost like we need music futurists to try to envision the future: what might your life be like? How might it work? How might you make money? Just think about it in a free-form way, without all the emotion attached to it, because you’ve probably experienced numerous music conferences where the discussion devolves into name calling.
The O’Reilly Data Show podcast: Evangelos Simoudis on data mining, investing in data startups, and corporate innovation.
Subscribe to the O’Reilly Data Show Podcast to explore the opportunities and techniques driving big data and data science.
Can developments in data science and big data infrastructure drive corporate innovation? To be fair, many companies are still in the early stages of incorporating these ideas and tools into their organizations.
Evangelos Simoudis has spent many years interacting with entrepreneurs and executives at major global corporations. Most recently, he’s been advising companies interested in developing long-term strategies pertaining to big data, data science, cloud computing, and innovation. He began his career as a data mining researcher and practitioner, and is counted among the pioneers who helped data mining technologies get adopted in industry.
In this episode of the O’Reilly Data Show, I sat down with Simoudis and we talked about his thoughts on investing, data applications and products, and corporate innovation:
Open source software companies
I very much appreciate open source. I encourage my portfolio companies to use open source components as appropriate, but I’ve never seen the business model as being one that is particularly easy to really build the companies around them. Everybody points to Red Hat, and that may be the exception, but I have not seen companies that have, on the one hand, remained true to the open source principles and become big and successful companies that do not require constant investment. … The revenue streams never prove to be sufficient for building big companies. I think the companies that get started from open source in order to become big and successful … [are] ones that, at some point, decided to become far more proprietary in their model and in the services that they deliver. Or they become pure professional services companies as opposed to support services companies. Then they reach the necessary levels of success.
Comparing different orchestration tools.
Most software systems evolve over time: new features are added and old ones pruned. Fluctuating user demand means an efficient system must be able to quickly scale resources up and down. Demands for near-zero downtime require automatic failover to pre-provisioned backup systems, normally in a separate data centre or region.
On top of this, organizations often have multiple such systems to run, or need to run occasional tasks, such as data mining, that are separate from the main system but require significant resources or need to talk to it.
When using multiple resources, it is important to make sure they are used efficiently, not left sitting idle, while still being able to cope with spikes in demand. Balancing cost-effectiveness against the ability to scale quickly is a difficult task that can be approached in a variety of ways.
All of this means that the running of a non-trivial system is full of administrative tasks and challenges, the complexity of which should not be underestimated. It quickly becomes impossible to look after machines on an individual level; rather than patching and updating machines one-by-one they must be treated identically. When a machine develops a problem it should be destroyed and replaced, rather than nursed back to health.
Various software tools and solutions exist to help with these challenges. Let’s focus on orchestration tools, which make all the pieces work together: they work with the cluster to start containers on appropriate hosts and connect them to one another. Along the way, we’ll consider two important features: scaling and automatic failover.
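To make the scaling trade-off concrete, here is a toy sketch of the decision an autoscaler repeatedly makes: pick a replica count that moves average utilization toward a target, bounded by a floor (so a backup instance is always running for failover) and a ceiling (so cost stays capped). The function name and thresholds are invented for illustration; real orchestrators implement far more sophisticated versions of this loop.

```python
import math

def desired_replicas(current, avg_utilization,
                     target=0.6, min_replicas=2, max_replicas=20):
    """Proportional scaling rule: if replicas are running hotter than
    the target utilization, add more; if cooler, remove some.
    The floor preserves redundancy for failover; the ceiling caps
    cost. All thresholds here are illustrative."""
    wanted = math.ceil(current * avg_utilization / target)
    return max(min_replicas, min(max_replicas, wanted))

# A spike in demand scales the service up...
print(desired_replicas(4, 0.9))
# ...and a quiet period scales it back down, but never below the floor.
print(desired_replicas(4, 0.1))
```

This is exactly the cost-versus-headroom balance described above: a higher target squeezes more out of each machine but leaves less slack for sudden spikes, while a lower target buys responsiveness at the price of idle capacity.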
The O’Reilly Solid Podcast: Entrepreneurship, niche product development, and spotting business opportunities.
Subscribe to the O’Reilly Solid Podcast for insight and analysis about the Internet of Things and the worlds of hardware, software, and manufacturing.
Many of the hardware creators we speak with come into their work through the enthusiast route: they start with an engineering problem they want to solve or a piece of technology they think is interesting, then look for an application that would support a business.
Julia Ko, our guest on this week’s episode of the Solid Podcast, started her company SurePod a very different way. She saw the business opportunity first, studying wholesale mobile contracts and the sales networks that distribute medical devices, and she developed a plan for a simplified mobile phone for older people. Only then did she learn the technical aspects of hardware production.
In this episode, we talk about Ko’s development as an entrepreneur, the challenge of creating a product for which you aren’t the target audience, and the best mobile phone carrier (Ko says it’s AT&T).