About 8 years ago Apple steamrolled the mobile telephony industry with the first iPhone. They drastically disrupted the symbiosis of operators and device manufacturers. Nokia and Motorola did not survive the mayhem, and knowing the operators of 15 years ago, they still have not recovered.
After spending three days at the ETSI OneM2M workshop in Nice, I am wondering if history will repeat itself. It feels like the telecom industry never analyzed why Apple ate its lunch, nor thought about how to defend itself against the next attack. In the workshop the work is about standardizing protocols, abstract semantic reference models, and maybe some open source influence. The underlying rationale is the somewhat tired lesson that collaborating on protocols will enable interoperability, which will grow the pie many times over. True, but how do we prevent another Apple from coming along and stealing the pie from under our noses?
Apple succeeded so easily because it hit the soft underbelly of the mobile telecom industry: software. Software was proprietary in the telecom industry; protocols were paramount. Only after NTT Docomo succeeded in generating revenue from applications did the industry enable a severely crippled software model on the phones. I participated in an attempt by Motorola, Nokia, IBM, and others to set a better software standard based on OSGi just before the iPhone hit. I can assure you that we didn't stand a chance because the focus was on irrelevant aspects like managing the device, constraining the application developer, and lowering the cost. Instead, the focus should have been on what independent developers could do with a programmable device.
The rest is history.
The iPhone enabled Facebook, WhatsApp, Google Maps, and the millions of other applications because anybody could write cool applications for it, which is the truest source of innovation.
The telecom industry is now sitting on the fence of a huge new market: the Internet of Things. The industry is eminently suited to provide the connectivity and has front-row access to the humongous pie of IoT services. But instead of learning the lessons from mobile telephony, the industry seems set to let history repeat itself.
It is the software, stupid!
Peter Kriens
by Tim Verbelen, iMinds, Ghent University
One year ago, I was speaking at my first EclipseCon Europe about the results of my PhD research (Mobilizing the Cloud with AIOLOS). Since then, I have been working for iMinds as a research engineer. iMinds is a digital research centre in Belgium that joins the 5 Flemish universities in conducting strategic and applied research in areas such as ICT, Media and Health.
This makes iMinds uniquely positioned to bring together multi-disciplinary teams to work on various emerging topics. I myself work within the iMinds IoT lab, which covers a wide range of topics related to IoT, ranging from antenna design and wireless MAC protocols to software security and distributed computing, the last being my main expertise.
In our research, we try not only to come up with theoretical solutions, but also to create tangible results in the form of demonstrations and proofs of concept. As a researcher, you get more freedom in choosing which technology to use to build your solutions, in contrast to industry, where you are often tied to a lot of legacy software. For IoT, the choice of OSGi was made early on, and it has proved to be a good fit for a lot of IoT requirements.
One challenge in IoT environments is the hardware heterogeneity you have to cope with. You need to deploy (parts of) software on a wide variety of devices, ranging from embedded devices up to high-end servers in the Cloud. As OSGi was initially designed for service gateways, it is well suited to run even on lower-end devices, and the recent work in the Enterprise Expert Group has also equipped it with a lot of features for operating in a server environment. This makes OSGi a perfect fit as the base for an IoT platform, as its modularity allows you to pick and place the software modules you need on any of your devices.
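To make that concrete, here is a minimal sketch of the bundle metadata that drives this modularity; the bundle and package names are invented for illustration. A bundle only declares what it exports and imports, so the same artifact can run unchanged on an embedded gateway or a cloud server, as long as its dependencies resolve there:

    Bundle-ManifestVersion: 2
    Bundle-SymbolicName: com.example.iot.sensor
    Bundle-Version: 1.0.0
    Export-Package: com.example.iot.sensor.api;version="1.0.0"
    Import-Package: org.osgi.framework;version="[1.7,2)"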
A second challenge, and one where OSGi really shines, is the dynamics you have to cope with when developing IoT applications. As the physical world is constantly changing, you need to adapt at runtime to new devices coming online and other devices disappearing. This becomes incredibly hard to manage in software as the complexity increases. The OSGi bundle and service model already handles these dynamics and offers the developer a nice and easy way to cope with them, for example using Declarative Services.
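As a small sketch of what this looks like with Declarative Services (the TemperatureSensor interface and the component below are invented for this example), a component can track sensor services that appear and disappear at runtime without any hand-written service tracking:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    import org.osgi.service.component.annotations.Component;
    import org.osgi.service.component.annotations.Reference;
    import org.osgi.service.component.annotations.ReferenceCardinality;
    import org.osgi.service.component.annotations.ReferencePolicy;

    // Hypothetical sensor service interface, assumed for this sketch.
    interface TemperatureSensor {
        String getId();
        double read();
    }

    @Component
    public class SensorMonitor {

        private final Map<String, TemperatureSensor> sensors = new ConcurrentHashMap<>();

        // Called whenever a TemperatureSensor service appears at runtime.
        @Reference(cardinality = ReferenceCardinality.MULTIPLE,
                   policy = ReferencePolicy.DYNAMIC)
        void addSensor(TemperatureSensor sensor) {
            sensors.put(sensor.getId(), sensor);
        }

        // Called automatically when that service is unregistered,
        // for example because the device went offline.
        void removeSensor(TemperatureSensor sensor) {
            sensors.remove(sensor.getId());
        }
    }

The dynamic policy means the component keeps running while sensors come and go; the developer only writes the bind and unbind callbacks.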
Third, OSGi offers a nice solution for software distribution with the Remote Services specification. This lets you delay the decision about which part has to run on which device until deployment time or even runtime, instead of fixing it at development time. This gives you a lot more flexibility when deploying complex applications on a distributed infrastructure.
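As a hedged illustration (the CameraService interface is made up, and the actual wiring depends on whichever distribution provider is installed), marking a service for remote export is just a matter of adding the standard Remote Services property, so the placement decision stays out of the application code:

    import org.osgi.service.component.annotations.Component;

    // Hypothetical service interface; in practice it would live in a shared API bundle.
    interface CameraService {
        byte[] capture();
    }

    // The standard service.exported.interfaces property marks this service for export;
    // the installed distribution provider decides how to expose it over the network.
    @Component(property = "service.exported.interfaces=*")
    public class RemoteCameraService implements CameraService {

        @Override
        public byte[] capture() {
            // ... capture an image on the local device ...
            return new byte[0];
        }
    }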
In order to even better match IoT industry requirements, the OSGi Alliance has recently started an IoT Expert Group, which will build on the already available specification work and where additional IoT-specific RFPs can be submitted to become part of the OSGi specification.
In my talk, I will present and demo some IoT use cases we have developed, and illustrate how we really benefit from using OSGi. If you are interested in OSGi and/or IoT, you are invited to attend my session, OSGi for IoT: the good, the bad and the ugly, at EclipseCon 2015 in Ludwigsburg.