IoT: Looking For The Next Microsoft

I am not here to tell you how wonderful it is that we have IoT now, or how IoT helps everyone sell more devices and CPUs. Instead, I will take this opportunity to talk about what is missing in IoT and where we are going in this area.

We need to put everything in perspective. Even though most of us see IoT as a new trend in the computer industry, IoT is just another natural step in the evolution of computers, much like parallel computing, the iPad, and perceptual computing. In fact, all of these "new" topics have something in common: they were already there 30 years ago. Consider parallel computing. Originally it was hardware based (using interrupts); today it is software based, but the basic concepts had to be rediscovered and researched all over again. I now see the same pattern with IoT. It took us a few years to reinvent parallel programming, only to find out that it was already there in dusty old books. With your permission, I will try to take the shortcut this time: my journey with IoT will start immediately with the old and dusty books.

Back in the 1970s there were three basic ways to build an electronic product: analog, digital, and processor based. Today most devices incorporate all three, mostly following this rule: use a processor whenever you can; use digital hardware only when you have to; and resort to analog components only when there is no way to avoid them. Back then the rule was reversed: analog hardware was simple to understand and cheap; digital hardware added complexity, so you used it only when you had to; and a processor was expensive, power hungry, and complex to design in, so you never used one unless you really saw a good reason to do so.

Computers started with 8-bit bytes... or 9 bits, or 7 bits, among other alternatives. There were several CPU buses, and using the CPU to move data was slow and wasteful of CPU time. There were many types of peripheral devices, and every computer had to ship with the correct drivers for its peripherals. This is where we are today with IoT.

Then came IBM with Intel and Microsoft. This is the next natural step for IoT, and I'll try to explain.

If we look at pictures of the early desktop PCs we see many chips and components:

Most of the components on the board serve one of three purposes: user interface, data storage, or internal use. For example, the RAM is used internally by the system. The keyboard, for example, has a hardware FIFO which stores up to 16 characters; when the keyboard FIFO is full, the machine beeps every time you press another key. Pressing a key could, in principle, be wired to print directly to the screen and save to a file, but instead the PC routes everything through the CPU. The CPU receives a signal in the form of a hardware interrupt notifying it of a new keyboard 'type' event; it then takes the pressed key and prints it to the screen. This functionality is so basic that it was supported out of the box by the BIOS, the first code to run and initialize the system. The BIOS was a minimal collection of drivers and the most basic software needed to operate the machine. It was great: having a BIOS meant that software companies did not have to write their own drivers and could use the BIOS drivers as an API.

Nothing in this system is hardwired; everything goes through the CPU, which means every behavior in the system is configurable by software. On the other hand, once it has been decided that a data buffer should go to some destination, there is no real need to waste CPU time on the copying itself. This was solved by adding a DMA component to the system, which allows one peripheral device to copy data directly to another peripheral device without CPU intervention, except that the DMA controller itself is programmed only by the CPU.
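The keyboard path described above can be sketched as a toy simulation. This is illustrative only: the 16-character FIFO size comes from the text, while the class and method names are my own invention, not any real BIOS or controller API.

```python
from collections import deque

FIFO_CAPACITY = 16  # the PC keyboard controller buffered up to 16 characters

class KeyboardFifo:
    """Toy model of the hardware keyboard buffer described above."""
    def __init__(self):
        self.buffer = deque()

    def key_pressed(self, key):
        """Hardware side: a keypress arrives; beep if the FIFO is full."""
        if len(self.buffer) >= FIFO_CAPACITY:
            return "beep"          # buffer full: the machine beeps
        self.buffer.append(key)
        return "irq"               # otherwise raise a hardware interrupt

    def isr(self, screen):
        """CPU side: the interrupt handler drains one key to the screen."""
        if self.buffer:
            screen.append(self.buffer.popleft())

# Keys flow keyboard -> FIFO -> CPU (interrupt) -> screen,
# never keyboard -> screen directly.
screen = []
fifo = KeyboardFifo()
for ch in "hi":
    fifo.key_pressed(ch)   # device raises an interrupt per key
    fifo.isr(screen)       # CPU services it and prints the key
```

Note that the device never touches the screen itself; the CPU mediates every step, which is exactly what makes the behavior software-configurable.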

Dusty Books:

Intel didn't really want to manufacture the small peripheral devices. Intel's main business was the smart CPU, the system architecture, and the bus specifications. Intel also added advanced operations such as fast string instructions and floating-point capabilities. The CPU was very expensive and very fast compared to the peripherals.

This infrastructure allowed Microsoft to grow from a small application on top of the BIOS, to an extensive infrastructure (DOS Int 21h), and later to a completely new operating system that bypassed most of the BIOS functionality.

In time the peripherals on the PC were grouped behind the PCI bridge and eventually absorbed into the CPU in the SoC (System on Chip) model, because technology advanced. PC networks used to span a room, or at most a few buildings; today we have the global Internet. Hardware configuration used to be based on physical connectivity; today it is Plug & Play. The CPU in a PC used to be the fastest device, with more RAM and storage than all the other devices put together. The BIOS initialized the peripherals to work together and to share interrupt lines and priorities. The operating system managed operations that crossed the boundaries of several devices, for example typing on the keyboard, which prints on screen and saves to disk.

Now let's talk about IoT:

Intel doesn't really want to manufacture the small peripheral devices such as smart hats and bracelets. Intel's main business is the CPU board, the system architecture, and the bus specifications. Intel also offers advanced operations such as face recognition, voice recognition, and fingerprint database search. The CPU board is very expensive and very fast compared to the peripheral devices.

When such an infrastructure is ready, it will allow the next Microsoft to grow from a small application on top of the basic connectivity framework, to an extensive infrastructure, and later to a completely new operating system that does not exist today.

Today we are used to Plug & Play device configuration, for example internally over the PCI bus or externally over USB. This means that IoT devices must expose interface classes that identify them, so that the system can interact with them without requiring an external driver installation. Since the system will in any case be connected to the Internet, and since there will be only one or two major operating systems, it makes sense to follow Microsoft's model of a driver database from which a driver is downloaded when a hardware ID is detected.
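The hardware-ID model above can be sketched in a few lines. Everything here is hypothetical: the ID format, the table contents, and the function name are illustrative stand-ins, not any real registry or OS API.

```python
# Toy sketch of Plug & Play style matching: a device reports a hardware ID
# and the host resolves a driver from a database. In the Microsoft-style
# model described above, a miss would trigger an online lookup instead of
# asking the user to install a driver by hand.
DRIVER_DATABASE = {
    "iot\\audio\\mic-basic": "generic_microphone_driver",   # hypothetical ID
    "iot\\display\\epaper-2in": "epaper_watchface_driver",  # hypothetical ID
}

def resolve_driver(hardware_id, database=DRIVER_DATABASE):
    """Return a driver name for a reported hardware ID, or None,
    meaning the system should fall back to an online database lookup."""
    return database.get(hardware_id.lower())

# A device class match works regardless of how the device capitalizes its ID:
driver = resolve_driver("IoT\\Audio\\Mic-Basic")
```

The point is not the lookup itself but the contract: a device that reports a well-known class ID needs no bundled driver at all.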

I would expect to see an Arduino device connected to a microphone, detecting silence periods and capturing only what sounds like speech rather than random noise. The cheap Arduino device would then forward the audio packets to an Intel Galileo board for speech recognition. In another scenario, a simple device could display the two most important emails on a wristwatch, after an Intel Galileo board has decided which two emails those are based on a complex search algorithm. In the world of IoT, devices may connect to the central system through relays. This means it can be more efficient for a speaker device to connect directly to the multimedia system than to go through a centralized CPU board. Based on the (dusty book) models above, such connections should be managed and authorized by the CPU board. This is similar to the DMA model, only now it is direct pairing.
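The cheap front-end in the first scenario boils down to an energy gate: forward only the audio frames loud enough to plausibly contain speech. A minimal sketch, assuming normalized samples and an arbitrary threshold (real voice-activity detection is considerably smarter than this):

```python
# Naive energy-gate filter of the kind the cheap Arduino-class device could
# run before forwarding packets to the Galileo board for real recognition.
# The 0.01 threshold is an arbitrary illustrative value, not a tuned one.
def frame_energy(samples):
    """Mean squared amplitude of one audio frame."""
    return sum(s * s for s in samples) / len(samples)

def frames_to_forward(frames, threshold=0.01):
    """Keep only frames loud enough to plausibly contain speech;
    silent frames are dropped on the device and never sent upstream."""
    return [f for f in frames if frame_energy(f) > threshold]

# Example: a near-silent frame is dropped, a loud one is forwarded.
silence = [0.001] * 160          # 160 samples of background hiss
speech = [0.5, -0.4] * 80        # 160 samples of loud signal
forwarded = frames_to_forward([silence, speech])
```

The asymmetry is the whole design: the cheap device does the dumb, high-volume filtering, and only the expensive board ever sees data worth analyzing.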

It should take a while for Intel and the rest of the industry to complete this model, but you can't sell desktops without an operating system. From this I deduce that IoT is going to have a hard time penetrating the market until we find the next Microsoft. Microsoft, by the way, is not that "next Microsoft", because Microsoft is good with business instruments. It is not that Microsoft doesn't have enough money to write another operating system; the problem is the corporate DNA. Everything is based on that DNA. Microsoft keeps trying to push network goods, but Google is always first. Google tries to push social goods, but Facebook is always first. Everyone can try to sell phone operating systems, but unless you give yours away for free you can't take Apple out of the game. The next Microsoft will be IoT oriented from day one and will have a single person who sees the vision of IoT. No money can replace that.

If you are this person or that company, I would say that it is time to step up. Right about now.

