Gathering Intel on a new reality
Intel CEO Brian Krzanich reveals the innovation, strategies, and partnerships that are driving the company forward, including its groundbreaking merged reality project.

By Gary Maidment and Linda Xu
Best known for its iconic and literally accurate tagline “Intel Inside”, the silicon giant is making some very exciting moves in the tech world. We recently spoke to Intel CEO Brian Krzanich to find out more about the partnerships and innovation propelling the company forward.
Team spirit
“The most successful collaboration comes when there’s a shared vision,” remarked Krzanich during his keynote speech at HUAWEI CONNECT 2016. “Customers are no longer looking for a platform; they’re really looking for experiences.” This commitment to open collaboration, underpinned by an understanding of what consumers want, is reflected in a diverse portfolio of products and solutions that increasingly focuses on delivering compelling user experiences. According to Krzanich, “This requires multiple collaborations between companies. No one partner can deliver this.”
Changing reality
One of the most exciting expressions of this collaborative approach is Intel’s merged reality (MR) offering. A bold concept framed within Project Alloy, MR takes the augmented and virtual worlds to the next level. According to Krzanich, MR will “bring your physical world into the virtual world [so you] can choose when they mix and how they mix.”
The hardware Krzanich talks of is as impressive as you’d expect, setting a high bar for others to follow. Unlike other headsets, the Intel prototype is a self-contained piece of kit completely unburdened by cables, external batteries, and the need for external computing or processing power.
Project Alloy employs Intel’s RealSense perceptual technology, which combines a 1080p camera, an infrared camera, and an infrared laser projector with 3D depth sensing to let a machine “see” much as a human does. These sensors act in concert and can, for example, scan and map a user’s hands and insert them into a virtual space with which they can interact, tech that Intel engineers demoed at the Intel Developer Forum (IDF).
Intel plans to roll out products next summer, with open source development kits expected to be available in the second half of this year. “We’ve developed the hardware. We developed the software that rides on it. We’ve worked out the usage models,” reveals Krzanich. “And we’re going to let people build off that platform and really create an open ecosystem so that it grows at a much faster rate.”
Intel is thus providing the tools and setting the stage for developers to get creative, an approach the company has also employed with its perceptual computing strategy, which aims to augment the way people interact with computers. In 2013, Intel Software began running an annual developer challenge designed to foster innovation using the Intel RealSense SDK for Windows.
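To give a concrete flavor of what building on this kind of depth-sensing hardware looks like, below is a minimal sketch using pyrealsense2, the Python bindings for Intel’s open-source librealsense library. Note the assumptions: this is a later, cross-platform SDK rather than the Windows SDK named above, it is not Project Alloy code, and the stream settings and sampled pixel are purely illustrative.

```python
# Minimal sketch: reading depth data from an Intel RealSense camera via
# pyrealsense2 (open-source librealsense bindings). Stream settings and the
# sampled pixel are illustrative assumptions, not Project Alloy code.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Ask for a depth stream plus a 1080p color stream, mirroring the
# depth-and-RGB camera combination described above.
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 1920, 1080, rs.format.bgr8, 30)

pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()  # blocks until a frameset arrives
    depth = frames.get_depth_frame()
    if depth:
        # Distance in meters to whatever sits at the center pixel; this
        # per-pixel depth is what allows hands to be segmented from the
        # background and placed in a virtual scene.
        center_distance = depth.get_distance(320, 240)
        print(f"Distance at image center: {center_distance:.2f} m")
finally:
    pipeline.stop()
```

Each depth frame is effectively a per-pixel distance map, which is the raw material developers can use to bring real objects, such as a user’s hands, into a virtual space.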
Onwards and upwards
MR is just one example of how Intel is using its hardware expertise as a springboard into other areas, including the cloud. “Our silicon capability has really let us expand beyond just PCs,” states Krzanich, with the explosion of data arising from multiple channels helping to fuel this paradigm shift: “We’re really focusing on providing both the edge devices and clouds.” To do so, the company is investing in edge devices like Curie, a button-sized computer designed for wearables, and data center products such as Xeon Phi, a many-core processor that provides cloud computing capabilities. He describes this strategic expansion as transforming from a “single cylinder engine” to a “multiple cylinder engine,” an approach that’s essential to spreading outwards into more fields.
Krzanich differentiates between today’s cloud, which is built by people in the form of Facebook messages, WeChat messages, emails, and so on, and the cloud of the future: “The cloud of tomorrow is going to be built on the backs of machines,” he asserts. “So it’s going to be everything, [including] the wearables that you start to wear more and more, that transmit data. Your car is going to become a huge connected cloud device, and your factory, your home, everything we have.” The expected explosion in data also explains why data centers are a core facet of Intel’s business, alongside memory and IoT.
Intel predicts that, by 2020, the average person will generate 1.5 GB of data per day, up from 600 to 700 MB today. At the same time, a hospital will produce 3,000 GB per day and a single autonomous car around 4,000 GB, the equivalent of almost 3,000 people. For Krzanich, this “is an opportunity, since we’re doing the building blocks of the cloud.”
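As a quick back-of-the-envelope check on that last comparison (treating both figures as daily volumes, as the prediction implies), the arithmetic works out as follows:

```python
# Rough check of the car-versus-people comparison above.
car_gb_per_day = 4_000      # data from a single autonomous car
person_gb_per_day = 1.5     # Intel's 2020 projection for the average person
people_equivalent = car_gb_per_day / person_gb_per_day
print(round(people_equivalent))  # ~2667, i.e. close to 3,000 people
```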
Analytics and AI
Data, of course, has little value until it’s collected, analyzed, and acted upon. Krzanich believes this creates an excellent opportunity for Intel and its partners alike: “We have analytical tools like Snap and TAP that are designed to allow users to really go and put analytics to the data.” Crucial to this is AI, which he expects to play a bigger role in data analytics as IoT and data sets become more complex: “Over time we can start to combine data and use artificial intelligence to really start understanding [and] contextualize.”
Intel has made decisive moves into the AI arena, acquiring the deep learning startup Nervana Systems and the leading cognitive computing platform Saffron to boost its capabilities in this burgeoning area and, according to Krzanich, “provide algorithms to our co-partners to help them go faster in analytics.”
To help drive the transition to this data-heavy world, Huawei is building end-to-end 5G infrastructure using Intel technology in preparation for global 5G trials.
With a strong commitment to partnerships, Intel has diversified its business segments on the way to becoming a major player at the cutting edge of global tech, which will no doubt result in a reality that’s as bright as it is merged.