With the evolution of computing, the volume, velocity, and variety of data are exploding at a scale enterprises are struggling to grapple with. It's this phenomenon that propels Intel to surf the next big wave of computing across datacenters, high performance computing, and AI.
Here's what Prakash Mallya, the man at the helm of Intel's datacenter group, believes will help the company stay ahead of its game.
What does the datacenter business mean for Intel’s portfolio?
Datacenter is a significant and growing part of our portfolio. Our results have been growing in double digits in the recent past and have accelerated significantly.
It is a significant proportion of our operating margins and growth portfolio, and our expectation is that it will continue to grow in double digits.
If you look at our investments, we are starting to invest pretty significantly, and a couple of investments that come to mind are the Altera and Nervana acquisitions.
We are moving forward with integrating their roadmap with ours, and we see it as very interesting in the AI space – a buzzing area and one of the fastest growing workloads in the datacenter.
Has Intel changed its approach in the datacenter space? How so?
If you look at our investments and at where the datacenter business is headed, at the highest level we see it getting 'horizontalized'.
So from a proprietary approach, we made it horizontal. Over the last seven years, we have progressed from compute to include storage and network.
Network transformation on open-standard-based technology, orchestrated by software, is still some distance off, because we have only just begun the journey of virtualizing network workloads – which will drive a tremendous amount of agility for network players.
If you look at everything on the infrastructure side – compute, storage, and network – it is getting virtualized and orchestrated by software.
This gives us tremendous cost advantages. If you can run compute, storage, and network seamlessly and manage workloads across all three in software, the infrastructure becomes very flexible and can make changes on the fly. That is not possible today, because datacenters are orchestrated at the server and various other levels, which gives you limited flexibility.
What’s Intel’s channel partner strategy in India for 2017?
Channel partners for us are the traditional business - the likes of Lenovo, Acer, the OEMs. And as we get into the solutions space, we'll have a new set of partners: the system integrators, such as Wipro, TCS, Infosys, or L&T.
Intel has always been a partner-agnostic company, and will continue to be so. It's just that, from being a pure product player, we are also moving in as a solutions player.
How different would a datacenter look in the next five years?
What you're seeing today in terms of workloads and data is not even a fraction of what you'll be seeing in the future - with smart cities and autonomous driving.
In general, the datacenter of the future will be software-orchestrated. So, if you look at the physical aspect, I'd say it would run much more on software than what we see today, and it would handle workloads much more seamlessly than you see today.
So, visually, a datacenter would look totally different, because we would have a distributed architecture with a lot more compute power.
Our objective is to provide the best offering to the end user at the lowest cost of ownership, with open-standard-based platforms that are software-orchestrated. That, in a nutshell, is our datacenter approach.
Intel has expressed a lot of interest in AI and deep learning. What does Intel’s acquisition of Nervana signify for its business?
Companies like Nervana have driven really cool innovations in neural networks, and this has increased the accuracy of these networks.
We of course plan to keep up with that innovation and deliver better CPU architectures and microprocessor performance from generation to generation. Even storage technology and the infrastructure around it have to improve, because access speed will become a critical differentiator in whether we can do something in real time or not.
Also, our effort is to drive these frameworks in an open source model, so that people have standardized APIs, standardized interfaces, and optimized frameworks. Then everybody can take advantage of that.
There are so many small companies, and if they are handed a proprietary solution, it is very hard for them to scale. But give them an open source framework that is optimized, let them take advantage of it and bring innovation on top, and you have magic.
Therefore the start-up culture, which I believe is thriving in India, will act as a pillar of success in emerging areas like machine learning and deep learning.
Nervana's Neon framework is very similar to our philosophy. The more people that can take advantage of the framework and write on the basis of standard APIs in any vertical, the more the industry benefits. If you make it proprietary, who can take advantage of it?
The role of open source in AI
I think you will see artificial intelligence as a place where open source is really going to thrive, and high performance computing has always prided itself on being open source oriented.
If you look at all the frameworks, they are open source. Our objective is to optimize those frameworks to run best on our platform.
Nervana's Neon framework follows that same philosophy.
And if you look at the AI story, it's going to be driven by a bunch of companies that don't even exist today. We're just at the starting point of the artificial intelligence journey.
You don't yet have autonomous driving. You don't yet have smart energy. You don't yet have so many other services that can be enabled with smart networks. Once that happens, the amount of data coming in is going to be orders of magnitude higher.
We also have to make sure that legacy systems are integrated. And only with open source can you bring the new and old systems together.