Intel, well known for customizing server chips for its customers, is now set to tune chips for big data workloads as well. Since software is an integral part of chip design, this customization should make it considerably easier and faster for applications to collect, manage, and analyze data. Ron Kasabian, general manager of Intel's big data solutions group, confirmed the plans.
Like any technology company trying and testing its new products in every way possible, Intel is on a mission of its own: figuring out how its chips can deliver better efficiency and performance in areas such as cloud data collection, predictive analytics, and task-specific processing.
The company already releases its own distribution of Hadoop, the scalable computing framework for managing huge data sets, and with chip improvements now in the works, the world-famous chip maker looks to be heading into some of its better days.
Even as the company gears up on the software side, its executives recognize that getting new silicon to market takes considerable time. They are guided by the knowledge that there are only so many places where one can optimize in silicon, while there remain many more opportunities to improve optimization and performance in software.
Intel plans to take its lessons from software implementations first and then improve the silicon to fill whatever gaps the software leaves; that chip work is expected to take about two more years.
via: PC World