Businesses don’t know what to do with Big Data

31 August 2015

Companies are spending billions on tools and engineering to analyse big data, though many are hampered by one little problem: they still do not know what to do with all the data they collect.

“This is the dirty little secret about big data: no one actually knows what to do with it,” Jason Waxman, an Intel vice president and general manager of the company’s cloud platforms group, said in a webcast for investors.

“They think they know what to do with it, and they know they have to collect it, because you have to have a big data strategy. But deriving the insights from big data is a little harder to do,” he said.

Big data is all about collecting large amounts of sensor or process data, the analysis of which can lead to insights into customer behaviour and point the way to improvements in operational efficiency.

Intel is interested in the big data market because big data systems will require lots of processor-driven hardware, preferably Intel’s.

Today, big data drives about $13 billion (€11.6 billion) a year in IT spending, a figure Intel estimates will balloon to $41 billion (€37 billion) by 2018, with at least $2 billion (€1.8 billion) of that earmarked for hardware.

To get value from big data, enterprises must get past a number of hurdles, Waxman said.

Intel talked to a number of organisations to find out more about their current and anticipated use of big data. It found that the number one challenge is figuring out how to extract value from the data.

Demanding task
It is a demanding task. Organisations need the right talent to assemble and run big data systems, which requires skills in statistics and analytical reasoning in addition to the more usual programming and system administration.

“The ability to do this all together is pretty rare,” Waxman said.

Intel has undertaken a number of initiatives to help organisations start to get value from all of their information.

One is finding and highlighting successful big data operations. When a retailer, for instance, finds a successful way to improve a customer experience through big data, Intel documents the operation “to help more people replicate that,” Waxman said.

Another big challenge is making big data systems easier to deploy. Right now, organisations are assembling these systems piece by piece, which can involve a lot of configuration and integration.

“Instead of having people write a bunch of programs and stitch together big computers, we need to find a way to make it easier for people to deploy” big data systems, Waxman said.

Investment end
To this end, Intel has invested in a number of big data software providers. Last year, Intel invested $740 million in Cloudera, which offers a commercial distribution of the Hadoop data processing platform. Together, the companies worked on a roadmap for Cloudera software to take advantage of advanced features in Intel’s processor architectures.

Despite the current popularity of cloud-based software services, Waxman predicted that most companies will want to run their big data operations in-house, rather than hand off their data and analysis to third-party services.

Waxman recalled talking with an executive at a financial firm who confided in him about a recent meeting with IBM, which offered its Watson cognitive computing services. The executive found the technology intriguing but expressed concern about using a proprietary service, as well as about handing over data to a company that could be a competitor at some point.

“People want to own their own data. If they give away their data, they don’t have much of a company left,” Waxman said.

 

Joab Jackson, IDG News Service
