How IBM will push Big Data to new frontiers

Big Data has been a major focus for large companies for some time, but there are strong indications that the multi-billion dollar market has barely begun to realise its full potential.

Software revenue in the market was already worth an estimated $46 billion as of 2015. However, this is only a small fraction of what the market could be worth within just a few years.

A report from SNS Research has predicted that companies will spend around $72 billion on Big Data hardware, software, and professional services by 2020.
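
Taken at face value, those two figures imply roughly 9 per cent compound annual growth. Here is a quick back-of-the-envelope check; this is a minimal sketch that uses only the $46 billion and $72 billion figures quoted above, which cover slightly different baskets of spending (software revenue versus total spend), so the result is indicative at best:

```python
# Rough check of the growth implied by the two figures above:
# $46bn (2015, software revenue) to a projected $72bn (2020,
# hardware + software + services). Different baskets of spending,
# so this CAGR is indicative only.
revenue_2015 = 46e9   # estimated software revenue, 2015 (USD)
spend_2020 = 72e9     # SNS Research spending projection for 2020 (USD)
years = 2020 - 2015

cagr = (spend_2020 / revenue_2015) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # ~9.4%
```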

But why is Big Data so important?

Big Data is becoming an increasingly relevant part of all areas of life, with estimates suggesting it could save healthcare as much as $300 billion a year if utilised properly. For companies, it helps them get to know their audience and customers better. In theory, this allows them to deliver a more bespoke service, giving individuals exactly what they want or need.

The amount of data created is growing faster than ever before, and it's predicted that by 2020 around 1.7 megabytes of new information will be created every second for each person alive. Showing the impact of the digital revolution, more data has been created in the past two years than in the entire previous history of the human race.
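
To put the 1.7-megabytes-per-second figure in perspective, a short calculation shows the aggregate volume it implies. This is a minimal sketch: the 2020 world population of roughly 7.7 billion is an assumption used for illustration, not a figure from the article.

```python
# Aggregate data volume implied by the projection above of
# 1.7 MB created per second for each person alive in 2020.
# The ~7.7 billion population figure is an assumption.
mb_per_second_per_person = 1.7
population_2020 = 7.7e9           # assumed for illustration
seconds_per_day = 24 * 60 * 60

mb_per_day = mb_per_second_per_person * population_2020 * seconds_per_day
zb_per_day = mb_per_day / 1e15    # 1 zettabyte = 1e15 MB (decimal units)
print(f"Implied global data creation: ~{zb_per_day:.1f} ZB per day")
```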

At the moment, however, just 0.5 per cent of all data is ever analysed, which suggests the market still has enormous room to grow.

What's the problem?

With such a large amount of data available, it's hard to see why more companies aren't using this information to help achieve their business goals. In practice, many organisations simply lack the infrastructure or the capability to capture and analyse the data available to them.

Global technology giant IBM could help solve this problem, and in doing so push companies closer to realising the full potential of Big Data.

The company recently announced the launch of a series of new servers that have been designed to help "propel cognitive workloads and drive greater data center efficiency".

Featuring a new chip, the Linux-based lineup carries a number of innovations from the OpenPOWER community to deliver higher levels of performance and computing efficiency.

Early tests by China-based internet service provider Tencent found that the new IBM OpenPOWER servers were able to run data-intensive workloads three times faster than its former x86-based infrastructure.

This was achieved while also cutting the total number of servers used by two-thirds, suggesting significant cost savings on top of the performance gains.
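
Those two numbers compound: running the same workloads three times faster on one-third as many machines implies roughly a ninefold gain in per-server throughput. A minimal sketch of that arithmetic, using only the figures quoted above:

```python
# Per-server throughput gain implied by Tencent's reported results:
# the same workloads ran 3x faster on one-third as many servers.
speedup = 3.0            # workloads ran three times faster overall
server_fraction = 1 / 3  # server count cut by two-thirds

per_server_gain = speedup / server_fraction
print(f"Implied per-server throughput gain: ~{per_server_gain:.0f}x")  # ~9x
```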

On the strength of these results, IBM has already received orders from businesses, research organisations and government bodies. Among the first to take delivery will be a large multinational retail corporation, along with the US Department of Energy's Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL).

"The user insights and the business value you can deliver with advanced analytics, machine learning and artificial intelligence is increasingly gated by performance. Accelerated computing that can really drive big data workloads will become foundational in the cognitive era," Doug Balog, general manager for IBM Power Systems, said.