Sophie Nadin | 15 September 2021

The future of analytics: 5 key areas to look to for change

So, where will we be in five, ten or fifteen years' time? What will become reality and what will remain just a possibility? To get some insight into what we can expect from the future of analytics, Julian Shaw, Principal Consultant at DeeperThanBlue, examines five areas where change is certain.

 

Will computing cope with its limits?

What happens when current computing speeds hold back development? How far do you keep developing applications for hardware that is no longer up to the task? And which comes first: the machine that can make the perfect egg, or the software that can design it?

That is why tech companies are pouring billions of dollars into quantum computers and AI rather than directly into systems development. The latter still attracts eye-wateringly large sums, but it is no longer the main direction of travel.

In 2021, it's estimated the world will produce 74 zettabytes of data – roughly 74 trillion gigabytes. That's up from 59 zettabytes in 2020 and 41 zettabytes in 2019. All of that data needs analysing – if the machines doing the analysing can handle it.
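To put that figure in context, here is a quick back-of-envelope calculation in Python. The zettabyte estimates above are the only inputs; everything else is plain arithmetic.

```python
# Back-of-envelope: what does 74 zettabytes a year look like per second?
GB_PER_ZB = 1_000_000_000_000              # 1 zettabyte = 10^21 bytes = 10^12 gigabytes

data_zb = {2019: 41, 2020: 59, 2021: 74}   # estimates quoted above
seconds_per_year = 365 * 24 * 60 * 60

gb_per_second = data_zb[2021] * GB_PER_ZB / seconds_per_year
growth = (data_zb[2021] - data_zb[2020]) / data_zb[2020]

print(f"~{gb_per_second:,.0f} GB of data created every second in 2021")
print(f"Year-on-year growth, 2020 to 2021: {growth:.0%}")
```

That works out at well over two million gigabytes created every second, and roughly 25% growth in a single year.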

Chances are the machines will cope. A huge amount of research and development is underway to push the limits of computing power in a way that will change the world. Quantum computing has long been seen as the next step forward. In short, it will allow certain computations to run dramatically faster, using algorithms such as Grover's search – a quantum routine that simply cannot run on today's classical machines.
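To give a feel for the scale of that speed-up, here is a rough illustration. It is query-count arithmetic only, not a quantum simulation: a classical scan of N unsorted items needs on the order of N lookups, while Grover's search needs on the order of the square root of N.

```python
import math

# Illustrative query counts only - not a quantum simulation.
# A classical scan of N unsorted items needs ~N lookups in the worst case;
# Grover's search needs roughly (pi/4) * sqrt(N) quantum queries.
def classical_queries(n: int) -> int:
    return n

def grover_queries(n: int) -> int:
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (1_000_000, 1_000_000_000_000):
    print(f"N = {n:>16,}: classical ~{classical_queries(n):,} lookups, "
          f"Grover ~{grover_queries(n):,} queries")
```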

That quantum leap will be combined with the development of materials such as graphene which, among other properties, can act as a superconductor. Superconducting components waste far less power, and that efficiency is part of what will let quantum hardware run at the speeds it is theoretically capable of.

And that is the future for analytics: not the hardware in and of itself, but new hyper-fast computers arriving first, with the software to run on them following after.

 

We need to up our game to widen the scope of our analytics

When it comes to data and analytics, we simply can't work in isolation. At a macro level, global forces such as geopolitics, pandemics, wars and trade (dis)agreements affect our analysis of localised data. At a micro level, a marketing push affects the financial bottom line and our performance analysis.

This has always been the case, of course, but what about the future? We've already seen that the sheer volume of information is astounding, growing year on year, and it simply won't stop. And if, as a data specialist, I already know that a narrow lens doesn't bring the full picture into focus, what will change?

This view won't remain an esoteric one for long, though. People will become more data savvy, and at that point we will have to up our game and widen the scope of our analytics.

Already we can tap into social media and other data sources using APIs, and web-traffic information (plus the underlying demographic data) has been available for a while. That lets us analyse and predict with an ever-growing set of input data.
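As a flavour of how low the barrier already is, here is a minimal sketch of pulling records from a web API with Python's requests library. The endpoint, token and response shape are hypothetical placeholders, not a real service.

```python
import requests

# Hypothetical endpoint and token - stand-ins for any social media or
# web-traffic API; only the requests calls themselves are real.
API_URL = "https://api.example.com/v1/mentions"
API_TOKEN = "YOUR_TOKEN_HERE"

def fetch_mentions(keyword: str, limit: int = 100) -> list[dict]:
    """Pull recent posts mentioning a keyword so they can feed the analytics pipeline."""
    response = requests.get(
        API_URL,
        params={"q": keyword, "limit": limit},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]   # assumed response shape

mentions = fetch_mentions("our-brand")
print(f"Fetched {len(mentions)} mentions for downstream analysis")
```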

What happens next is the big question. How far will we have to connect the dots?

 

A move towards more holistic systems

There have been massive changes in the way disparate data held within an organisation is opened up across all departments and sections, and we all know the dramatic impact this has already had. So, what's next?

We are already looking at data from all areas of an organisation to see the correlation and causation in values – Data Science across the whole estate, in other words. But this is only looking AT the data, rather than being part of the data.

What I suspect will come is a move towards holistic systems, where all the component parts are designed to interact with each other. This will come in several ways:

  • Technologies such as Cognos Planning Analytics will allow information to be fed back into the data stream, updating the data itself using end user inputs. This then generates further information.
  • Machine Learning and AI tools will begin to be incorporated into the whole data sphere. Computer generated ‘new’ information will flow back into the ‘pot’, to be used elsewhere in turn. How does the current data affect the future, and how does that affect other areas?
  • Homogenised data structures will become the norm: eco-systems built from the ground up, where disparate elements can ‘talk’ to each other.

In future, data points will connect more and, being built from day one to conform, will share similar structures, while AI, Machine Learning and write-back capabilities will mean branches form backwards as well as forwards.
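As a generic illustration of that backwards branch – the write-back pattern rather than any particular product's API – here is a sketch in which model output is written back into the shared dataset so that planners and other processes see it as input. The file, column and function names are illustrative only.

```python
import pandas as pd

# Generic write-back pattern: figures generated by a model are appended back
# into the shared dataset, where planners and other models see them as inputs.
# File and column names are illustrative only.
def write_back_forecast(store_path: str) -> None:
    sales = pd.read_csv(store_path)          # the shared 'pot' of data

    # Toy "model": forecast next month as a 3-month rolling mean per product.
    sales["forecast_next_month"] = (
        sales.sort_values("month")
             .groupby("product_id")["units_sold"]
             .transform(lambda s: s.rolling(3, min_periods=1).mean())
    )

    # Write the derived figures back alongside the source data, so the next
    # process in the chain consumes them as ordinary inputs.
    sales.to_csv(store_path, index=False)

write_back_forecast("sales_history.csv")
```

The point is the loop: derived numbers become part of the ‘pot’ rather than sitting in a separate report.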

 

The Internet of Things and the data that powers it

The Internet of Things already connects everyday appliances and items through the internet: your fridge orders your groceries, and smart doorbells let you speak to visitors wherever you are, for example.

But what happens to the data generated? It’s not a dystopian fantasy that a computer somewhere will be able to compute what you’ll be ordering next month – and all that data will be analysed.

Specialised systems will be developed to facilitate that analysis. How quickly that happens is anyone's guess, but it will probably come down to the uptake speed of connected-appliance technology. Think about it, though: we're already happy for our phones to pass on our data, so how long before our ovens, boilers, doorbells and televisions do the same?
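To make that concrete, here is a hedged sketch of what collecting appliance telemetry might look like, using the MQTT protocol via the paho-mqtt library (1.x client API). The broker address, topic layout and payload fields are all hypothetical.

```python
import json
import paho.mqtt.client as mqtt

# Broker address, topic layout and payload fields are hypothetical;
# the client calls assume the paho-mqtt 1.x API.
BROKER = "broker.example.com"
TOPIC = "home/+/telemetry"   # e.g. home/fridge/telemetry, home/oven/telemetry

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # A reading might look like {"device": "fridge", "temp_c": 3.2, "door_open": false}
    print(f"{msg.topic}: {reading}")
    # In a real pipeline this is where readings would be queued for analysis.

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.subscribe(TOPIC)
client.loop_forever()
```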

 

The Rubber Band Effect

Over time, even as technology advances, real-world use of it lags behind. Imagine the gap bound by an elastic band: as the gap widens, the band tightens and the pull on the real world increases, until eventually the force is so great that, in a rush, the new technology is taken up by exponentially more users.

Take Windows 10, for example. It was not widely taken up at first – the sheer pain of upgrading and its initially buggy reputation probably put paid to that – and it remained the exception rather than the norm. Until, that is, the critical mass of its benefits, and the need to escape older versions' drawbacks, meant that all of a sudden it seemed to be the system.

This is, of course, helped along by the deprecation of older systems – exactly what happened to the versions Windows 10 replaced. And that obsolescence is justified by advances that keep pace with Moore's Law and, in the case of data, by the need for ever-wider fields of view as more and more interconnected data sources come on stream.

The upshot is that when technology is developed with massive budgets underpinning it, that technology usually becomes prevalent.
