
Six Trends Shaping the Future of Analytics

ActiveViam |
November 12, 2020

The recent health and economic crises remind us that instability is a constant and that crisis situations tend to overlap rather than follow one another. The ability to dynamically draw insights from your data in order to adapt and make the best decisions is a major asset of resilient companies. At the same time, the volumes of data to be analyzed to achieve this are increasingly large and volatile. Technology, including ActiveViam’s, must constantly evolve to provide ever more precision and meet the needs of the “new normal”.

To provide some perspective, here are six trends that will shape the future of analytics.

The Rise of Data Science

Rapid advances in data science are opening up new possibilities for businesses: it can deepen their knowledge of their market, their customers, and their operations, and identify new opportunities.

Artificial intelligence automates difficult tasks, shaves down costs and increases productivity in all areas: operations, marketing and sales.

Using data science to enrich datasets helps identify new axes of analysis and group records according to patterns, so that seemingly mismatched categories can be joined to provide fresh insight.

This information has a direct impact on performance and will feed next-generation Business Intelligence tools.

To reach its full potential, the use of data science must spread to all departments of the company. Several tech giants are investing heavily to train their employees in data science, even those who are not part of a dedicated data science team. Airbnb recently created its own Data University; in 2018, Facebook announced it would train 65,000 people in France and invest $8.9 million in artificial intelligence by 2022; and Amazon has embarked on the development of its Echo/Alexa artificial intelligence, as well as the acquisition of multiple startups and the recruitment of hundreds of experts.

Decompartmentalization of Analytical Tools

We know the risk of seeing models built on historical data turned upside down overnight. Data analysis must be part of an agile dynamic. Having a different tool and programming language for each team is a serious obstacle to achieving this responsiveness.

The Python language, now widely democratized, is becoming the common denominator among all data players: providers, quants and data scientists, and business users. Data scientists can create tools or surface insights that business users can act on in a very short time. Python is particularly popular for data analysis and artificial intelligence, but also for backend web development and scientific computing. For all these reasons, it recorded the strongest increase in usage in 2019 according to the TIOBE Index.
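As an illustration of Python as a shared language between data teams and business users, here is a minimal, hypothetical sketch (standard library only, with invented order data) of the kind of aggregation a data scientist might hand straight to a business team:

```python
from collections import defaultdict

# Hypothetical order records -- invented for illustration only.
orders = [
    {"region": "EMEA", "amount": 1200.0},
    {"region": "APAC", "amount": 800.0},
    {"region": "EMEA", "amount": 400.0},
]

def revenue_by_region(records):
    """Aggregate order amounts per region: a minimal shared 'insight'."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["region"]] += rec["amount"]
    return dict(totals)

print(revenue_by_region(orders))  # {'EMEA': 1600.0, 'APAC': 800.0}
```

Because the same language is readable on both sides, the insight and the tool that produces it can circulate without translation between teams.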

Read also: Bringing interactive visualization to Python notebooks

The Death of Reporting and the Advent of Real-Time Interactive Analysis

Until recently, most BI technologies consisted of extracting data from databases and generating static reports that ran overnight and were available the next morning, or even several days later. Any modification of the underlying parameters, or request for more detail, required restarting the process from scratch. This is still the case in many companies.

Having a real-time analytical system makes it possible to work with up-to-date figures. While the first to ask for this ability were traders and risk controllers in finance (in order to visualize risks in real time, study churn, and so on), the need is now spreading to all business lines.

Beyond the real-time view of a situation, users want to be able to simulate many hypotheses and calculate their impact on KPIs in advance (e.g. what will be the impact if a particular customer cancels their order, a delivery does not arrive, or demand drops by 10%?), and this is perhaps even more important than having a “live” dashboard.
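The kind of what-if calculation described above can be sketched with entirely made-up numbers and a deliberately simplistic revenue model (a real analytical platform would recompute many interdependent KPIs across a full dataset):

```python
def projected_revenue(demand_units, unit_price):
    """A toy KPI: revenue as a simple function of demand (illustrative only)."""
    return demand_units * unit_price

# Baseline scenario vs. a hypothesis where demand drops by 10%.
baseline = projected_revenue(10_000, 25.0)
scenario = projected_revenue(10_000 * 0.9, 25.0)
impact = scenario - baseline

print(f"Impact of a 10% demand drop: {impact:+,.0f}")
```

The value of interactive analysis is precisely that such scenarios can be recomputed on demand, against live figures, rather than waiting for a new batch report.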

When making decisions, it is no longer acceptable to wait ages for data that will already be outdated by the time it becomes available. There is a real need for fresh predictive analysis and for the ability to plan ahead. Among the major drivers making this evolution possible are in-memory computing, which increases the calculation speed of tools tenfold, and cloud-native technologies, which allow resources to be used on the fly according to need, rather than relying on expensive servers.

Read also: Advanced In-Memory Analytics in Azure with the ActiveViam platform

The Importance of UX Design

Users, not only in technical roles but also on business teams, will have to navigate much more extensive data than before, and will have to quickly draw insights and make decisions from constantly changing data. UX must therefore be fully integrated into tools: they must become intuitive and attractive, where they were once austere and static.

Items like the following will be the figureheads of innovation in this direction:

  • “Data storytelling”, that is, visually giving meaning to the data;
  • The interactivity of analytical platforms and simplified access to more information (being able to zoom in on data, filter, update in real time, etc.);
  • Natural language processing to query data and obtain relevant results faster.

The Crossing of all Types of Data

The historic organization into independent silos, still in effect in many companies, has also created technological silos. For example, you can find banks with thousands of databases built on dozens of different systems stacked up over the years, plus countless Excel sheets circulating among employees.

Obtaining information on the state of the whole company from systems (CRM, ERP, TMS, WMS, etc.) that communicate poorly or not at all is extremely complex. It slows down IT teams, which spend time and money solving the resulting problems rather than offering innovative technologies to employees.

If these architectural issues are not addressed, businesses will not be able to reap the full benefits of the latest innovations in data science, AI, or machine learning. Their evolution will be partial, limited to particular subjects rather than encompassing all the functions of the company.

The Rapid Evolution of Mobile Use

Smartphones have long since ceased to be mere communication devices. They are terminals in their own right, running complex applications, including analytical ones, which must adapt to their constraints. These new work habits must be taken into account when creating today’s analytical tools.

Source: StatCounter (accessed January 2020). Figures represent each device’s share of web pages served to web browsers only.
