Stats is the New Business Intelligence: More Powerful Analytics Enable Better Decision Making
Statistical analysis is a subject complex enough to support its own degree programmes and research areas that push the discipline in multiple directions. At the bleeding edge of research, it’s highly esoteric, with complicated mathematical modelling and dedicated computer languages to automate the more intensive number-crunching aspects of the practice.
Most people assume that statistics in a business context is reserved for those trained and highly skilled in the field. While that has been true historically, new generations of software platforms today give organisations, even those of modest size, tools that leverage advanced statistical analysis and data processing to surface insights for everyday decision-making.
The modern business produces and is capable of ingesting huge amounts of data, every byte of which has the potential to change the course of the organisation’s strategies, products, and commercial offerings. Drawing actionable insights from the petabytes that businesses can create daily requires specific capabilities that go well beyond raw statistical processing.
This article looks at some of the outcomes companies can expect from this new generation of statistical analysis and processing software, and at the supporting processes that make those outcomes possible.
The first element to consider is DATA COLLATION. Most organisations operate on multiple hardware and software platforms, collecting and processing data from thousands of sources. These range from static data stored on local servers (or even employee workstations), through to the databases and information storage that drive business applications and services (on-premises and in the cloud), and even encompass archived data and data available from third parties.
However, much of this information exists in data silos or application-based silos, hence the need for a facility that can collate it. Data asset discovery and cataloguing is an important first step, but one that must be continuous: the shape of the organisation’s data resources changes constantly.
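To make the collation step concrete, here is a minimal Python sketch using pandas; the file, database, and column names are entirely hypothetical, and a production pipeline would add scheduling, validation, and many more source types.

```python
# A minimal sketch of data collation with pandas, using hypothetical sources:
# a CSV export on a file share and a table inside an application database.
import sqlite3

import pandas as pd

# "Static" data: a spreadsheet export from a local server or workstation.
local_sales = pd.read_csv("exports/regional_sales.csv")  # hypothetical path

# Application data: the database behind a business service.
conn = sqlite3.connect("crm.db")  # hypothetical database
crm_customers = pd.read_sql_query("SELECT * FROM customers", conn)
conn.close()

# Collate the silos into one analysable frame on a shared key.
combined = local_sales.merge(crm_customers, on="customer_id", how="left")

# A rudimentary catalogue entry per asset; real discovery runs continuously.
catalogue = [
    {"asset": "regional_sales.csv", "rows": len(local_sales)},
    {"asset": "crm.db/customers", "rows": len(crm_customers)},
]
print(combined.head())
print(catalogue)
```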
Today’s software can, to varying degrees, process data across formats and types, including structured and unstructured data, information in proprietary formats, and data that needs third-party processing (such as OCR, handwriting recognition, and voice-to-text technologies).
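As a small illustration of that third-party processing, the hedged sketch below extracts text from a scanned document with the open-source pytesseract OCR wrapper; the file path is invented, and a commercial platform would wrap this step inside its own ingestion pipeline.

```python
# A hedged OCR sketch: pull text out of a scanned image so it can join the
# pool of analysable data. The input path is hypothetical.
from PIL import Image

import pytesseract  # open-source wrapper around the Tesseract OCR engine

scanned = Image.open("archive/scanned_invoice.png")  # hypothetical path
text = pytesseract.image_to_string(scanned)
print(text[:200])  # preview the recovered text
```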
ANALYSIS of data and the creation of models inside analysis applications can yield both passive outcomes (what happened) and proactive outcomes (what is likely to happen, and with what probability). It’s in this regard that business-focused statistical analysis packages prove their worth: here, actionable insights can be drawn, with proposed changes to processes and strategies actively modelled and assessed.
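As a toy illustration of the passive/proactive distinction, the sketch below generates invented customer data, reports what happened (a descriptive summary), and fits a simple scikit-learn model to estimate what is likely to happen and with what probability; every name and figure here is made up for the example.

```python
# A toy model of the passive/proactive distinction, on invented churn data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
monthly_spend = rng.uniform(10, 100, 500)  # hypothetical feature
# Noisy, invented label: low spenders tend to churn, with 10% random flips.
churned = (monthly_spend < 30).astype(int) ^ (rng.random(500) < 0.1)

# Passive outcome: what happened.
print(f"Observed churn rate: {churned.mean():.1%}")

# Proactive outcome: what is likely to happen, and with what probability.
model = LogisticRegression().fit(monthly_spend.reshape(-1, 1), churned)
prob = model.predict_proba([[25.0]])[0, 1]
print(f"Predicted churn probability at $25/month: {prob:.1%}")
```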
The PRESENTATION of these facts and predictions is critical in commercial contexts because, difficult to believe, not every employee is a trained statistician or data scientist. While many organisations employ data professionals, business decision-makers with postgraduate degrees in statistics are thin on the ground. The software platform therefore needs to be accessible and usable by a broad church of users.
(It is worth noting as an aside that the solutions featured below can also accommodate the methods and favoured languages of the professional statistician-cum-data-scientist, but they are not aimed at that audience exclusively.)
The final step in the equation is the EXECUTION of change, either as improvements to existing processes based on the results drawn, or as moves into new markets or products for which the data indicates demand.
To give a single example, an analysis might find that customers only use a phone app on certain occasions to achieve a specific aim. Analysis could explain why, and unearth ways to increase engagement with the app. It might transpire that sending the messages that have had the best traction in the past through the most popular channels at key moments lifts engagement, with both elements identified statistically.
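A hedged sketch of how such an element might be identified statistically: the chi-squared test below checks whether engagement differs meaningfully across message channels, using invented counts.

```python
# A hedged sketch: does engagement differ by message channel? The counts
# below are invented; rows are channels, columns are [engaged, ignored].
from scipy.stats import chi2_contingency

observed = [
    [120, 380],  # push notification (hypothetical counts)
    [90, 210],   # in-app message
    [40, 360],   # email
]
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("Engagement differs significantly by channel; favour the strongest.")
```

A real analysis would follow a significant result with pairwise comparisons and a holdout test before changing the messaging strategy.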
The iterative nature of these processes needs to be stressed — today’s markets change daily, and the processes of discovery, parsing, analysis, insight, and action need to be continuous.
The following four vendors operate in this space and, we at Tech Wire Asia believe, are worthy of consideration by organisations seeking the data insights they need.
MINITAB
For nearly 50 years, Minitab has been helping organisations leverage the power of statistics to gain real, value-creating insights. Unlike other advanced analytics packages, Minitab offers a portfolio targeted specifically at business users, not just data scientists. That means everyone can have the power of statistics and deep data analysis at their fingertips: it’s machine learning made easy for the rest of us.
For those who need a more tailored approach, Minitab’s Solutions Analytics™ integrates software and services, making it a perfect partner for those committed to improvement and change management, no matter where they are on their transformation journey.
Often, one of the biggest challenges in data analysis isn’t the analysis itself; it’s the data silos that make data inaccessible. Minitab helps organisations unify their data from collection, integration, and preparation through to analysis. For more on Minitab’s Connect platform, click here.
Minitab’s interface creates fast, intelligent representations of the relationships between variables, revealing hidden trends, causes and effects, the impact of changing variables, and predictive insights.
While the full suite’s capabilities remain for the advanced data professional, the approachable interface makes this an invaluable business tool at any level.
We have a deeper dive into Minitab here on the pages of Tech Wire Asia; click through to read more.
DATAROBOT
Based in Boston, Massachusetts, DataRobot’s offering isn’t intended to replace the data scientist, but to make the data professional’s work a great deal easier. It achieves this by automating the development and deployment of machine learning models in commercially sensitive settings.
The emphasis on the business underpinnings of the modern data professional is a significant differentiator, as platforms developed for research into cognitive routines and machine learning were never conceived specifically for commercial use.
It’s this focus that’s attracted US$225 million from investors such as New Enterprise Associates, Sapphire Ventures, Meritech and DFJ, and the investment is clearly paying dividends.
In Australasia, for example, DataRobot has worked with disruptive finance house Harmoney (a good representative of new-generation banks, with 60 per cent of its staff described as “technically-focused”) to help the company’s data scientists achieve more than they otherwise could.
Results are measured not in descriptions of exciting, experimental machine learning concepts, but in the effects the work has had on the company’s customers: better value for borrowers, lower default risk for lenders, and growing market share for Harmoney.
Read more about DataRobot here on Tech Wire Asia.
SNOWFLAKE
Snowflake differs slightly from some of the vendors featured here in that it focuses on gathering and centralising data into a so-called “data lake”, which can then be queried. As an industry-recognised powerhouse in this sector, however, we thought its inclusion valid.
The Snowflake platform runs as a service, with the cloud-based provision offering scalable compute, data storage, and client/query handling in a single platform. Compute operations are divided and load-balanced on the fly, making it ideal for processes with inherent peaks.
Handling burst demand seamlessly via its cloud provision makes it a popular choice in settings where keeping pace with market trends is paramount and there is little tolerance for slowdown in critical systems, whatever the throughput load. Working with Snowflake is therefore, for many data scientists, a standard part of the working day.
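By way of a hedged illustration of that day-to-day query work, the sketch below uses the snowflake-connector-python package to run a query against a named virtual warehouse (the unit of compute that Snowflake scales); every credential, warehouse, and table name here is a placeholder.

```python
# A hedged sketch of querying Snowflake from Python with the official
# snowflake-connector-python package; all credentials and object names
# below are placeholders, not real values.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="analyst",            # placeholder user
    password="***",            # placeholder credential
    warehouse="REPORTING_WH",  # the virtual warehouse (compute) for this session
    database="ANALYTICS",      # placeholder database
)
cur = conn.cursor()
cur.execute("SELECT region, SUM(revenue) FROM sales GROUP BY region")  # hypothetical table
for row in cur.fetchall():
    print(row)
cur.close()
conn.close()
```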
The data core of any organisation can be replicated on the fly for business continuity assurances, which makes it a good solution for any service operating under SLAs.
With granular scalability of workloads, Snowflake is ideal for businesses looking for data processing as-a-service. Read more here.
ALTERYX
Several aspects of the Alteryx platform make it stand out in this space. For data analysis professionals at any level, a great deal of thought has gone into integration with the tools and platforms already at hand and in daily use. For more business-focused professionals, that integrative approach extends to the rest of the technology stack typically in play at enterprise level and in the cloud, via common tools found across the various business functions.
The platform is big on collaboration, making the search for data and analysis assets quicker, and its open, community basis means there is little chance of different teams replicating one another’s work. Data sets can be prepped and blended both by those happy writing their own R and by those who are entirely code-averse.
The collaborative approach to big data in Alteryx is very much suited to the enterprise setting: it is data analytics for big organisations, not for isolated data science teams operating in ivory towers! You can read more here on the pages of Tech Wire Asia — just click the link.
*Some of the companies featured in this article are commercial partners of Tech Wire Asia