Activate Your Data, People and Processes
Transform big datasets into a clear, compelling story with actionable insights:
Activate Your Data: Enable data-driven decision making and action at the speed of the business with modern data pipelines and cataloging.
Activate Your People: Raise data literacy in your business through machine-generated insights, conversational analytics and multi-user collaboration.
Activate Your Processes: Trigger immediate action and orchestration of business processes, with exception reporting to ensure no step in the process is missed.
Don't just inform action, compel action!
What is Data Analytics?
Data analytics is the process of collecting, analysing, and interpreting data to gain insights. It enables organisations to more quickly spot trends, anomalies and correlations in their data that, in turn, drive better-informed decision making.
The ultimate aim is to provide users with the information they need, when they need it, optimising every business moment through informed action.
What are the primary types of Data Analytics?
Descriptive - What happened?
The purpose of descriptive analytics is to state what has happened. The aim is to provide an easily digestible snapshot.
Diagnostic - Why did it happen?
The purpose of diagnostic analytics is to delve deeper to understand why something has happened.
Predictive - What is likely to happen?
The purpose of predictive analytics is to forecast what is likely to happen in the future, enabling organisations to plan ahead.
Prescriptive - What should our next move be?
The purpose of prescriptive analytics is to make suggestions on how to take advantage of what has been predicted.
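The four types above can be sketched on a toy dataset. This is purely illustrative: the sales figures, the naive forecast and the stocking rule are all invented for the example, not taken from any real project.

```python
# Illustrative only: toy monthly sales figures for a single product line.
sales = [120, 130, 125, 140, 150, 160]

# Descriptive - what happened? Summarise the period.
total = sum(sales)
average = total / len(sales)

# Diagnostic - why did it happen? Compare month-on-month changes
# to spot where growth accelerated or stalled.
changes = [b - a for a, b in zip(sales, sales[1:])]

# Predictive - what is likely to happen? A naive forecast that
# projects the average month-on-month change forward one month.
forecast = sales[-1] + sum(changes) / len(changes)

# Prescriptive - what should our next move be? A simple rule
# acting on the prediction (the threshold is made up for the example).
action = "increase stock" if forecast > max(sales) else "hold stock"

print(total, average, forecast, action)  # 825 137.5 168.0 increase stock
```

In practice each step uses far richer techniques than an average of differences, but the progression from summarising, to explaining, to forecasting, to recommending is the same.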
How we deliver value through data analytics
One of our favourite success stories had a single but demanding goal - improve the profitability of the business.
The challenge this manufacturer faced was seeing the wood for the trees: they had many product lines and thousands of product configurations. Ultimately, they wanted to be able to identify low-value, low-volume items, as well as items with high defect rates or long production times that were impacting overall output.
Step One: Defining the Question
To begin, we got the customer to define the questions they wanted answered and we worked back from there - utilising the questions as a guiding principle to identify the metrics, and from those metrics deriving the data sources required.
It's important that customers don't see this as a one-off process that must be right first time. Rather, start with the easy questions and identify just enough metrics to warrant a minimum viable product. We find it's highly beneficial to go through the first iteration quickly, as this delivers immediate value and often sparks the imagination.
Step Two: Data Pipeline Creation
As we knew we didn't have the full picture from the outset, we took an ETL approach and advised against implementing a data warehouse at this point. Instead, we implemented an agile semantic layer within the data analytics platform.
There were three advantages to this:
Efficiency: ETL offered a 'single step' from source to dashboard, and with data volumes only in the tens of millions of rows, reloads were still quick.
Agility: As our data pipelines and visualisations were built in a single platform, we could work in short iterations, with confidence we could make rapid changes throughout the discovery phase.
Cost-effectiveness: We weren't incurring cloud storage costs, so we could bring in any data, at any velocity, without having to perform a value-justification exercise.
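The 'single step' from source to dashboard can be pictured as one extract-transform-load pass with no intermediate warehouse. The sketch below is a minimal illustration of that idea; the field names, thresholds and inline CSV are hypothetical, and in the real project the load step fed the platform's semantic layer rather than a Python dictionary.

```python
import csv
import io

# Hypothetical raw source data, inlined for the example.
raw_csv = """product_id,units,defects
A1,1000,5
A2,12,3
A3,40000,20
"""

def extract(source: str) -> list[dict]:
    # Extract: read raw rows from the source system.
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: derive the metrics the business questions asked for -
    # defect rate per product, and a flag for low-volume items.
    # The low-volume cut-off of 100 units is invented for the example.
    out = []
    for r in rows:
        units = int(r["units"])
        out.append({
            "product_id": r["product_id"],
            "units": units,
            "defect_rate": int(r["defects"]) / units,
            "low_volume": units < 100,
        })
    return out

def load(rows: list[dict]) -> dict:
    # Load: hand the shaped records straight to the analytics layer.
    return {r["product_id"]: r for r in rows}

model = load(transform(extract(raw_csv)))
print(model["A2"])  # a low-volume product with a 25% defect rate
```

Because the whole pass lives in one place, a change to a metric definition is a quick edit and reload rather than a warehouse migration, which is what made the short discovery iterations practical.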
Step Three: Data Profiling and Visualisation
"Let's just get the data in, and visualised!" The initial visuals didn't need to be pretty, we even resorted to basic tables in some cases. The focus here is on profiling the data, identifying data quality issues or errors with in applied logic from the data pipeline. It's worth highlighting that applying fixes can require business input, which can take time, and this is why we aim to get to this stage as soon as possible.
Once the data issues were resolved, the dashboards could begin to take shape. Our design philosophy is to keep things simple, especially if the business is not mature in its data journey and data literacy is low.
We also recommend paying equal attention to UX (user experience) principles, to help users navigate the platform and data easily. Once the initial dashboards were built, the return on investment started to become obvious. Through visualising the data, it became very clear which 'blocks of the Jenga tower' needed to be removed.
Step Four: Productionise
It’s likely that by the time we’ve reached this step you’ve gone through a few iterations of the above. That’s good! It means the data has been checked, the dashboards have been refined and value has been identified through the questions you are now able to answer.
However, before we share dashboards with the users, it’s worth doing those ‘nice to haves’ to ensure what’s been built is the best it can be, for example:
- Intuitive and consistent labelling
- Correct and consistent use of colour
- Optimised and commented scripts and expressions
- Descriptions added to metrics, and metrics added to your business glossaries
This is a non-exhaustive list, but these tasks have one thing in common: done too early, they stall real progress. That's because they commonly get refined, added to or removed altogether. Therefore, wait until the dust from the development cycle has settled before finalising them.
Step Five: User Onboarding
Although this is the last step, we are actually only 50% of the way through the journey. That’s because stakeholder buy-in and user enablement are as important as the sum of the previous steps.
To get the most out of the data, and a data platform, businesses need to establish a culture around data, and in part this requires a comprehensive training programme and persistence. It’s not a one-and-done action.
To ensure this project was successful, we garnered top and middle management buy-in, and provided a bespoke training subscription: regular training sessions (monthly), drop-in workshops (bi-weekly), email support (continuous) and frequently communicated updates (ad hoc) with useful tips and tricks. This multi-channel approach keeps the data platform at the forefront of users’ minds, establishing a data-driven culture.
Qlik Cloud Analytics Platform
Qlik’s best-in-class associative engine is the only analytical engine capable of answering all the questions being asked. This is because, unlike query-based tools such as Power BI or Tableau, Qlik can answer the ‘not’ questions without being specifically coded to do so, for example: ‘Which products didn’t we sell over the last 12 months?’
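The ‘not’ question from the example amounts to a set difference between the full catalogue and the set of products with sales. The sketch below shows the hand-written logic a query-based tool would need; the product codes are invented, and the point of Qlik’s associative model is that excluded values surface automatically rather than via code like this.

```python
# Hypothetical product sets for the example.
catalogue = {"A1", "A2", "A3", "A4", "A5"}
sold_last_12_months = {"A1", "A3", "A5"}

# 'Which products didn't we sell over the last 12 months?'
not_sold = catalogue - sold_last_12_months
print(sorted(not_sold))  # ['A2', 'A4']
```

In a query-based tool, every such exclusion needs its own anti-join or NOT IN clause per question; in an associative model, selecting a time period immediately partitions values into associated and excluded.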
Qlik Cloud Analytics is also a comprehensive modern data platform. Here are a few of the capabilities we utilised for this project:
- ETL data pipelines
- Analytics catalogue
- Semantic layer, utilising QVDs
- Data visualisation
- Governed metrics and dimensions
- Business Glossaries
- Collaborative notes
- Static email reporting, using Subscriptions
- Data alerts
- Dedicated iOS and Android mobile apps
Schedule a Demo
What can you expect?
In short, we offer a simple, honest and interactive demo.
A typical agenda:
- Introductions (5 minutes)
- What are your current challenges? (5 minutes)
- High-level slides inc. pricing (5 minutes)
- Product demo and continuous Q&A (40 minutes)
- An in-depth technical demo
- A link to pricing and a bespoke proposal
But, this is your meeting, your time, so feel free to focus us on what you need from the call.
We're experts in the full end-to-end data workflow
The UK's Largest Dedicated Elite Qlik Partner
With a team of highly skilled and certified consultants specialising in Qlik, you will gain valuable insights and experience first-hand the efficiency and reliability with which we complete projects.
We take great pride in our customer-centric approach, offering flexibility and pragmatism in every interaction. We are committed to working on your terms, in the capacity that best suits your needs, to maximise your success.
A Full Range of Data Services
Your one-stop shop for all data services:
- Data Engineering
- Data Integration
- Data Visualisation
- Data Science
- Project Management
A Catalogue of Up-to-Date Qlik Training Courses
With a former Qlik trainer in the team, we've written an entire catalogue of courses to get you up-to-speed with the Qlik platform, as well as educating users on the fundamentals of data literacy. Our courses are always kept up-to-date and relevant.
Qlik isn't just part of our business, it's the whole business. Every member of our support team is certified on the Qlik platform. We'll advise on best practices, help diagnose bugs, and take a proactive stance towards a fast resolution. We are the go-to partner for Qlik support.
Agnostic to Source and Target Systems
We embrace the diversity of source and target systems, adapting seamlessly to the ones you use or have in mind for the future. Our dedicated team will accompany you on your data journey, guaranteeing we accomplish the business outcomes you have set out.