Work smarter without new tools: Power Platform in action

In this online masterclass, we’ll show you how to streamline existing processes and make them future-proof. No complex IT projects—just practical solutions within your current Microsoft environment.

Many decision-makers are unaware of the full potential of Microsoft Power Platform, and that’s a missed opportunity. This platform offers countless practical applications to optimize and digitize business processes.

The good news? Chances are, you already have it!

But are you using Power Platform to its full potential? Without a structured approach, its components are often used in isolation, limiting their impact. Curious about how your organization can unlock more value from technology you already own? In this masterclass, we’ll show you how.

Power Platform in Action: 3 Use Cases, 1 Masterclass

In this session, we’ll take you behind the scenes and show you exactly how different Power Platform applications work together to streamline business processes. No theory—just a deep dive into three powerful implementations:

  • Order2Cash – From invoice to payment without the hassle. Automate invoicing, payment reminders, and cash flow analysis with Power Automate, Power BI, and Power Apps. Fewer errors, faster payments.
  • OnboardingApp – Give new employees a smooth start. Automate system access, equipment requests, and training with smart workflows and data-driven insights.
  • Customer Order Portal – Fewer emails, more control. Let customers check their orders and statuses themselves, fully integrated with your backend systems. More efficiency for them, less manual work for you.
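These flows are built with low-code tools rather than hand-written code, but the underlying logic is simple. For instance, the reminder step in an Order2Cash flow comes down to something like this Python sketch (all names and data here are hypothetical, purely for illustration):

```python
from datetime import date, timedelta

def overdue_invoices(invoices, today, grace_days=0):
    """Return invoices whose due date (plus an optional grace period)
    has passed and that are still unpaid - the trigger condition a
    Power Automate flow would check before sending a reminder."""
    cutoff = today - timedelta(days=grace_days)
    return [inv for inv in invoices
            if not inv["paid"] and inv["due"] < cutoff]

# Hypothetical sample data
invoices = [
    {"id": "INV-001", "due": date(2025, 1, 10), "paid": False},
    {"id": "INV-002", "due": date(2025, 1, 20), "paid": True},
    {"id": "INV-003", "due": date(2025, 2, 1),  "paid": False},
]

# On 15 January, only INV-001 is both overdue and unpaid
reminders = overdue_invoices(invoices, today=date(2025, 1, 15))
```

In the real flow, Power Automate evaluates this condition on a schedule and triggers the reminder email; no code is required.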

You’ll also discover how Copilot enhances automation with AI-driven intelligence. Copilot helps you leverage data more effectively, speed up processes, and improve workflows proactively—without adding complexity.

What’s next after this webinar?

That depends on where your organization stands today. Here are three common paths we see with our clients:

1. Do nothing – and keep struggling with emails and Excel.

Sounds a bit harsh, but it’s what many teams end up doing (often unintentionally). Everything stays the same: manual processes, error-prone tasks, and lost time.

2. Start a small-scale pilot on your own.

You’ve got the ideas—now it’s time to act. Start with one painful process—think onboarding, order handling, or reporting. Build your own solution with Power Automate or Power Apps and learn as you go.
👉 Tip: look for someone on your team who already knows their way around Power BI or SharePoint—they’re often up and running faster than expected.

3. Map your opportunities together with us.

We’d love to help you figure out:
📌 Where the biggest wins are
⚙️ Which processes can be smartly automated using what you already have
💡 How to make sure users actually start using it

👉 Plan a no-strings-attached session – and let’s explore what works best for your situation.

Sign up to get instant access to the webinar!

Want to keep talking about this topic?

Get in touch with Alexander!

AI Agents: Your New Colleague in Data Analysis

Making data-driven decisions doesn’t have to be complicated. With AI agents, you no longer need endless dashboards or manual reporting. These digital colleagues deliver insights within seconds, spot trends, and even build your presentation. In this article, you’ll discover how AI agents like Zebra BI and Microsoft Fabric Data Agent are already transforming the way organizations work with data.

We’ve all been there. You open an Excel file, see a mountain of sales data, and wonder: Where do I start? Now imagine skipping the filtering, clicking, and chart building. Imagine pressing a button and instantly getting a clear overview, complete with insights and ready-made slides for your next meeting. Good news: that’s not the future, it’s already possible. Thanks to AI agents.

Stop drowning in data, start using it

The concept of data-driven work has been around for years, but reality often lags behind. According to 360Suite, only 26% of employees effectively use their BI tools. Not because they don’t want to, but because it remains complex. Reports take time to create. Extracting insights requires expertise. And as a result, decisions get delayed.

That’s exactly where an AI agent helps. It makes data accessible, understandable, and immediately actionable—in seconds.

What exactly is an AI Agent?

It’s not a robot. It’s not a chatbot. It’s simply a smart digital colleague that helps you make sense of your data. Zebra BI is a perfect example. You upload your dataset, and within moments, you get a visual overview with clear conclusions. The tool identifies trends, suggests next steps, and even helps build your presentation.

Microsoft Fabric Data Agent takes it even further. It monitors your data in real-time. Think of it as a colleague who immediately alerts you when something unusual happens, like a sudden drop in revenue. It provides context and recommendations, so you no longer depend on monthly reports. You can take action right away.
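Under the hood, "alerting when something unusual happens" is anomaly detection: flag a value that falls far outside its recent range. A toy Python illustration of the idea (not Fabric's actual algorithm):

```python
from statistics import mean, stdev

def is_anomaly(history, latest, threshold=2.0):
    """Flag `latest` when it sits more than `threshold` standard
    deviations away from the mean of recent history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

daily_revenue = [102, 98, 105, 101, 99, 103, 100]  # a stable week

is_anomaly(daily_revenue, 100)  # in line with history: no alert
is_anomaly(daily_revenue, 60)   # sudden drop: alert
```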

The best part? You interact with these tools as you would with a colleague. Just type, “Why has the margin dropped in the South region?” and you get a clear, straightforward answer.

What does this mean for data professionals?

AI doesn’t replace your work, it makes it better. You spend less time building dashboards and more time where it matters: helping colleagues make smart decisions. Your role shifts from builder to advisor, from creator to coach.

You ensure the data is accurate. You help shape the right questions. And you make the difference between simply reporting and actually improving.

What does this mean for your organization?

AI agents have a real impact. They reduce the pressure on scarce specialists. They speed up decision-making. And most importantly, they help your organization become more agile. When data reaches the right people faster, they can act faster. It’s that simple.

According to Accenture, employees lose an average of five workdays per year due to poor data literacy. That’s a full workweek per person: time and money wasted.

Meanwhile, G2 reports that the use of BI solutions has increased by 49% since the pandemic. The need is clear. The tools exist. The only thing left is to use them wisely.

The future is already here

AI agents are reshaping how we work. They make data accessible. They help you make faster decisions. And they lead to better conversations, better choices, and better results. You don’t have to wait until you’re “ready.” You can start today.

Want to see how close this future really is? Watch the video below, where the CEO of Zebra BI demonstrates how it works: live, simple, and with no gimmicks.

  • Video: AI Agents with Zebra BI

    By Andrej Lapajne, Founder & CEO of Zebra BI, during our Data & Automation Pitstop 2025.

Ready to experience it for yourself?

Curious how AI agents could help your team? Schedule an introductory chat with Alexander. No sales pitch. Just an open conversation about how you can work with data faster and smarter. The technology is ready, now it’s your move.

How Schiphol puts data at the heart of every key decision

Schiphol Group is transforming how decisions are made by putting data at the center of daily operations. In this article, Maarten van den Outenaar explains how the airport empowers employees, aligns data with strategy, and navigates technology adoption at the right pace.

How can organizations ensure every critical decision is genuinely supported by data? Maarten van den Outenaar, Chief Data Officer at Schiphol Group, emphasizes a strategic vision built around empowering employees to integrate data into their daily decisions. Schiphol’s vision, “Data at the heart of every key decision,” reflects a broader organizational transformation towards data-driven decision-making.

Why data-driven decisions matter

Organizations continuously face the challenge of timing their technology adoption effectively. As history shows through various industrial revolutions, adopting technology too early or too late both carry risks. Schiphol recognized that integrating data effectively into the organization’s decision-making processes required a balanced and strategic approach.

Van den Outenaar highlights three primary ways data can improve organizational outcomes:

  • Efficiency: Streamlining existing processes.
  • Effectiveness: Enhancing decision quality and outcomes.
  • Diversification: Applying successful practices across various organizational domains.

Schiphol’s strategic vision: Autonomous Airside

Schiphol Group’s strategic goal, encapsulated as “Connecting your world with the most sustainable and high-quality airport,” focuses on quality network connections, customer service excellence, employee satisfaction, community engagement, and sustainability.

Central to this vision is the concept of an autonomous airside: a fully electrified airport area leveraging data and IT to minimize environmental impact and optimize operational efficiency. However, Schiphol remains cautious about full decision automation, with human judgment playing a vital role even at advanced stages.

Aligning data strategy with organizational decisions

To achieve meaningful integration of data into everyday decisions, Schiphol approached departmental leaders, asking them to identify their top three decisions. This approach ensured:

  • Clear prioritization of data initiatives.
  • Strong alignment with strategic objectives.
  • Practical focus on measurable outcomes.

By clearly defining and supporting key decisions, Schiphol ensures that data initiatives genuinely impact strategic objectives.

Empowering employees through data

Effective data strategies require putting employees at the center. Employees are empowered through education, support, and the integration of digital expertise from Schiphol’s Center of Excellence. Van den Outenaar emphasizes a collaborative approach, combining technical knowledge with domain expertise to effectively bridge the gap between rapid technological developments and employee adoption rates.

Navigating technology adoption

Organizations face continuous pressure from rapidly advancing technologies, often outpacing organizational readiness. Schiphol tackles this issue using the S-curve model—common in investment circles—to carefully time technology adoption. The S-curve approach:

  • Guides incremental maturity growth in data capabilities.
  • Enables timely assessments and adjustments.
  • Facilitates agile, strategic progression towards data maturity.

This incremental and evaluative approach ensures organizations adapt at a sustainable pace, avoiding technology fatigue and resource misallocation.

Matching employee needs with technological capabilities

Successful implementation of data initiatives relies on identifying and prioritizing employee tasks that most benefit from technological assistance. Schiphol pairs employee needs with technological potential, creating actionable matrices that highlight optimal implementation opportunities.

Key considerations include:

  • Identifying tasks suitable for automation or enhancement.
  • Recognizing employee openness to technology.
  • Prioritizing initiatives with significant impact potential and high employee enthusiasm.

Practical Impact and Real-world Application

Van den Outenaar underscores the importance of tangible improvements. By clearly demonstrating how data initiatives enhance employee performance and organizational outcomes, Schiphol achieves broader organizational buy-in. Specific reports and dashboards illustrate the practical benefits directly linked to employee needs, reinforcing strategic alignment.

Continuous Improvement through strategic alignment

Schiphol’s approach to data strategy highlights the importance of continuous improvement through strategic alignment, employee empowerment, and agile technology adoption. By placing data at the heart of key decisions and strategically aligning technological maturity with organizational readiness, Schiphol demonstrates a robust blueprint for successful data integration.

Curious where your organization stands when it comes to data-driven decision-making?

Request a free Data Maturity Scan and discover where your biggest opportunities lie. Alexander is happy to walk you through it.

Artificial Intelligence is ready!

In this interview, Jonathan Aardema talks with Prof. Eric Postma (professor of Cognitive Science and Artificial Intelligence at Tilburg University) about the why, how, and what of artificial intelligence applications. What do we see in practice, and what does science say about it?

Read more

Visiting London for the Tableau Partner Executive Kick-Off 2020

Every year Tableau invites its most valuable partners to kick off the new year together. The theme for this year was Accelerate, so let’s get right to the point. This exciting event was focused on three main areas.

Read more

Mastering DAX

Keeping your skills up to date is crucial when you work with the newest technology. At Rockfeather, we challenge each other to be the best version of ourselves. That’s why I attended the Mastering DAX course. DAX (Data Analysis Expressions) is a formula expression language. Besides Power BI, DAX is used in Excel Power Pivot and in tabular models in SQL Server. Learn it once, use it tomorrow.

Read more
All posts

How do you choose the right KPIs, without losing sight of your strategic goals?

Selecting effective Key Performance Indicators (KPIs) is often more complicated than it appears. Organizations frequently find themselves measuring numerous KPIs without achieving clarity on performance or strategy alignment.

Prefer watching to reading? Check out the video at the top of this page to hear Bernie Smith explain it himself.

Bernie Smith, a recognized expert in KPI development, emphasizes this challenge through a compelling analogy: poorly coordinated decisions, much like a street renovation immediately followed by tree removal, lead to meaningless outcomes despite good intentions and substantial investments.

The Persistent Challenge: Aligning KPIs with Strategy

Smith notes a recurring issue in organizations: a disconnect between chosen KPIs and strategic objectives. Many KPIs, although individually logical, fail collectively because they are not aligned or prioritized correctly. This misalignment wastes effort and resources, and ultimately creates confusion rather than clarity.

To address this challenge, Smith developed the Results Orientated KPI System (ROKS): a structured seven-step process that aligns KPIs directly with strategic outcomes.

Introducing the KPI Tree Concept

Central to the ROKS method is the idea of KPI Trees, an evolution of the traditional strategy mapping approach popularized by Kaplan and Norton. KPI Trees visually break down broad strategic objectives into specific, measurable elements, enabling better alignment and clarity.

Smith illustrates the KPI Tree method with a simple personal example: aiming for good health. The top-level strategic objective—being healthy—is broadly accepted but not directly measurable. Breaking it down, Smith identifies measurable components such as exercise frequency, sleep quality, and dietary habits. Each subsequent level adds specificity, eventually reaching clearly measurable KPIs, like daily calorie intake.
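That decomposition can be sketched as a small tree structure, where only the leaves are directly measurable. The node names below follow Smith's health example; the code itself is just an illustration:

```python
# A KPI Tree as nested dicts: branches decompose an objective,
# leaves (None values) are the directly measurable KPIs.
kpi_tree = {
    "Be healthy": {
        "Exercise": {"Workouts per week": None},
        "Sleep": {"Hours of sleep per night": None},
        "Diet": {"Daily calorie intake": None},
    }
}

def leaf_kpis(node):
    """Collect the measurable leaves of a KPI Tree."""
    leaves = []
    for name, child in node.items():
        if child is None:
            leaves.append(name)
        else:
            leaves.extend(leaf_kpis(child))
    return leaves

leaf_kpis(kpi_tree)
# ['Workouts per week', 'Hours of sleep per night', 'Daily calorie intake']
```

The top-level objective never appears in the measurable list; it is reached only through its leaves, which is exactly the point of the tree.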

Benefits of KPI Trees:

  • Clarity and Simplicity: Complex strategies become clear and actionable.
  • Alignment and Engagement: Collaborative construction of KPI Trees fosters consensus and team buy-in.
  • Visualizing Conflicts: Clearly illustrates potential conflicts between KPIs, ensuring balanced decision-making.

Dealing with KPI Overload: The Importance of Shortlisting

Despite their advantages, KPI Trees can lead to an overwhelming number of potential KPIs. Smith identifies two typical mindsets regarding KPIs:

  • Dreamers: Those inclined to measure everything imaginable.
  • Pragmatists: Those cautious about adding KPIs, mindful of measurement complexity and workload.

To manage this tension, the ROKS method employs a rigorous shortlisting process, prioritizing KPIs based on their strategic importance and ease of measurement. This critical step ensures that only relevant, manageable, and impactful KPIs remain.
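One way to picture that shortlisting step: score each candidate KPI on strategic importance and ease of measurement, then keep only those above a cut-off. The scoring formula and cut-off below are our own illustration, not Smith's actual worksheet:

```python
def shortlist(candidates, min_score=12):
    """Score each candidate KPI as importance * ease (both 1-5)
    and keep only those at or above the cut-off, best first."""
    scored = [(kpi, imp * ease) for kpi, imp, ease in candidates]
    kept = [(kpi, s) for kpi, s in scored if s >= min_score]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

candidates = [
    # (KPI, strategic importance 1-5, ease of measurement 1-5)
    ("On-time delivery %",  5, 4),
    ("Employee happiness",  4, 2),
    ("Website visits",      2, 5),
    ("Customer churn rate", 5, 3),
]

# Only the high-importance, measurable candidates survive the cut
final_kpis = shortlist(candidates)
```

The dreamers' long wish list goes in; a short, defensible list comes out.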

Practical Scalability and Adaptability

A significant advantage of KPI Trees and the ROKS method is their scalability. They are designed for adaptability across different organizational contexts—from banks and universities to manufacturing firms. Smith highlights that, although KPI Trees can initially appear complex, they become practical and reusable tools, easily adjusted to fit unique organizational processes.

Strategic Alignment as the Ultimate Goal

Ultimately, effective KPI selection is less about having many KPIs and more about choosing the right ones. Smith’s ROKS method and KPI Trees help ensure every KPI directly supports strategic objectives, clearly linking performance measurement with organizational goals.

In conclusion, organizations need a systematic approach like the ROKS method to avoid common pitfalls in KPI selection. Aligning KPIs clearly with strategic objectives ensures clarity, relevance, and actionable insights: critical components for sustainable organizational success.

Curious how this approach could help sharpen your own KPIs?

Reach out to Alexander. He'll be happy to explore how KPI Trees and the ROKS method can bring more focus and alignment to your organization.

Is AI-driven decision making the answer to the low adoption of Business Intelligence?

Business intelligence (BI) has long promised organizations the ability to make smarter, data-driven decisions. Yet, despite significant investments, BI adoption remains low, with only 26% of enterprises fully utilizing their analytics tools. Executives remain skeptical, with just 32% confident in their ability to make meaningful data-driven decisions, while a lack of adequate data skills has resulted in inaccurate decisions for 41% of executives.

The question, therefore, arises: can AI-powered analytics provide the breakthrough organizations need?

The Limits of Traditional BI

The persistently low adoption rates of traditional BI tools stem from several fundamental challenges:

  • Complex and unintuitive user experiences discourage widespread usage.
  • Employees frequently feel overwhelmed by the complexity of data analytics.
  • Specialized knowledge and training are necessary to leverage traditional BI effectively, creating bottlenecks.

These barriers result in missed opportunities and potentially costly business mistakes.

AI: Bridging the Gap in Data Adoption

Tools incorporating artificial intelligence, like Zebra BI, offer a new approach by directly addressing these barriers:

  • Automation of Analytical Processes: AI-driven tools can instantly generate insights from raw data, significantly reducing manual effort and time spent on data preparation.
  • Intuitive Interaction: AI enables users to engage naturally with data, posing questions in conversational language and receiving immediate, actionable insights.
  • Democratization of Data: By eliminating the need for specialized technical knowledge such as complex scripting languages (e.g., M and DAX), AI makes sophisticated data analytics accessible to a broader range of business users.

Data Quality Remains Paramount

Despite the impressive capabilities of AI, quality data remains the critical foundation for any meaningful analytics initiative. If anything, AI reinforces the necessity of maintaining accurate, well-organized data sets: without high-quality input, its outputs remain unreliable.

Key Benefits of AI-Enhanced Analytics

The incorporation of AI into data analytics offers several notable advantages:

  1. Rapid Processing Power: Quickly analyzing vast datasets to identify key trends.
  2. Enhanced Pattern Recognition: Automatically detecting and explaining data anomalies and trends.
  3. Cost Efficiency: Significantly reducing dependence on costly external consultants and specialized infrastructure.
  4. Interactive Collaboration: Facilitating dynamic interactions between users and AI agents to produce insightful and collaborative analyses.
  5. Greater Accessibility: Empowering all users, regardless of technical skill level, to meaningfully engage with data.

Making Analytics Actionable

Beyond producing insights, the real value of analytics lies in the ability to make informed, actionable decisions swiftly. AI-driven analytics tools streamline the process from data collection to actionable insights, dramatically shortening the decision-making cycle. This transforms not only individual decisions but potentially reshapes entire organizational approaches to analytics.

The Future of Business Intelligence

AI-driven analytics represents a significant evolution in business intelligence, offering organizations an opportunity to overcome persistent barriers to data adoption. Companies embracing AI tools like Zebra BI position themselves strategically to achieve higher levels of analytical maturity, though the journey invariably begins with robust, high-quality data.

As we move forward, the question is no longer whether organizations should consider AI-driven analytics, but how quickly they can adapt their processes and culture to harness its potential fully.

If you are interested in exploring the practical aspects of AI-driven analytics, the presentation at the top of this page by Andrej Lapajne offers deeper insights and valuable perspectives.

Sources: 360Suite’s Business Intelligence Survey (2020), Accenture (2020), G2 (2023), Forrester (2022), Datacamp (2023)

Interested in discussing how your organization can leverage AI-driven analytics?

Contact Alexander to explore tailored solutions and strategies for your analytics journey.

How to Establish a Single Source of Truth?

Many companies collect data but struggle with fragmentation, inconsistent reports, and inefficient processes. The result? Data is scattered everywhere, leading to decisions based on conflicting figures. How can your organization ensure it operates with one reliable Single Source of Truth (SSOT)?

Why is a Single Source of Truth Essential?

Businesses consist of people, processes, and systems—three key components that keep an organization running. While data is captured in systems daily, it often remains siloed across different departments and tools.

The Consequences of Fragmented Data

🚨 Inconsistent Reports – Different departments rely on different numbers.
⏳ Manual Work – Significant time is wasted collecting, merging, and verifying data.
🔎 Limited Insights – Data is not fully utilized for strategic decision-making.

A Single Source of Truth eliminates these challenges, ensuring everyone works with one unified version of the truth.

How Do You Centralize Data from Operational Systems?

To extract and make data usable for analysis, a structured process with three key steps is required:

  1. Ingestion (Extract & Load) – Data is retrieved from source systems and stored centrally.
  2. Transformation (Transform) – Data is cleaned, combined, and enriched with business logic.
  3. Distribution (Serve) – Data is prepared for use in reports, dashboards, and analytics.

This process, managed by data engineering, forms the foundation of an SSOT.
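As a concrete, deliberately tiny illustration, the three steps can be sketched in plain Python. Real pipelines would use tools such as Airbyte, dbt, or Azure Data Factory; the data below is made up:

```python
# 1. Ingestion (extract & load): pull raw rows from source systems
#    into one central store.
raw_orders = [
    {"order_id": 1, "amount": "250.00", "region": "north"},
    {"order_id": 2, "amount": "100.50", "region": "SOUTH"},
]

# 2. Transformation: fix types, normalize values, apply business logic.
orders = [
    {"order_id": r["order_id"],
     "amount": float(r["amount"]),
     "region": r["region"].capitalize()}
    for r in raw_orders
]

# 3. Distribution (serve): expose an aggregate ready for a dashboard.
revenue_by_region = {}
for o in orders:
    revenue_by_region[o["region"]] = (
        revenue_by_region.get(o["region"], 0) + o["amount"]
    )

revenue_by_region  # {'North': 250.0, 'South': 100.5}
```

Every serious data stack, no-code or pro-code, implements some version of these three steps; the SSOT is what sits in the middle.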

How Mature is Your Organization in Data Management?

Not every company is at the same stage in becoming data-driven. There are three main maturity levels:

Level 1 – Basic (Initial Stage)

  • Data is primarily managed manually, often in Excel.
  • Little to no automation is in place.
  • The focus is on data collection rather than analysis or optimization.

Challenge: How to gain control over data without adding complexity?
Solution: Start with a simple database or a no-code solution like TimeXtender.

Level 2 – Growth (Scaling Stage)

  • The organization integrates multiple data sources and establishes a foundational data architecture.
  • Dashboards and automated reports are in use.
  • The main hurdle is data integration and management—ensuring seamless collaboration between systems and data sources.

Challenge: How to make data accessible and useful across the organization?
Solution: A Best-of-Breed approach with tools like Snowflake for storage and dbt for transformation.

Level 3 – Advanced (Innovation Stage)

  • Data is actively used for predictive analytics.
  • Machine learning and AI optimize business processes.
  • A Lakehouse architecture and real-time data processing are in place.

Challenge: How to leverage data as a competitive advantage?
Solution: A pro-code approach using platforms like Databricks and Microsoft Fabric for maximum flexibility and scalability.

💡 Where does your organization stand, and what is the next step?

Technology Choices: Selecting the Right Data Architecture

Choosing the right data architecture depends on data volume, technical expertise, and existing IT infrastructure.

Option 1: All-in-One Platform (e.g., TimeXtender)

  • Quick implementation with minimal technical expertise.
  • Ideal for businesses looking for a fast setup without a large data engineering team.

⚠️ Less flexibility and potential vendor lock-in.

Option 2: Best-of-Breed Architecture

  • Maximum flexibility by selecting the best tools for each component.
  • Combines solutions like Airbyte, Azure Data Factory, and Snowflake.

⚠️ Requires greater technical expertise and ongoing maintenance.

Option 3: Lakehouse Architecture

  • Ideal for organizations handling large data volumes and advanced analytics.
  • Data is stored in a data lake, making it cost-effective for large datasets.

⚠️ Requires a team skilled in Python and SQL.

No-Code, Low-Code & Pro-Code: How Much Control Do You Need?

A crucial decision in data engineering is the level of control and technical expertise required.

No-Code: Quick deployment without technical knowledge.
Suitable for companies without data engineers.
Tools like TimeXtender and Airbyte simplify data integration.

Low-Code: A balance between ease of use and customization.
Ideal for businesses with some data expertise.
Solutions like Azure Data Factory, dbt, and Dagster offer versatility without deep programming skills.

Pro-Code: Full control for technical teams.
For organizations with a strong data engineering team.
Platforms like Databricks, Snowflake, and Microsoft Fabric provide maximum flexibility and scalability.

What Are the Benefits of a Single Source of Truth?

  • Faster, more informed decision-making – Everyone works with the same reliable data.
  • Automated processes – Reducing errors and increasing efficiency.
  • Unlocking data value – Enabling predictive insights and strategic decision-making.
  • Building a scalable data solution – Growing with the organization.

By choosing the right architecture and technology, any organization can transition to an SSOT.

Conclusion: How to Make the Right Choice?

The ideal data architecture depends on:
🔹 Data volume – Are you dealing with gigabytes or petabytes?
🔹 Technical expertise – Do you have a team of data engineers or primarily business users?
🔹 Existing IT infrastructure – Are you working with Microsoft Azure, Google Cloud, or AWS?
🔹 Required flexibility – Do you need a quick solution or maximum control?

There is no one-size-fits-all solution, but asking the right questions will help align your data strategy with business goals.

Next Steps: What Can You Do Now?

  1. Take the Data Maturity Scan to determine your organization’s level of data maturity.
  2. Contact Alexander Mik to discuss a tailored solution for your organization.

Continue the conversation with Alexander

Data Maturity Scan

Data is everywhere, but is your organization making the most of it? Many companies invest in dashboards, BI tools, and AI but lack a structured approach to leveraging data for strategic decision-making. We believe that your solid data & automation ideas deserve to take flight. With the Data Maturity Scan, you’ll discover where your organization stands on the Data Maturity Ladder and receive a concrete action plan to take your data strategy to the next level.

Why would you do this scan?

The Data Maturity Scan is built on the DELTA Plus Model by Thomas H. Davenport, a recognized framework that helps organizations assess and enhance their data maturity. By completing the scan, you will receive:

  • Insight into your data maturity – understand where your organization stands on the data maturity scale.
  • Targeted recommendations – concrete steps to strengthen data-driven decision-making.
  • Industry benchmarking – compare your data strategy against market standards.

What is the Data Maturity Ladder?

The Data Maturity Ladder helps organizations gauge their analytics maturity and growth potential. The five stages of data maturity are:

  1. Analytical Beginner – data is inconsistent, fragmented, and decisions are made based on intuition.
  2. Localized Analytics – some departments use data, but silos limit its full potential.
  3. Analytical Aspirations – centralized data storage and structured analytics processes are emerging.
  4. Analytical Companies – data and analytics are embedded in core processes and support strategic decision-making.
  5. Analytical Competitors – AI, machine learning, and automated decision-making are deployed at scale for a competitive advantage.

What does this Data Maturity Scan measure?

The scan evaluates your organization across eight critical pillars of data maturity:

  • Data Governance & Management – how reliable, accessible, and well-managed is your data?
  • Enterprise Data Strategy – to what extent is there a centralized, organization-wide approach to data and analytics?
  • Leadership & Culture – how engaged is leadership in fostering a data-driven culture?
  • Strategic Use of Analytics – is data being leveraged for strategic decision-making and business growth?
  • Analytics Skills & Talent – does your organization have the right expertise to extract value from data?
  • Technology Infrastructure – does your IT stack support advanced data analytics and AI?
  • Data Analysis Techniques – what analytical methods and models are currently being used?
  • Adoption & Integration – how well is analytics embedded into everyday processes and decision-making?

Why is data maturity important?

According to the International Institute for Analytics (IIA), the average organization has a data maturity score of 2.2 on a 5-point scale. This indicates that many companies are still in the early stages of fully utilizing their data. Understanding where your organization stands allows you to take the right steps toward a more strategic use of analytics.

What are the benefits of increasing data maturity?

  • Faster and more informed decision-making – data is actively used in both operational and strategic processes.
  • More efficient processes – reduce inefficiencies in data integration and manual analysis.
  • Stronger competitive position – advanced analytics provide deeper insights and a strategic advantage.
  • Better AI and Machine Learning implementation – well-managed data forms the foundation for successful AI adoption.

Take the scan and strengthen your data strategy

In just a few minutes, you’ll gain clarity on where your organization stands and what steps are needed to maximize the impact of data.

Want to discuss your Data Maturity score?

Send Jonathan a message!

You might also find this interesting!

Data Science training as the next step in data maturity - Greenchoice

Greenchoice has already taken considerable steps with an ambitious data strategy, a future-proof data architecture, and a rapidly growing number of end users in its data visualisation environment. To take the next step in data maturity, Greenchoice developed an in-company Data Science training in collaboration with Rockfeather. The training's main objectives were to identify, develop, and implement Data Science use cases. Alex Janssen, Manager Development Consumer and Data & Analytics, explains what this training has delivered.

Turning Raw Data into Delicious Insights

As a well-known maker of appetizers, Signature Foods ran its analyses in Excel, drawing on various decentralized data sources. The challenge was to harmonize and visualize the data from these sources. Rockfeather helped Signature Foods implement Power BI in combination with Zebra BI to visualize information effectively, introduced the IBCS® standards, and automated workflows with Power Automate.

Looking back at the Future of Data & Analytics 2023

On November 14th, 2023, the Future of Data & Analytics took place in Sparta-Stadium 'Het Kasteel' in Rotterdam. Leading organizations told us their stories on business digitalization. This page will give you access to the slides of the different sessions.

During this edition, we focused on three main areas:

  • Learning how your organization can make the most of the latest trends in digital transformation and the adoption of data and analytics.
  • Discovering how you, as a finance or data & analytics professional, can make fact-based, data-driven decisions.
  • Being inspired by speakers who shared their best practices for implementing a successful data & analytics strategy.

Speakers

Keynote Speaker: Artificial Intelligence in 60 Minutes by Job van den Berg

Developments around digitalization, data, and AI are moving at lightning speed. Anyone who wants to build future-proof companies will have to innovate. In his inspiring keynote, technology connoisseur and data expert Job van den Berg showed us how to make data and AI work for us. As co-author of the book “Artificial Intelligence in 60 Minutes” and based on his years of experience as a data expert for large organizations such as DPG Media, Bluefield and Kantar, he helped us pick the low-hanging fruit.

Session: Insights in the Activity Based Costing program at Kramp

Kramp is Europe’s largest specialist in agricultural parts and accessories. In this session, Charles Forsskahl, Strategic Program Manager at Kramp, explained the Activity Based Costing program: the journey of shifting focus from sales to profitability at the product and customer level, and how the resulting insights were used to improve the supply chain and assortment.

Session: Insights in Trading & Forecasting using AI/ML at Greenchoice

Greenchoice is an energy company focused on green energy, offering sustainable energy solutions and striving for a climate-neutral future. Maurice Koenen, Sourcing and Portfolio Director at Greenchoice, took us through his story of how data brings the green energy transition closer for both consumers and the business market. He shared the positive impact and potential of an Energy Data Platform (EDP) on daily operations. Additionally, Ties (Rockfeather) discussed the latest AI/ML techniques applied to improve and further automate Trading & Forecasting processes within Greenchoice.

Keynote speaker: data strategy at Schiphol Group

Maarten van den Outenaar, Head of Data & Analytics at Schiphol Group, took us through how Schiphol Group is setting up its data strategy 2024 – 2026. He also talked about what has been achieved in recent years and gave some practical examples of how Schiphol Group uses Data & Analytics in practice and how they deal with resistance to adoption.

By filling in the form below, you will get instant access to the slides of keynote speaker Job van den Berg, Kramp, and Greenchoice!

Sign up to get instant access to the slides!

The slides are yours!

  • Keynote Job van den Berg: Artificial Intelligence in 60 Minutes

    Download
  • Insights in the Activity Based Costing program at Kramp

    Download
  • Insights in Trading & Forecasting using AI/ML at Greenchoice

    Download

Deep Dive Power Apps and Outsystems

In this webinar, our experts compare Power Apps and Outsystems for Low-Code on four aspects. In just over 10 minutes, you will learn the differences between the two solutions. Don't miss out: watch this Deep Dive!

Webinar language: English.

Low-code is the modern approach to developing, deploying, and managing applications. Want to develop a new application from scratch in a fraction of the time? Or create a faster, more intelligent version of an existing application? A Low-Code Application Platform makes it possible for developers of all experience levels to quickly and easily create custom applications for web and mobile. But which platform is best for you?

To help you make a better-informed decision, our Power Apps and Outsystems experts developed this deep-dive webinar, in which they compare Power Apps and Outsystems on four aspects. In just over 10 minutes, you will learn the differences between the solutions, focusing on:

  • Data Connection
  • Front End Development
  • Integration of Business Processes
  • Use cases for both solutions

Sign up below and view the webinar instantly!

Sign up to get instant access to the webinar!