How to Establish a Single Source of Truth?

Many companies collect data but struggle with fragmentation, inconsistent reports, and inefficient processes. The result? Data is scattered everywhere, leading to decisions based on conflicting figures. How can your organization ensure it operates with one reliable Single Source of Truth (SSOT)?

Why is a Single Source of Truth Essential?

Businesses consist of people, processes, and systems—three key components that keep an organization running. While data is captured in systems daily, it often remains siloed across different departments and tools.

The Consequences of Fragmented Data

🚨 Inconsistent Reports – Different departments rely on different numbers.
⏳ Manual Work – Significant time is wasted collecting, merging, and verifying data.
🔎 Limited Insights – Data is not fully utilized for strategic decision-making.

A Single Source of Truth eliminates these challenges, ensuring everyone works with one unified version of the truth.

How Do You Centralize Data from Operational Systems?

To extract data from source systems and make it usable for analysis, a structured process with three key steps is required:

  1. Ingestion (Extract & Load) – Data is retrieved from source systems and stored centrally.
  2. Transformation (Transform) – Data is cleaned, combined, and enriched with business logic.
  3. Distribution (Serve) – Data is prepared for use in reports, dashboards, and analytics.

This process, managed by data engineering, forms the foundation of an SSOT.
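
The three steps above can be sketched in a few lines of code. The following is an illustrative Python sketch only: the source records, table names, and the in-memory SQLite database standing in for a central warehouse are all assumptions for the example, not a production implementation.

```python
import sqlite3

# 1. Ingestion (Extract & Load): pull raw records from a source system
#    (here, a hypothetical list standing in for an ERP/CRM export)
raw_orders = [
    {"order_id": 1, "amount": "100.50", "region": "north"},
    {"order_id": 2, "amount": "75.00",  "region": "NORTH"},
]

conn = sqlite3.connect(":memory:")  # stand-in for a central data warehouse
conn.execute("CREATE TABLE raw_orders (order_id INT, amount TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (:order_id, :amount, :region)", raw_orders
)

# 2. Transformation: clean types and normalize values with business logic
conn.execute(
    """CREATE TABLE orders AS
       SELECT order_id,
              CAST(amount AS REAL) AS amount,
              LOWER(region)        AS region
       FROM raw_orders"""
)

# 3. Distribution (Serve): expose an aggregate ready for a report or dashboard
revenue_by_region = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"
).fetchall()
print(revenue_by_region)  # [('north', 175.5)]
```

In a real stack, each step would typically be handled by a dedicated tool (an ingestion tool, a transformation layer, a serving layer), but the flow of the data is the same.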

How Mature is Your Organization in Data Management?

Not every company is at the same stage in becoming data-driven. There are three main maturity levels:

Level 1 – Basic (Initial Stage)

Data is primarily managed manually, often in Excel.
Little to no automation is in place.
The focus is on data collection rather than analysis or optimization.
Challenge: How to gain control over data without adding complexity?
Solution: Start with a simple database or a no-code solution like TimeXtender.

Level 2 – Growth (Scaling Stage)

The organization integrates multiple data sources and establishes a foundational data architecture.
Dashboards and automated reports are in use.
The challenge is data integration and management—ensuring seamless collaboration between systems and data sources.
Challenge: How to make data accessible and useful across the organization?
Solution: A Best-of-Breed approach with tools like Snowflake for storage and dbt for transformation.

Level 3 – Advanced (Innovation Stage)

Data is actively used for predictive analytics.
Machine learning and AI optimize business processes.
A Lakehouse architecture and real-time data processing are in place.
Challenge: How to leverage data as a competitive advantage?
Solution: A pro-code approach using platforms like Databricks and Microsoft Fabric for maximum flexibility and scalability.

💡 Where does your organization stand, and what is the next step?

Technology Choices: Selecting the Right Data Architecture

Choosing the right data architecture depends on data volume, technical expertise, and existing IT infrastructure.

Option 1: All-in-One Platform (e.g., TimeXtender)

Quick implementation with minimal technical expertise.
Ideal for businesses looking for a fast setup without a large data engineering team.
⚠️ Less flexibility and potential vendor lock-in.

Option 2: Best-of-Breed Architecture

Maximum flexibility by selecting the best tools for each component.
Combines solutions like Airbyte, Azure Data Factory, and Snowflake.
⚠️ Requires greater technical expertise and ongoing maintenance.

Option 3: Lakehouse Architecture

Ideal for organizations handling large data volumes and advanced analytics.
Data is stored in a data lake, making it cost-effective for large datasets.
⚠️ Requires a team skilled in Python and SQL.

No-Code, Low-Code & Pro-Code: How Much Control Do You Need?

A crucial decision in data engineering is the level of control and technical expertise required.

No-Code: Quick deployment without technical knowledge.
Suitable for companies without data engineers.
Tools like TimeXtender and Airbyte simplify data integration.

Low-Code: A balance between ease of use and customization.
Ideal for businesses with some data expertise.
Solutions like Azure Data Factory, dbt, and Dagster offer versatility without deep programming skills.

Pro-Code: Full control for technical teams.
For organizations with a strong data engineering team.
Platforms like Databricks, Snowflake, and Microsoft Fabric provide maximum flexibility and scalability.
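
To make the pro-code end of this spectrum concrete: full control means you can express business rules that no-code tools handle poorly, such as custom deduplication logic. The sketch below is a hypothetical Python example; the record fields and the "keep the most recently updated record" rule are assumptions for illustration.

```python
from collections import OrderedDict

def dedupe_customers(records):
    """Keep the most recently updated record per normalized e-mail address."""
    best = OrderedDict()
    for rec in records:
        key = rec["email"].strip().lower()  # custom normalization rule
        if key not in best or rec["updated"] > best[key]["updated"]:
            best[key] = rec
    return list(best.values())

customers = [
    {"email": "Jan@Example.com ", "name": "Jan",    "updated": "2024-01-01"},
    {"email": "jan@example.com",  "name": "Jan D.", "updated": "2024-06-01"},
]
print(dedupe_customers(customers))  # one record remains: the June version
```

A no-code tool would give you this kind of logic only if the vendor anticipated it; pro-code lets you write exactly the rule your business needs.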

What Are the Benefits of a Single Source of Truth?

  • Faster, more informed decision-making – Everyone works with the same reliable data.
  • Automated processes – Reducing errors and increasing efficiency.
  • Unlocking data value – Enabling predictive insights and strategic decision-making.
  • Building a scalable data solution – Growing with the organization.

By choosing the right architecture and technology, any organization can transition to an SSOT.

Conclusion: How to Make the Right Choice?

The ideal data architecture depends on:
🔹 Data volume – Are you dealing with gigabytes or petabytes?
🔹 Technical expertise – Do you have a team of data engineers or primarily business users?
🔹 Existing IT infrastructure – Are you working with Microsoft Azure, Google Cloud, or AWS?
🔹 Required flexibility – Do you need a quick solution or maximum control?

There is no one-size-fits-all solution, but asking the right questions will help align your data strategy with business goals.

Next Steps: What Can You Do Now?

  1. Take the Data Maturity Scan to determine your organization’s level of data maturity.
  2. Join the Data Automation Pitstop on May 15, 2025, to explore key aspects of data and automation strategy.
  3. Contact Alexander Mik to discuss a tailored solution for your organization.

Continue the conversation with Alexander

Data Maturity Scan

Data is everywhere, but is your organization making the most of it? Many companies invest in dashboards, BI tools, and AI but lack a structured approach to leveraging data for strategic decision-making. We believe that your solid data & automation ideas deserve to take flight. With the Data Maturity Scan, you’ll discover where your organization stands on the Data Maturity Ladder and receive a concrete action plan to take your data strategy to the next level.

Why would you do this scan?

The Data Maturity Scan is built on the DELTA Plus Model by Tom H. Davenport, a recognized framework that helps organizations assess and enhance their data maturity. By completing the scan, you will receive:

  • Insight into your data maturity – understand where your organization stands on the data maturity scale.
  • Targeted recommendations – concrete steps to strengthen data-driven decision-making.
  • Industry benchmarking – compare your data strategy against market standards.

What is the Data Maturity Ladder?

The Data Maturity Ladder helps organizations gauge their analytics maturity and growth potential. The five stages of data maturity are:

  1. Analytical Beginner – data is inconsistent, fragmented, and decisions are made based on intuition.
  2. Localized Analytics – some departments use data, but silos limit its full potential.
  3. Analytical Aspirations – centralized data storage and structured analytics processes are emerging.
  4. Analytical Companies – data and analytics are embedded in core processes and support strategic decision-making.
  5. Analytical Competitors – AI, machine learning, and automated decision-making are deployed at scale for a competitive advantage.

What does this Data Maturity Scan measure?

The scan evaluates your organization across eight critical pillars of data maturity:

  • Data Governance & Management – how reliable, accessible, and well-managed is your data?
  • Enterprise Data Strategy – to what extent is there a centralized, organization-wide approach to data and analytics?
  • Leadership & Culture – how engaged is leadership in fostering a data-driven culture?
  • Strategic Use of Analytics – is data being leveraged for strategic decision-making and business growth?
  • Analytics Skills & Talent – does your organization have the right expertise to extract value from data?
  • Technology Infrastructure – does your IT stack support advanced data analytics and AI?
  • Data Analysis Techniques – what analytical methods and models are currently being used?
  • Adoption & Integration – how well is analytics embedded into everyday processes and decision-making?

Why is data maturity important?

According to the International Institute for Analytics (IIA), the average organization has a data maturity score of 2.2 on a 5-point scale. This indicates that many companies are still in the early stages of fully utilizing their data. Understanding where your organization stands allows you to take the right steps toward a more strategic use of analytics.

What are the benefits of increasing data maturity?

  • Faster and more informed decision-making – data is actively used in both operational and strategic processes.
  • More efficient processes – reduce inefficiencies in data integration and manual analysis.
  • Stronger competitive position – advanced analytics provide deeper insights and a strategic advantage.
  • Better AI and Machine Learning implementation – well-managed data forms the foundation for successful AI adoption.

Take the scan and strengthen your data strategy

In just a few minutes, you’ll gain clarity on where your organization stands and what steps are needed to maximize the impact of data.

Want to discuss your Data Maturity score?

Send Jonathan a message!

You might also find this interesting!


Data Science training as the next step in data maturity - Greenchoice

Greenchoice has already taken considerable steps with an ambitious data strategy, a future-proof data architecture, and a rapidly growing number of end users in its data visualisation environment. To take the next step in data maturity, Greenchoice developed an in-company Data Science training in collaboration with Rockfeather. The training’s main objectives were to identify, develop, and implement Data Science use cases. Alex Janssen, Manager Development Consumer and Data & Analytics, explains what this training has brought.

Read more

Turning Raw Data into Delicious Insights

As a well-known maker of various appetizers, Signature Foods built its analyses on various decentralized data sources inside Excel. The data from these sources then had to be harmonized and visualized. Rockfeather helped Signature Foods implement Power BI in combination with Zebra BI to visualize information effectively, introduced the IBCS® standards, and automated workflows with Power Automate.

Read more

Looking back at the Future of Data & Analytics 2023

On November 14th, 2023, the Future of Data & Analytics took place in Sparta-Stadium 'Het Kasteel' in Rotterdam. Leading organizations told us their stories on business digitalization. This page will give you access to the slides of the different sessions.

During this edition, we focused on three main areas:

  • Learning how your organization can make the most of the latest trends in digital transformation and adoption of data and analytics.
  • Discovering how you, as a finance or data & analytics professional, can make fact-based, data-driven decisions.
  • Being inspired by several speakers who revealed their best practices for implementing a successful data & analytics strategy.

Speakers

Keynote Speaker: Artificial Intelligence in 60 Minutes by Job van den Berg

Developments around digitalization, data, and AI are moving at lightning speed. Anyone who wants to build future-proof companies will have to innovate. In his inspiring keynote, technology connoisseur and data expert Job van den Berg showed us how to make data and AI work for us. As co-author of the book “Artificial Intelligence in 60 Minutes” and based on his years of experience as a data expert for large organizations such as DPG Media, Bluefield and Kantar, he helped us pick the low-hanging fruit.

Session: Insights in the Activity Based Costing program at Kramp

Kramp, Europe’s largest specialist in agricultural parts and accessories, presented its Activity Based Costing program during this session. Charles Forsskahl, Strategic Program Manager at Kramp, explained the journey of shifting the focus from sales to profitability at the product and customer level, and how the insights were used to improve the supply chain and assortment.

Session: Insights in Trading & Forecasting using AI/ML at Greenchoice

Greenchoice is an energy company that focuses on green energy. They offer sustainable energy solutions and strive for a climate-neutral future. Maurice Koenen, Sourcing and Portfolio Director at Greenchoice, took us through his story on how data brings the green energy transition closer for both consumers and the business market. He shared the positive impact and potential of using an Energy Data Platform (EDP) on daily operations. Additionally, Ties (Rockfeather) discussed the latest AI/ML techniques applied to improve and further automate Trading & Forecasting processes within Greenchoice.

Keynote speaker: data strategy at Schiphol Group

Maarten van den Outenaar, Head of Data & Analytics at Schiphol Group, took us through how Schiphol Group is setting up its data strategy 2024 – 2026. He also talked about what has been achieved in recent years and gave some practical examples of how Schiphol Group uses Data & Analytics in practice and how they deal with resistance to adoption.

Interested in the slides of these sessions?

By filling in the form below, you will get instant access to the slides of keynote-speaker Job van den Berg, Kramp and Greenchoice!

The slides are yours!

  • Keynote Job van den Berg: Artificial Intelligence in 60 Minutes

    Download
  • Insights in the Activity Based Costing program at Kramp

    Download
  • Insights in Trading & Forecasting using AI/ML at Greenchoice

    Download

Deep Dive Power Apps and Outsystems

In this webinar our experts compare Power Apps and Outsystems for Low-Code on four aspects. In just over 10 minutes, you will learn the difference between the two solutions.  Don't miss out and watch this Deep Dive!

Language webinar: English.

Low-code is the modern approach to developing, deploying and managing applications. Develop a new application from scratch in a fraction of the time? Or create a faster, more intelligent version of an existing application? A Low-Code Application Platform makes it possible for all developers of different experience levels to easily and quickly create custom applications for web and mobile. But which platform is best for you?

To help you make better-informed decisions, our Power Apps and Outsystems experts developed a deep-dive webinar. In this webinar our experts compare Power Apps and Outsystems on four aspects. In just over 10 minutes, you will learn the difference between the solutions focusing on:

  • Data Connection
  • Front End Development
  • Integration of Business Processes
  • Use case for both solutions

Sign up below and view the webinar instantly!

Watch the webinar!

Looking back: Data & Analytics Line Up 2022

Want to know what's on offer in dashboarding or data integration solutions? Want to compare data science solutions? Or would you like to see Low-Code platforms in action? This and more was discussed at the Data & Analytics Line Up 2022!

During this event Steven Koppenol, Manager IT & Data Analytics at Visma | Raet told his story:
How does Visma create business value through the smart use of data analytics and automation? What choices has Visma Raet made in its landscape, what have they learned from this and how will this develop in the future?

Keynote speaker Bas Nijhuis, top Eredivisie soccer referee took us through his great love for soccer and what goes on in the life of a top referee. In addition, Bas made connections between the world of top sports and that of business. The use of technology and innovation in making choices is a nice parallel between the world of top sports and our theme for this day.

Furthermore, this day was all about comparing the best Data & Analytics solutions. We are happy to share with you the presentations that were given.

We will take you through solutions like Tray.io, TimeXtender, Alteryx, Azure Machine Learning, Auto ML (Pycaret), Microsoft Power BI, Tableau and SAP Analytics Cloud, Microsoft Power Apps & Automate and Outsystems via the form below.

Would you also like to see the best Data & Analytics solutions side by side?

The slides below are interesting for anyone who wants to innovate but is not yet sure which tools are needed to do so or wants to experience and compare the capabilities of other solutions.

Tell us which slides you want to see and download them instantly!

Download your slides now!

Got inspired by the slides, or want to be invited for the next Data & Analytics Line Up? Contact Frank via frank.boudestijn@rockfeather.com!

Become a better admin with the Power BI Admin Dashboard

• Monitor your Power BI tenant
• Get an overview of report updates
• Gain insights into report usage

What is the Power BI Admin dashboard?

The Power BI Admin dashboard is a monitoring tool for people overseeing Power BI users. The dashboard lets you monitor everything going on across all of your organisation’s workspaces at once, presented as an accessible overview in a Power BI report. Using the data collected from the Office 365/Azure portal, a Power BI admin can create a report from user statistics and log files.

What can it do?

  • Monitor your Power BI tenant across workspaces instead of focusing on a single workspace
  • Monitor which Data Sources are being used on Dashboards
  • Know whether you’re overpaying for your Power BI licenses
  • Get insights into by whom/when/how your reports are being used
  • Know when dataset refresh errors occur and see the distribution of refresh times across the tenant
  • Get insights into how much a report is being used
  • Create a quick overview of user access for each workspace
  • See how many reports, dashboards, and datasets have been made

How is it different from Power BI Report Usage?

While Power BI Report Usage works only at the workspace level, the Power BI Admin Dashboard gives you full control of data at both the workspace and the tenant level. Simply put, it gives you more insight into your organization. Additionally, the dashboard allows you to create clear visuals from table data, something you cannot do in Power BI Report Usage. Furthermore, it gives you full insight into refresh schedules and statuses, including the average duration of a refresh and a detailed report on why a refresh failed. Lastly, the Power BI Admin Dashboard lets you analyse activities from the moment you started using the dashboard, instead of only the last 30 days as in Power BI Report Usage.

What is a good use case of Power BI Admin Dashboard?

Every day at 9 in the morning you refresh a financial report and monitor its usage by members of your team throughout the day. By analyzing this, you notice two distinct peaks in activity: one at 10 in the morning and one at 9 in the evening. While the views at 10 in the morning are within normal working hours, the views at 9 in the evening certainly are not. Using the cross-filter functionality in the Power BI Admin Dashboard, you can check which team members are viewing the financial report at which time of day. From this view you also notice that certain team members have not been able to view the new data, and you can help them out by fixing the issue. Essentially, the Power BI Admin Dashboard gives you the insights your team needs to perform at the top of their game.
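
The kind of check described above can be sketched in a few lines. This is an illustrative Python sketch with made-up log rows; the user names, report name, and timestamps are assumptions, not the dashboard's actual implementation.

```python
from collections import Counter

# Hypothetical activity-log rows like those the Power BI Admin Dashboard
# cross-filters; all values below are invented for illustration.
activity_log = [
    {"user": "anna", "report": "Finance", "time": "2024-05-01T10:05:00Z"},
    {"user": "bram", "report": "Finance", "time": "2024-05-01T10:20:00Z"},
    {"user": "anna", "report": "Finance", "time": "2024-05-01T21:10:00Z"},
]

def views_per_hour(log, report):
    """Count report views per hour of day to spot unusual activity peaks."""
    # The hour sits at positions 11-12 of an ISO-8601 timestamp string
    return Counter(int(row["time"][11:13]) for row in log if row["report"] == report)

print(views_per_hour(activity_log, "Finance"))  # Counter({10: 2, 21: 1})
```

Grouping views by hour like this is what reveals the unexpected evening peak in the scenario above.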

Advantages of Power BI Admin:

  1. Can be managed on a tenant level
  2. Find clear connections between data to improve usage of the dashboards
  3. Shows historical data, helps to see trends
  4. See which users are using your reports and which users are not yet using them but do have access
  5. Create more insights from log files/user statistics

The tools that were used to create the dashboard:

  1. Graph API from Azure
  2. Power BI REST API
  3. Logic Apps in Azure
  4. Azure SQL Server
  5. Power BI
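
To give a feel for what the Power BI REST API piece of this stack provides: its refresh-history endpoint (`GET https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/refreshes`) returns records with a status and start/end times, from which statistics like the average refresh duration can be computed. The sketch below is illustrative Python; the sample payload is an assumption standing in for a real API response.

```python
from datetime import datetime

# Assumed sample of refresh-history records, shaped like the Power BI REST
# API's refresh-history response (status, startTime, endTime). Not real data.
sample_refreshes = [
    {"status": "Completed", "startTime": "2024-05-01T09:00:00Z", "endTime": "2024-05-01T09:04:00Z"},
    {"status": "Failed",    "startTime": "2024-05-02T09:00:00Z", "endTime": "2024-05-02T09:01:00Z"},
    {"status": "Completed", "startTime": "2024-05-03T09:00:00Z", "endTime": "2024-05-03T09:06:00Z"},
]

def avg_refresh_seconds(refreshes):
    """Average duration of successful refreshes, in seconds."""
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    durations = [
        (datetime.strptime(r["endTime"], fmt)
         - datetime.strptime(r["startTime"], fmt)).total_seconds()
        for r in refreshes
        if r["status"] == "Completed"
    ]
    return sum(durations) / len(durations) if durations else 0.0

print(avg_refresh_seconds(sample_refreshes))  # 300.0, i.e. five minutes
```

In the actual dashboard, this kind of aggregation happens inside Power BI after Logic Apps has landed the API responses in Azure SQL, but the computation is the same.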

Tools that we decided not to use:

While other tools could produce a similar dashboard, we chose to work with Logic Apps because:

  • Logic Apps is better suited for extracting small datasets of up to around 100 rows. In our case, we make one API call per Power BI dataset in the tenant, and each call returns such a small result. This makes Logic Apps more cost-efficient than tools like Azure Data Factory, which are designed for transferring larger volumes of data.
  • Logic Apps is a low code tool, meaning that it is both easier to read and use for citizen developers compared to writing code for other tools, like Azure Functions. Additionally, understanding the logic, flow of data, and debugging/troubleshooting is easier in Logic Apps.

You can achieve the same goal of extracting data with almost every ETL tool, however Rockfeather works with Azure Cloud because of its ease of use and native integration with Power BI.

Rockfeather & The Power BI Admin dashboard

While working with customers, we’ve noticed that Power BI admins were unhappy about the limitations in monitoring their Power BI tenant. The Power BI Admin dashboard gives them options to work around these limitations. First, we’ve been able to give Power BI admins the option of analyzing data over an extended period of time. Secondly, by showing non-essential licenses, we’ve helped Power BI admins cut down on costs. Lastly, by providing an overview of who could access reports they shouldn’t have access to, we could improve data security.

Within Rockfeather, we use Power BI as our BI tool and have several different reports published on our tenant. Using the capabilities of the Power BI Admin Dashboard, we can check whether our data was refreshed successfully, look at the activity on different reports, and check which of our dashboards are used most often. This way, we avoid the hassle of checking five different dashboards throughout the day, can spot which reports are most important for our team members, and can see how our dashboards are being used over a longer period of time.

An architectural overview of the Power BI Admin Dashboard can be found below:

[Diagram: the architectural structure of the Power BI Admin Dashboard]

Want to try our anonymized Power BI Admin Dashboard yourself and see which cross-filter insights we created? You can click around in our published report:

If you want to know more about the implementation of this, please reach out to us and we’ll help you with the implementation process. We use templates and/or solutions that are easy to deploy.

How to Establish a Single Source of Truth?

Many companies collect data but struggle with fragmentation, inconsistent reports, and inefficient processes. The result? Data is scattered everywhere, leading to decisions based on conflicting figures.

Read more

Webinar: How to Choose a Future-Proof Data Platform for Your Organization?

How do you ensure that your organization always has access to the right data—without wasting time on manual processes and scattered systems? In this webinar, you’ll learn how to build a scalable and reliable data platform that fits the current maturity level of your organization.

Read more

Data Maturity Scan

With the Data Maturity Scan, you’ll discover where your organization stands on the Data Maturity Ladder and receive a concrete action plan to elevate your data strategy.

Read more

The Future of Data & Analytics 2022

The Future of Data & Analytics 2022 took place on June 14. During this yearly event, passionate speakers inspired us by shedding a light on how they implemented a successful data & analytics strategy.

What were the key take aways from the event?

The three focus areas of this edition were:

  • Learning how your organisation can use the most recent trends in digital transformation and the adoption of data and analytics.
  • Discovering how you can maximize your marketing efforts through the power of fact-based, data-driven marketing.
  • Being inspired by the best practices of different speakers who shared how to successfully implement a data & analytics strategy.

Keynote speakers

Deborah Nas, keynote speaker, author and C-level sparring partner in the field of technology and innovation.

Deborah knows how to enthral her audience with an inspiring presentation filled to the brim with surprising and relatable examples. During her presentation, she places technological developments in both a societal and a business context. This gives the audience an understanding of what these developments mean in the here and now, but also in the future.

On top of that, Deborah is the author of the Tech Innovator’s Guide “Design Things That Make Sense”, a manual for innovators who want to maximize the chance of success for their innovations and implement technology in a useful way.

Tom Coronel, Speaker, Entrepreneur & FIA WTCR and Dakar Rally driver
Tom has many years of experience in international motorsport and has won races in multiple disciplines. Additionally, he runs multiple very successful webshops.

During the event, Tom shared his experiences with innovation and having to trust technology in motorsport. Moreover, he talked about the importance of working as a team in any discipline.

Organizations in practice:

Feyenoord – Edwin Suk, CIO – “A data-driven football organisation”

Edwin Suk, CIO of Feyenoord, talked us through how a professional football club uses data & insights to manage the club, both internally and as a football organisation.

Stedin – Sophie de Kok, Lead Digital BI lab – “How does Stedin use technology to manage a complex business?”

Stedin uses different technologies on both the financial side and non-financial side of the business in order to make managing Stedin more insightful. Sophie shared how the best solutions in the field of Data Visualisation and Low Code help Stedin’s different departments to work more closely together.

Visma – Tom van Dael, CFO – “The digital transformation strategy at Visma | Raet”

Tom van Dael, as the CFO of Visma | Raet, oversees how Visma puts their digital transformation into practice. During his talk, Tom talked about the different steps Visma | Raet had taken during the process and what the company achieved as a result of these steps.

CB – Arjan de Jong, Manager Logistics Services – “Predict the demand for production planning”

CB, as a large logistics business, needs to continuously predict the demand for books. The better CB can anticipate demand, the better it can match that demand from its storage facility. In his talk, Arjan spoke about the Data Science techniques that predict demand for CB and how they can increase their service level while lowering their costs.

Alexander Mik, Data Science Consultant Rockfeather, and Robbin van Wijk, EPM Consultant Finext – “The value of advanced forecasting models at Auping”

There have been many developments in the world of forecasting. For example, during the fourth Makridakis Competition in 2020, it was again shown that complex statistical models (machine learning and neural networks) perform better than more traditional forecasting methods. During their talk, Alexander and Robbin showed how advanced forecasting models went up against traditional forecasting methods at Auping.

Interested in more?

The presentations from Sophie de Kok from Stedin and Arjan de Jong from CB are available if you fill out the form below.

Are you interested in knowing more about the content of the presentations? Or are you inspired and want to know how you can use the latest trends on digital transformation and data analytics for your organization? Reach out to Paul Damen!

Take a look at the presentations!


  • Presentation of Sophie de Kok: Stedin @ The Future of Data & Analytics 2022

    Download
  • Presentation of Arjan de Jong - CB @ The Future of Data & Analytics 2022

    Download

Data Science Summer School

Rockfeather organizes the Data Science Summer School. On three consecutive Fridays (August 11, 18, and 25), you will get to know various Data Science methods and techniques and learn how to apply them.

Want to know more?

Online Workshop Tableau

In this free workshop, we teach you in ninety minutes how to load and visualize data with Tableau.
Fill in your details at the bottom of this page and watch the workshop right away!

Get started with Tableau in our free online workshop. In one afternoon, learn how Tableau can help you optimize processes with valuable analyses based on your own data.

In this free workshop, you will learn how to load and visualize your data with Tableau. The interactive dashboards help you make better data-driven decisions and communicate effectively within your organization. The workshop is intended for new Tableau users, but even if you have worked with Tableau before, it can be useful to take part.

The workshop focuses on the following two questions:

  • How can you easily combine, visualize, and communicate multiple data sources?
  • How do you quickly get valuable insights from this data using Tableau?

Fill in your details below and watch the online Tableau workshop right away!

Watch the workshop!


Online Workshop Power BI

In this online workshop, you will learn about several key features of the Power BI service. This introductory course is designed to teach you step-by-step how to create Power BI Desktop reports. Fill in your details to view this session!

Language workshop: Dutch.

Power BI has become a truly indispensable tool. Countless companies use this visualization solution from the Microsoft stack. Take this great opportunity to get to know Power BI in our free online session. In just a few hours, you will learn hands-on what this solution can do for you. You will have room to learn from experts and to get started right away.

During the workshop, we introduce you to the key features of the Power BI service. The Test Drive is an introductory course designed to teach you step by step how to create Power BI Desktop reports. The goal is that afterwards you have gained enough knowledge and possess the following skills:

  • Loading data from Microsoft Excel
  • Manipulating data to prepare it for reporting
  • Creating a report with various visuals

Watch the workshop!