Data governance cannot be decreed, deployed, or bought off the shelf. There is no one-size-fits-all solution.
Every organization must adopt and develop a personalized data governance strategy to succeed in its digital transformation.
DataGalaxy’s Data Knowledge Catalog platform equips your organization with the right toolset
to enable successful digital transformation through a state-of-the-art data governance plan
that works best for you.
Let’s discover your starting point and take your data maturity to the next level!
With such expertise in collaboration and acculturation, how could you miss the data documentation aspect?
You’re starting from ground level. No problem, we can work on it together!
Collaboration works quite well in your organization. However, from a data documentation perspective, not everything is clear. Don’t worry, your position is quite encouraging and will allow you to move forward quickly.
The only indicator available is the number of objects found. This indicator will probably follow a stepwise curve for technical data, with each scan of a new source bringing a considerable number of objects. For business data, the curve will look more like a logarithmic curve with a slow but constant progression.
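As an illustration, the cumulative count behind this stepwise curve could be tracked with a few lines of Python (all source and object names below are hypothetical):

```python
# A minimal sketch of tracking the "number of objects found" across scans.
scans = [
    ("crm_db", ["customers", "orders", "invoices"]),
    ("erp_db", ["products", "suppliers"]),
    ("crm_db", ["customers", "orders", "invoices", "refunds"]),  # re-scan finds 1 new object
]

seen = set()
history = []  # cumulative object count after each scan -> the "stepwise curve"
for source, objects in scans:
    seen.update((source, name) for name in objects)
    history.append(len(seen))

print(history)  # prints [3, 5, 6]
```

Each scan of a new source adds a large step, while a re-scan of a known source only adds the delta, which is exactly the shape described above.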
Since you have identified and defined a lot of data, it’s important to classify it by domain. This will allow you to assign owners and see who could take care of certain properties. Implementing a matrix organization will speed up the overall data deployment, but be careful to coordinate actions with your colleagues.
With your ability to collaborate and share knowledge, it’s surprising that you’re not further along in your data asset maturity. Have you taken into account the specifics of managing and sharing data knowledge?
Attribute completion: This indicator reflects the completeness of the information. The more attributes you have filled in, the richer your data repository is.
Number of objects per attribute: For monovalent attributes, this indicator will allow you to understand the distribution of objects by value. You can, for example, distribute the maintenance load fairly among the different stewards.
You may also be interested in the number of your domains, which allow you to organize your objects transversely and understand the level of organization of your assets.
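A minimal sketch of how the first two indicators could be computed over a handful of hypothetical catalog entries (all names and attributes are invented for illustration):

```python
from collections import Counter

# Hypothetical catalog entries: attribute -> value, with None meaning "not filled in".
objects = [
    {"name": "customers", "owner": "alice", "domain": "sales", "description": None},
    {"name": "orders",    "owner": "bob",   "domain": "sales", "description": "Order lines"},
    {"name": "products",  "owner": None,    "domain": None,    "description": None},
]

# Attribute completion: share of attribute slots that are actually filled in.
filled = sum(1 for obj in objects for value in obj.values() if value is not None)
total = sum(len(obj) for obj in objects)
completion = filled / total  # 8 of 12 slots -> ~67%

# Number of objects per value of a monovalent attribute (here: "owner"),
# e.g. to balance the maintenance load among stewards.
per_owner = Counter(obj["owner"] for obj in objects)

print(f"completion: {completion:.0%}")  # prints completion: 67%
```

The richer the repository, the closer the completion ratio gets to 100%, while the per-owner counts reveal stewards who are over- or under-loaded.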
Conduct a mini audit of your data knowledge management tool. It’s important to choose a tool that is efficient for the retrieval of technical data, that allows for the enrichment of information and, above all, that is specific to data knowledge management. This can include a scalable data catalog or governance platform that allows you to reach your maturity objective without having to change it in the coming months.
With the support of your entire company, you are well on your way to achieving data governance 2.0. What remains to be done is to implement the data life cycles to enable a virtuous – and more fluid – circle of continuous updating.
The “percentage of rules followed,” i.e. the rules for which you have a measurement at least monthly, allows you to validate the application of these rules and the sound management of your data. This indicator should be complemented with an alert system and an action plan for unmet objectives.
The distribution of objects by role takes on another dimension: it allows you to gauge the capacity of people to fulfill their role within the data community.
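These two indicators can be sketched in a few lines; the rule names, role assignments, and the 90% target below are illustrative assumptions, not a prescribed implementation:

```python
from collections import Counter

# Hypothetical monthly measurements: rule name -> True if the rule was followed.
measurements = {
    "pii_fields_masked": True,
    "tables_have_owner": True,
    "terms_reviewed_yearly": False,
}

pct_rules_followed = sum(measurements.values()) / len(measurements)

# Simple alert hook for unmet objectives (the 90% target is an assumption).
unmet = [rule for rule, ok in measurements.items() if not ok]
if pct_rules_followed < 0.9:
    print(f"ALERT: only {pct_rules_followed:.0%} of rules followed; plan actions for {unmet}")

# Distribution of objects by role, to see how the data community shares the work.
assignments = [("customers", "owner"), ("orders", "steward"), ("invoices", "steward")]
objects_per_role = Counter(role for _, role in assignments)
print(dict(objects_per_role))  # prints {'owner': 1, 'steward': 2}
```

In practice the alert would feed a ticketing or notification system rather than a `print`, but the logic is the same: measure, compare to the objective, and trigger the action plan.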
Lean on your main asset: Having federated and acculturated teams.
Prepare a few presentations on lifecycle management: How to plan and implement new data, how to retire data that no longer makes sense, etc.
You are in great shape! You are likely well informed about domain-driven architecture and, with fully data-driven business teams, you have all the assets to accomplish your data strategy. But be careful not to fall asleep: as you know, data maturity is more a state of mind than an end point.
The “Delays in Status (not validated)” indicator gives you information about the speed of your lifecycle and the “Inactivity on objects” indicator lets you know which data is dormant. For validated data, these indicators also reflect the stability of your definitions. They also help you identify the life cycle of your items and make it a rule for managing your data catalog.
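A rough sketch of both indicators, assuming each item carries a status and a last-activity date (all names, dates, and the 180-day threshold are hypothetical):

```python
from datetime import date, timedelta

today = date(2024, 6, 1)  # fixed date so the example is reproducible

# Hypothetical catalog items with a status and a last-activity date.
items = [
    {"name": "customer",   "status": "validated",     "last_activity": date(2024, 5, 20)},
    {"name": "old_report", "status": "validated",     "last_activity": date(2023, 1, 10)},
    {"name": "churn_rate", "status": "not validated", "last_activity": date(2024, 2, 1)},
]

STALE_AFTER = timedelta(days=180)  # assumed dormancy threshold

# "Inactivity on objects": validated items untouched for too long are dormant.
dormant = [i["name"] for i in items
           if i["status"] == "validated" and today - i["last_activity"] > STALE_AFTER]

# "Delays in Status (not validated)": how long non-validated items have been waiting.
delays = {i["name"]: (today - i["last_activity"]).days
          for i in items if i["status"] == "not validated"}

print(dormant, delays)  # prints ['old_report'] {'churn_rate': 121}
```

Dormant validated items are candidates for a formal review, while long delays on non-validated items point to a bottleneck in the validation lifecycle.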
Stay informed about the latest developments in metadata management and share your experience and knowledge with your peers. Get involved in expert communities online to share your knowledge and increase your organization’s overall data literacy.
Perhaps you have real communication skills but your heritage knowledge is still completely siloed. You probably spend a lot of time exchanging dictionaries in Excel files, definition sheets in Word, or sharing links on some wiki scattered around.
You need to capitalize on your main strength: Collaboration. Identify a project team whose mission will be to define your first use case and list all the data related to it.
This project will be the first brick in the foundation, so it must be a success. Remember to involve the business users to launch a global dynamic and choose a tool that allows you to centralize the information.
Collaboration works quite well in your organization. However, from a data documentation perspective, not everything is clear. Don’t worry, your position is quite encouraging and will allow you to move forward quickly.
The only indicator available is the number of objects found.
This indicator will probably follow a stepwise curve for technical data, with each scan of a new source bringing a considerable number of objects. For business data, the curve will look more like a logarithmic curve with a slow but constant progression.
Since you have identified and defined a lot of data, it’s important to classify it by domain so that you can assign owners. You can also see who could take care of certain properties in a cross-functional way (e.g. on personal data). Implementing such a matrix organization will help speed up overall project deployment!
With your ability to collaborate and share knowledge, it’s surprising that you’re not further along in your data asset maturity. Have you taken into account the specifics of managing and sharing data knowledge?
Attribute completion: This indicator reflects the completeness of the information. The more attributes you have filled in, the richer your data repository is.
Number of objects per attribute: For monovalent attributes, this indicator will allow you to understand the distribution of objects by value. You can, for example, distribute the maintenance load fairly among the different stewards.
You may also be interested in the number of domains, which allow you to organize your objects transversely and understand the level of organization of your assets.
Conduct a mini audit of your data knowledge management tool. If it is a wiki, for example, there is no doubt that it represents a real obstacle in your path towards governance. Choose a tool that is efficient for the retrieval of technical data, that allows for the enrichment of information and, above all, that is specific to data knowledge management. This tool should allow you to reach your maturity objective without having to change it in the coming months.
Congratulations, you’re well on your way to data governance, but now you’ll need to consider the lifecycle of your metadata: how to make it evolve in response to business or regulatory changes as well as technical evolutions. And of course, continue to train and acculturate all employees to the proper use of data.
The “Percentage of rules followed (measured),” i.e. the rules for which you have a measurement at least monthly, allows you to validate the application of these rules and the sound management of your data. This indicator will of course be complemented with an alert system and an action plan for unmet objectives.
The distribution of objects by role takes on another dimension, as it allows you to gauge the capacity of people to fulfill their role within the data community.
To reach the last step, apply the same good practices as before: Work in iterations, break down by business domain, adopt a data mesh approach to scale your data maturity, and continue to involve all teams. Popularization and communication are essential to explain the complex concepts of the data life cycle.
You are clearly positioned among the champions of data governance. However, you still need to put some oil in the wheels to make things really fluid. Fear not! Collaborative data governance is within reach!
The “Delays in Status (not validated)” indicator gives you information about the speed of your lifecycle and the “Inactivity on objects” indicator lets you know which data is dormant. For validated data, these indicators also reflect the stability of your definitions.
It will also be important to identify the life cycle of your items and make it a rule for managing your Data Knowledge Catalog. It is interesting to formalize the revision of your business terms once a year, for example.
Finally, a qualitative indicator would be the adequacy of the metamodel with your sector(s) of activity: What are the indispensable attributes that you are missing or the superfluous ones that you maintain? Have you implemented the rules specific to your business? Have you integrated your Data Catalog with your Quality Control tool?
There may be a risk that your acculturation strategy will run out of steam. Have you thought about implementing conferences or training on the added value of collaboration around data? In a large organization, also think about the notion of decentralization to scale. More and more experts are available to talk about data mesh every day!
Perhaps you have real communication skills but your heritage knowledge is still completely siloed. You probably spend a lot of time exchanging dictionaries in Excel files, definition sheets in Word, or sharing links on some wiki scattered around.
You need to capitalize on your main strength: Collaboration. Identify a project team whose mission will be to define your first use case and list all the data related to it.
This project will be the first brick in the foundation, so it must be a success. Remember to involve the business users to launch a global dynamic and choose a tool that allows you to centralize the information.
Collaboration works quite well in your organization. However, from a data documentation perspective, not everything is clear. Don’t worry, your position is quite encouraging and will allow you to move forward quickly.
The only indicator available is the number of objects found. This indicator will probably follow a stepwise curve for technical data, with each scan of a new source bringing a considerable number of objects. For business data, the curve will look more like a logarithmic curve with a slow but constant progression.
Your data repository is accessible to IT and business users, congratulations! Reaching this stage often requires a lot of work. Nevertheless, you are now likely to start having quality issues in your data and questions about how to move to the next step.
Attribute completion: This indicator reflects the completeness of the information. The more attributes you have filled in, the richer your data repository is.
Number of objects per attribute: for monovalent attributes, this indicator will allow you to understand the distribution of objects by value. You can, for example, distribute the maintenance load fairly among the different stewards.
It is time for your team to take an interest in rules management, which allows you to define policies for managing data and metadata. This also covers the notion of interface contracts for the data flows in place in your system.
Individually, some members of your data team are undeniably experts at explaining your data assets. But as soon as they have to leave, problem resolution times skyrocket, and even when they are present, you better hope they are available to respond quickly. It is urgent to change things so that these experts are no longer a bottleneck.
The “Percentage of rules followed (measured)” indicator, i.e. the rules for which you have a measurement at least monthly, allows you to validate the application of these rules and the sound management of your data. This indicator will of course be complemented with an alert system and an action plan for unmet objectives.
The distribution of objects by role takes on another dimension, as it allows you to gauge the capacity of people to fulfill their role within the data community.
Agree to postpone unimportant operational actions so that your experts can transcribe their knowledge into a centralized repository that is accessible to as many people as possible. This simple action, in addition to making exchanges more fluid, will also free your experts to keep their knowledge up to date, and you will gain productivity across the board.
You have succeeded in mapping most of your company’s data and have a real knowledge repository, complete and regularly updated. But what value do you get from it from a business point of view? Probably very little and the team in charge of this repository is probably perceived as being in its ivory tower.
The “Delays in Status (not validated)” indicator gives you information about the speed of your lifecycle and the “Inactivity on objects” indicator lets you know which data is dormant. For validated data, these indicators also reflect the stability of your definitions.
It will also be important to identify the life cycle of your items and make it a rule for managing your Data Knowledge Catalog. It is interesting to formalize the revision of your business terms once a year, for example.
What are the indispensable attributes that you are missing or the superfluous ones that you maintain? Have you implemented the rules specific to your business? Have you integrated your Data Knowledge Catalog with your quality control tool?
It’s all about building your data culture. Since you have a rich and coherent repository, start by ensuring that it is accessible and can be segmented to facilitate its understanding. Do you have the right tool for this?
Make presentations that show the value of this knowledge when used by business users (improved decision quality, fewer misunderstandings, and faster information retrieval).
Perhaps you have real communication skills but your heritage knowledge is still completely siloed. You probably spend a lot of time exchanging dictionaries in Excel files, definition sheets in Word, or sharing links on some wiki scattered around.
You’re starting from ground level. No problem, we can work on it together.
You have a list of your data but still need to enrich it while trying to get value from what you are creating. This is most likely the time to look at how to open up your data more widely to the business.
The only indicator available is the number of objects found. This indicator will probably follow a stepwise curve for technical data, with each scan of a new source bringing a considerable number of objects. For business data, the curve will look more like a logarithmic curve with a slow but constant progression.
Your goal in a word: Scalability! Having identified and located your data, it is now essential to get everyone involved in producing good definitions of your data. As is often the case, an iterative approach is preferable: it lets you capitalize on the enthusiasm generated by the value of the repository you are building and avoid getting bogged down in a multitude of perimeters.
Individually, some members of your data team are undeniably experts at explaining your data assets. But as soon as they have to leave, problem resolution times skyrocket, and even when they are present, you better hope they are available to respond quickly. It is urgent to change things so that these experts are no longer a bottleneck.
Attribute completion: This indicator reflects the completeness of the information. The more attributes you have filled in, the richer your data repository is.
Number of objects per attribute: For monovalent attributes, this indicator will allow you to understand the distribution of objects by value. You can, for example, distribute the maintenance load fairly among the different stewards.
You may also be interested in the number of domains, which allow you to organize your objects transversely and understand the level of organization of your assets.
Agree to postpone unimportant operational actions so that your experts can transcribe their knowledge into a centralized repository that is accessible to as many people as possible. This simple action, in addition to making exchanges more fluid, will also free your experts to keep their knowledge up to date, and you will gain productivity across the board.
Individually, some members of your data team are undeniably experts at explaining your data assets. But as soon as they have to leave, problem resolution times skyrocket, and even when they are present, you better hope they are available to respond quickly. It is urgent to change things so that these experts are no longer a bottleneck.
The “Delays in Status (not validated)” indicator gives you information about the speed of your lifecycle and the “Inactivity on objects” indicator lets you know which data is dormant. For validated data, these indicators also reflect the stability of your definitions.
It will also be important to identify the life cycle of your items and make it a rule for managing your Data Catalog. It is interesting to formalize the revision of your business terms once a year, for example.
Finally, a qualitative indicator would be the adequacy of the metamodel with your sector(s) of activity: what are the indispensable attributes that you are missing or the superfluous ones that you maintain? Have you implemented the rules specific to your business? Have you integrated your Data Catalog with your Quality Control tool?
It’s all about building your data culture. Since you have a rich and coherent repository, start by ensuring that it is accessible and can be segmented to facilitate its understanding. Do you have the right tool for this?
Make presentations that show the value of this knowledge when used by business users (improved decision quality, fewer misunderstandings, and faster information retrieval).
You have succeeded in mapping most of your company’s data and have a real knowledge repository, complete and regularly updated. But what value do you get from it from a business point of view? Probably very little and the team in charge of this repository is probably perceived as being in its ivory tower.
The “Percentage of rules followed (measured)” – i.e. the rules for which you have a measurement at least monthly – allows you to validate the application of these rules and the sound management of your data. This indicator will of course be complemented with an alert system and an action plan for unmet objectives.
The distribution of objects by role takes on another dimension, since it allows you to gauge the capacity of people to fulfill their role within the data community.
Agree to postpone unimportant operational actions so that your experts can transcribe their knowledge into a centralized repository that is accessible to as many people as possible. This simple action, in addition to making exchanges more fluid, will also free your experts to keep their knowledge up to date, and you will gain productivity across the board.
Understandably, it’s disappointing to be at the very beginning of the data maturity path. The good news is that you are aware of your starting point! With no documentation for your data, and no real organization in this project, you have the advantage of starting with a blank canvas.
None – just worry about getting to level 2 as soon as possible.
Identify someone with data domain knowledge and the desire to share, then give them time to list the data and use cases (name and technical location) and model them (the links between the data). It doesn’t sound like much, but it gives you a first iteration to show others how simple the process is, and you will progress quickly on both axes of maturity.
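That first inventory can literally start as a flat list. Here is a hypothetical sketch (every name, location, and use case is invented for illustration):

```python
# A first-iteration data inventory: name, technical location, related use case.
inventory = [
    {"name": "customer", "location": "crm_db.public.customers", "use_case": "churn analysis"},
    {"name": "order",    "location": "erp_db.sales.orders",     "use_case": "churn analysis"},
]

# "Model them": the links between data items, as simple pairs for a start.
links = [("customer", "order")]  # a customer places orders

for item in inventory:
    print(f"{item['name']:<8} -> {item['location']}")
```

Even a list this small can be shared, challenged, and enriched by others, which is exactly the first-iteration effect described above.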
Start sharing your strategy and early results to build traction and move into the inventory phase.
There is no doubt about your ability to document your data or your level of knowledge about it. However, what is the value of this documentation if it is not accessible and understandable to others? You must put collaboration and sharing back at the center of your concerns.
The only indicator available is the number of objects found.
This indicator will probably follow a stepwise curve for technical data, with each scan of a new source bringing a considerable number of objects. For business data, the curve will look more like a logarithmic curve with a slow but constant progression.
If there is one problem that can be easily solved, it is knowledge sharing. All of your documentation could probably be made available to everyone with a few clicks. Ease of exploration (or search) and clarity of information are potentially more complicated to achieve. We recommend implementing a dedicated tool such as a Data Catalog to allow everyone to better exchange knowledge about the data.
There is no doubt about your ability to document your data or your level of knowledge about it. However, what is the value of this documentation if it is not accessible and understandable to others? You must put collaboration and sharing back at the center of your concerns.
Attribute completion: This indicator reflects the completeness of the information. The more attributes you have filled in, the richer your data repository is.
Number of objects per attribute: for monovalent attributes, this indicator will allow you to understand the distribution of objects by value. You can, for example, distribute the maintenance load fairly among the different stewards.
You may also be interested in the number of domains, which allow you to organize your objects transversely and understand the level of organization of your assets.
If there is one problem that can be easily solved, it is knowledge sharing. All of your documentation could probably be made available to everyone with a few clicks. Ease of exploration (or search) and clarity of information are potentially more complicated to achieve. We recommend implementing a dedicated tool such as a Data Catalog to allow everyone to better exchange knowledge about the data.
There is no doubt about your ability to document your data or your level of knowledge about it. However, what is the value of this documentation if it is not accessible and understandable to others? You must put collaboration and sharing back at the center of your concerns.
The “Percentage of rules followed (measured)” – i.e. the rules for which you have a measurement at least monthly – allows you to validate the application of these rules and the sound management of your data. This indicator will of course be complemented with an alert system and an action plan for unmet objectives.
The distribution of objects by role takes on another dimension, since it allows you to gauge the capacity of people to fulfill their role within the data community.
If there is one problem that can be easily solved, it is knowledge sharing. All of your documentation could probably be made available to everyone with a few clicks. Ease of exploration (or search) and clarity of information are potentially more complicated to achieve. We recommend implementing a dedicated tool such as a Data Catalog to allow everyone to better exchange knowledge about the data.
Especially if you are in a large company, this position is surprising to say the least! How can you control your data assets so well without a minimum of sharing and collaboration? In short, you must be pulling our leg, and we’re not falling for it!
The “Delays in Status (not validated)” indicator gives you information about the speed of your lifecycle and the “Inactivity on objects” indicator lets you know which data is dormant. For validated data, these indicators also reflect the stability of your definitions.
It will also be important to identify the life cycle of your items and make it a rule for managing your Data Catalog. It is interesting to formalize the revision of your business terms once a year, for example.
Finally, a qualitative indicator would be the adequacy of the metamodel with your sector(s) of activity: what are the indispensable attributes that you are missing or the superfluous ones that you maintain? Have you implemented the rules specific to your business? Have you integrated your Data Catalog with your Quality Control tool?
Feel free to check another box to see if we can help you advance your data maturity!
Not only does data governance improve overall data quality, it also has a significant impact on your company’s overall competitiveness and decision-making capabilities. Data governance allows users to better manage current challenges and gives access to new business opportunities that would not be possible with low-quality data.
Successful data governance programs are attainable through a well-executed, actionable plan: Organizations can overcome the challenges that change inevitably presents by starting small and gradually expanding the program.