Article originally posted on Utility Analytics Institute.
What we see with our clients is that they want to gain more business value from their data. However, the initial direction they take may differ: some clients focus on technologies, while others focus on building their data model. Regardless of where you start, you will need to think about your data architecture.
The data a utility generates and needs access to will only continue to grow as new opportunities arise and we face complex operational and environmental challenges. Meeting those challenges will require not only the data you generate but also data from external sources. At Xtensible, we continue to work with utilities to drive data-based, leading-edge analytics and operations management. For example, combining internal network data with external data sources to generate wildfire analytics, whether to determine scenarios in which to de-energize lines or to prevent asset or customer failures, requires planning and foresight. The amount of data, and how it is generated and gathered, has changed data architecture approaches: we must manage, integrate, govern, secure, and provide access to data across multiple domains, technologies, and deployment methods.
It appears, though, that organizations are building more data silos today, not fewer, and with the growth of cloud computing, the problems are becoming bigger than ever.
Data Mesh and Data Fabric
In the data governance and management domain, data fabric and data mesh are two of the "buzzword" architecture concepts that are frequently mentioned. It is important to realize that your data architecture needs to take into consideration your current and future needs, so be flexible. Your data architecture directly affects enterprise analytics through how data is managed and how it is served. A well-defined and well-executed data architecture allows quicker time to market for the high-quality analytics that are needed.
What is a Data Fabric Architecture?
A data fabric takes a metadata-driven approach to connecting data across multiple sources, whether the data is located on premises, in the cloud, or at the edge of a network. It is a unified management approach and overlay on top of various data repositories. Essentially, data fabrics are woven from application programming interfaces, common data-interchange formats, and management policies that address specific types of data. Some may consider this a top-down approach. However, data owners, stewards, designers, analysts, and data scientists can have self-serve capabilities to accomplish their job functions. Systems can participate by simply sharing their metadata. The insights from the data fabric come from continuous learning, evaluation, and expansion of the metadata. Active metadata management is therefore essential.
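The idea that "systems can participate by simply sharing their metadata" can be illustrated with a minimal sketch. Everything here is hypothetical (the `MetadataCatalog` class, system names, and tags are illustrative, not part of any real product): source systems register descriptive metadata in one catalog, and consumers search that single layer without touching the underlying stores.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    system: str            # source system that owns the data
    name: str              # dataset name within that system
    location: str          # e.g., "on-prem", "cloud", "edge"
    schema: dict           # field name -> type name
    tags: set = field(default_factory=set)

class MetadataCatalog:
    """Toy metadata overlay: systems register metadata; consumers search it."""
    def __init__(self):
        self._entries = []

    def register(self, md: DatasetMetadata):
        # A system "participates in the fabric" just by sharing metadata.
        self._entries.append(md)

    def find(self, tag: str):
        # Self-serve discovery across all participating systems.
        return [m for m in self._entries if tag in m.tags]

catalog = MetadataCatalog()
catalog.register(DatasetMetadata("SCADA", "line_loads", "on-prem",
                                 {"line_id": "str", "load_mw": "float"},
                                 {"network", "operations"}))
catalog.register(DatasetMetadata("WeatherAPI", "wind_forecast", "cloud",
                                 {"region": "str", "gust_kph": "float"},
                                 {"weather", "wildfire"}))

hits = catalog.find("wildfire")  # finds the external weather dataset
```

A real fabric adds lineage, policy enforcement, and active metadata collection on top of this registration-and-discovery core, but the participation model is the same.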
What is a Data Mesh Architecture?
Typically, the data mesh takes the source system as the authority, which results in the data taking on a more use-case-specific process and context. A data fabric seeks to build a single governance layer atop distributed data. In contrast, the data mesh encourages distributed teams to manage data as they see fit, with some common governance and semantic data definitions that can be shared. Some would consider this more of a bottom-up approach. Much of the information for the data mesh comes from Subject Matter Experts (SMEs) with valuable operational experience.
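The mesh's "teams manage data as they see fit, with some common governance" can be sketched the same way. This is a hypothetical illustration (the `DataProduct` class and contract fields are invented for the example): each domain team owns and publishes its own data product, and the only centralized piece is a shared contract every product must satisfy.

```python
# Shared governance: one organization-wide contract (field -> expected type).
SHARED_CONTRACT = {"asset_id": str, "timestamp": str, "value": float}

def conforms(record: dict, contract=SHARED_CONTRACT) -> bool:
    """Check that a record matches the shared contract exactly."""
    return (set(record) == set(contract)
            and all(isinstance(record[k], t) for k, t in contract.items()))

class DataProduct:
    """A domain team's independently owned, bottom-up data product."""
    def __init__(self, domain: str):
        self.domain = domain
        self.records = []

    def publish(self, record: dict):
        # The team controls storage and process; governance is just the contract.
        if not conforms(record):
            raise ValueError(f"{self.domain}: record violates shared contract")
        self.records.append(record)

grid_ops = DataProduct("grid-operations")
grid_ops.publish({"asset_id": "T-42",
                  "timestamp": "2024-01-01T00:00:00Z",
                  "value": 0.97})
```

The design point is where authority sits: in a fabric the overlay is the integration point, while in a mesh each source system remains authoritative and only the contract (and shared semantics) is common.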
Need for Semantic Modeling
Semantic modeling is used to depict the relationships that exist among specific data and establishes a common definition of the data. A recent Gartner article, "Leverage Semantics to Drive Business Value From Data," advised data and analytics leaders to take a semantic approach to their enterprise data. If you don't, you will be faced with the endless challenge of data silos.
Whether your approach is based on the concepts of data fabric, data mesh, or a combination of both, semantic modeling is important because it speaks to data and analytics governance. In the end, data must be understood and trusted. Semantic modeling allows you to exploit linked data, and it leverages a common definition that everyone across the organization agrees upon. It is wise to involve SMEs in its creation. One of our clients recently posted a two-part article on the Importance of Data Modeling in Analytics and Beyond. The article examines the value of taking a semantic approach that not only supported the initial use cases related to DER, but has continued to expand to include Asset Management, Work Management, and Network Connectivity, reducing integration cost and adding value as the business evolves.
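To make the "common definition" concrete, here is a hedged sketch of semantic mapping: two source systems describe the same transformer with different field names, and both are mapped onto one shared vocabulary. The common model is loosely inspired by CIM-style naming (`mRID`), but the field names, source systems, and mappings are all invented for illustration.

```python
# One shared semantic definition everyone agrees on (illustrative only).
COMMON_MODEL = {
    "mRID": "unique equipment identifier",
    "ratedVoltage_kV": "rated voltage in kilovolts",
}

# Each source system keeps its own names; mappings translate them.
SOURCE_MAPPINGS = {
    "gis":      {"equip_id": "mRID", "kv_rating": "ratedVoltage_kV"},
    "asset_db": {"ASSET_NO": "mRID", "VOLT_KV": "ratedVoltage_kV"},
}

def to_common(source: str, record: dict) -> dict:
    """Rename a source-specific record into the common semantic model."""
    mapping = SOURCE_MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = to_common("gis",      {"equip_id": "TX-1", "kv_rating": 138.0})
b = to_common("asset_db", {"ASSET_NO": "TX-1", "VOLT_KV": 138.0})
# Both records now share one vocabulary and can be linked on mRID,
# regardless of which silo they came from.
```

This is the mechanism that lets a semantic approach start with one use case and expand: each new source system adds a mapping to the common model rather than a point-to-point integration with every consumer.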
The Xtensible Enterprise Semantic Modeling and Metadata Management Approach
For over two decades, the Xtensible team has been working with and supporting utilities to take a semantic-based approach to managing their data, utilizing the IEC CIM standard and other reference models.
“Based on our experience, the IEC CIM simplifies and supports utilities when capturing new data, for more analytics and intelligent automation, and for outcomes benefiting all stakeholders.”
With Affirma, our Enterprise Semantic and Metadata Management Solution, we take our proven semantic modeling approach and use the IEC CIM as a reference model to help build your enterprise semantic model for analytics. Affirma offers a unique capability for utilities to not only establish a common definition of data, for reusability and delivery of data to support data-in-motion and data-at-rest, but also to bring together disparate technologies for data mapping, lineage, integration, profiling, and more within a single solution supporting your chosen data architecture.
Interested in learning more about data architecture, semantic modeling and how to navigate change? Speak to a member of the Xtensible team.