"*" indicates required fields
Step 1 of 19
Welcome to Utilligent’s Data & Analytics capability self-assessment. You will be asked to consider the maturity levels at your organization for the 16 data and analytics capabilities highlighted on the outside of the wheel on the left. For each capability, we will provide you with some background information, and you will be asked to indicate your level of agreement with statements describing it. The survey will take you about 30-40 minutes to complete; please provide additional comments where you can.
Once completed, we will carefully review all responses and compile a consolidated report including perspectives of where your organization is on the data maturity journey.
We'll use this for future communications and to send your customized results back to you.
An enterprise data strategy is a business plan for leveraging an enterprise’s data assets to maximum advantage; it sets the direction and goals and defines an approach to solving the key data challenges facing the enterprise.
A data strategy should include business plans to use information to competitive advantage and support enterprise goals. Data strategy must come from an understanding of the data needs inherent in the business strategy: what data the organization needs, how it will get the data, how it will manage it and ensure its reliability over time, and how it will utilize it. Typically, a data strategy requires a supporting Data Management program strategy – a plan for maintaining and improving the quality of data, data integrity, access, and security while mitigating known and implied risks. The strategy must also address known challenges related to data management.
In many organizations, the data management strategy is owned and maintained by the Chief Data Officer (CDO) and enacted through a data governance team, supported by a Data Governance Council. Often, the CDO will draft an initial data strategy and data management strategy even before a Data Governance Council is formed, to gain senior management’s commitment to establishing data stewardship and governance.
Many organizations recognize that their data is a vital enterprise asset. Despite that recognition, few organizations actively manage data as an asset from which they can derive ongoing value. Deriving value from data does not happen in a vacuum or by accident. It requires intention, planning, coordination, and commitment. It requires management and leadership.
Data management (DM) principles are usually formulated during data strategy development as part of a DM framework and are the highest order propositions that serve as the foundation for DM.
Typically, the first principle within the DM framework will state something like data is recognized as a valuable, enterprise-wide asset. The DM principles within the DM framework are supported by policies, standards, and controls.
DM policies are short statements of management intent and fundamental rules governing data from creation through to destruction. Standards and controls are the measures and mechanisms by which the enterprise maintains the acceptable performance of its DM.
A Data Management (DM) professional is any person who works in any facet of DM (from technical management of data throughout its lifecycle to ensuring that data is properly utilized and leveraged) to meet strategic organizational goals. DM professionals fill numerous roles, from the highly technical (e.g., database administrators, network administrators, programmers) to strategic business (e.g., Data Stewards, Data Strategists, Chief Data Officers).
Enterprise Data Architecture defines standard terms and designs for the elements that are important to the organization. The design of an Enterprise Data Architecture includes a depiction of the business data itself, as well as of the collection, storage, integration, movement, and distribution of data.
Ideally, Data Architecture should be an integral part of enterprise architecture. If there is not an enterprise architecture function, a Data Architecture team can still be established. Under these conditions, an organization should adopt a framework that helps articulate the goals and drivers for Data Architecture. These drivers will influence approach, scope, and roadmap priorities.
The principal goals are to:
Data Storage and Operations includes the design, implementation, and support of stored data, to maximize its value throughout its lifecycle, from creation/acquisition to disposal.
Data Storage and Operations includes two sub-activities:
Database administrators (DBAs) play key roles in both aspects of data storage and operations. The role of DBA is the most established and most widely adopted data professional role, and database administration practices are perhaps the most mature of all data management practices. DBAs also play dominant roles in data operations and data security.
Data modelling is the process of discovering, analyzing, and scoping data requirements, and then representing and communicating these data requirements in a precise form called the data model. Data modelling is a critical component of data management. The modelling process requires that organizations discover and document how their data fits together. The modelling process itself designs how data fits together.
Data models depict an organization’s data assets and enable the organization to understand them. Data models exist at three levels of detail: conceptual, logical, and physical. Each model contains a set of components; examples of components are entities, relationships, facts, keys, and attributes. Once a model is built, it needs to be reviewed and, once approved, maintained.
Within the context of databases, a data definition language (DDL) is used to define a data model; a data manipulation language (DML) is used to insert, update, and delete data records; and a data control language (DCL) is used to control access to data. Data query languages (DQLs) simplify the process of retrieving data records that meet the criteria of interest (e.g., all customers in arrears) without the need to know how the data is stored in the database.
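The division of labor between these sub-languages can be illustrated with an in-memory SQLite database. This is a minimal sketch; the table and column names are hypothetical, and DCL (GRANT/REVOKE) is omitted because SQLite has no user accounts.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: define the data model.
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, balance REAL)")

# DML: insert and update data records.
conn.execute("INSERT INTO customer (id, name, balance) VALUES (1, 'Acme Ltd', -250.0)")
conn.execute("INSERT INTO customer (id, name, balance) VALUES (2, 'Globex', 100.0)")
conn.execute("UPDATE customer SET balance = 120.0 WHERE id = 2")

# DQL: retrieve records meeting criteria of interest (customers in arrears)
# without needing to know how the rows are physically stored.
in_arrears = conn.execute("SELECT name FROM customer WHERE balance < 0").fetchall()
print(in_arrears)  # → [('Acme Ltd',)]
```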
Reference data is used to organize or categorize other data, or to relate data to information both within and beyond the boundaries of the enterprise. It usually consists of codes and descriptions or definitions, for example, an Order Status of New, In Progress, Closed, or Cancelled. It may incorporate external standards (e.g., postal address or industry classification); it may be used for computation (e.g., currency exchange rates); it may also be quite abstract (e.g., a synonym list).
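A minimal sketch of the Order Status example above: reference data holds the codes and their descriptions, while transactional records carry only the code. The code values and record fields here are hypothetical.

```python
# Reference data: codes with descriptions, used to categorize other data.
ORDER_STATUS = {
    "NEW": "New",
    "INP": "In Progress",
    "CLS": "Closed",
    "CAN": "Cancelled",
}

# A transactional record references the code, not the description.
order = {"order_id": 1042, "status": "INP"}
print(ORDER_STATUS[order["status"]])  # → In Progress
```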
Master Data, which encompasses reference data, is data about the business entities (e.g., employees, customers, products, financial structures, assets, and locations) that provide context for business transactions and analysis. Entities are represented by entity instances, in the form of data / records. Master Data should represent the authoritative, most accurate data available about key business entities. When managed well, Master Data values are trusted and can be used with confidence.
In most organizations, various projects and initiatives, mergers and acquisitions, and other business activities result in multiple systems executing much the same functions, isolated from or uncoordinated with each other. These conditions inevitably lead to inconsistencies in data structure and data values between systems. Both can be reduced through the management of Master Data and Reference Data.
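One way such inconsistencies are reduced is by consolidating variant records into a single authoritative "golden" record. The sketch below shows a toy survivorship rule (prefer the longer, more complete value); the systems, field names, and rule itself are hypothetical, and real Master Data tooling uses far richer matching and survivorship logic.

```python
# Two systems hold variants of the same customer.
crm = {"name": "ACME LTD", "phone": None}
billing = {"name": "Acme Limited", "phone": "+44 20 7946 0000"}

# Survivorship rule: for each field, keep the longer (more complete) value.
golden = {
    field: max((crm[field], billing[field]), key=lambda v: len(v or ""))
    for field in crm
}
print(golden)  # → {'name': 'Acme Limited', 'phone': '+44 20 7946 0000'}
```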
An Enterprise Data Warehouse (EDW) is a combination of two primary components:
Data is placed in the EDW through a so-called “Extract, Transform, Load” or ETL process. A properly designed ETL system extracts data from the source systems, enforces data quality and consistency standards, and conforms data so that separate sources can be used together.
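The extract, transform, and load stages can be sketched as three small functions. This is an illustrative toy, not a warehouse implementation: the source systems, field names, and conformance mapping are all hypothetical.

```python
def extract():
    # Pull rows from two hypothetical source systems.
    crm = [{"customer": "Acme", "country": "UK"}]
    billing = [{"customer": "Globex", "country": "United Kingdom"}]
    return crm + billing

def transform(rows):
    # Conform inconsistent source values to one standard (ISO country
    # codes) so the separate sources can be used together.
    country_map = {"UK": "GB", "United Kingdom": "GB"}
    return [{**r, "country": country_map.get(r["country"], r["country"])} for r in rows]

def load(rows, warehouse):
    # Append the conformed rows to the target store.
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # both rows now share the conformed country code "GB"
```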
Big data is a term that has become associated with warehousing where the volume of data poses challenges unique to large scales, which are often solved through a range of techniques including parallel and distributed processing.
In many respects, big data can be conceived simply as “bigger” data. Despite Moore’s law and the rapid progression of technology, the amount of data that can be captured, processed, and analyzed, relative to the amount of data in the observable world, is small.
Analytics, which is mostly associated with the decision-making objectives of data management, is really part of decision support. An evolution of the decision support arena, analytics encompasses all creation, cultivation, and curation of insights.
Integration and Interoperability describes processes related to the movement and consolidation of data within and between data stores, applications, and organizations. Integration consolidates data into consistent forms, either physical or virtual.
Data Interoperability is the ability of multiple systems to communicate. Data Integration and Interoperability (DII) solutions enable basic data management functions on which most organizations depend:
Document and Content Management entails controlling the capture, storage, access, and use of data and information stored outside application (primarily relational) databases.
Its focus is on maintaining the integrity of, and enabling access to, documents and other unstructured or semi-structured information, which makes it roughly equivalent to data operations management for relational databases. The key goals are to:
In many organizations, unstructured data has a direct relationship to structured data. Management decisions about such content should be applied consistently.
In addition, as with other types of data, documents and unstructured content are expected to be secure and of high quality. Ensuring security and quality requires governance, reliable architecture, and well-managed Metadata.
Metadata describes data (e.g., databases, data elements, data models), the concepts the data represents (e.g., business processes, application systems, software code, IT infrastructure), and the connections (relationships) between the data and concepts. An example of metadata is the information on a catalogue card in a library describing a book, or a list of column names in a database table.
Metadata management is the end-to-end process and governance framework for creating, controlling, enhancing, attributing, defining, and managing metadata. The Dewey Decimal Classification is a metadata management system developed for libraries.
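The "list of column names in a database table" example of metadata can be made concrete: most databases expose their own catalogue, which is metadata describing the data they hold. A small sketch using SQLite's `PRAGMA table_info` (the `book` table is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE book (isbn TEXT, title TEXT, published INTEGER)")

# Each row returned by PRAGMA table_info is metadata describing one
# column: here we keep the name and declared type.
columns = [(row[1], row[2]) for row in conn.execute("PRAGMA table_info(book)")]
print(columns)  # → [('isbn', 'TEXT'), ('title', 'TEXT'), ('published', 'INTEGER')]
```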
No organization has perfect business processes, perfect technical processes, or perfect data management practices; consequently, all organizations experience problems related to the quality of their data. However, organizations that formally manage the quality of data have fewer problems than those that leave data quality to chance.
Formal data quality management is like continuous quality management for other products. It includes managing data through its lifecycle by setting standards, building quality into the processes that create, transform, and store data, and measuring data against standards. Managing data to this level usually requires a Data Quality program team.
The Data Quality program team is responsible for engaging both business and technical data management professionals and driving the work of applying quality management techniques to data to ensure that data is fit for consumption for a variety of purposes. The team will likely be involved with a series of projects through which they can establish processes and best practices while addressing high priority data issues.
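Measuring data against a standard, as described above, can be as simple as computing a rule-based metric. Below is a toy completeness check (the records, field, and metric are hypothetical; real data quality programs track many such dimensions, e.g., accuracy, validity, and timeliness):

```python
# Completeness of a mandatory field, expressed as a percentage of
# records in which the field is populated.
records = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 3, "email": "c@example.com"},
]

populated = sum(1 for r in records if r["email"])
completeness = 100.0 * populated / len(records)
print(round(completeness, 1))  # → 66.7
```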
Data Security includes the planning, development, and execution of security policies and procedures to provide proper authentication, authorization, access, and auditing of data and information assets.
The specifics of data security (which data needs to be protected, for example) differ between industries and countries. Nevertheless, the goal of data security practices is the same: To protect information assets in alignment with privacy and confidentiality regulations, contractual agreements, and business requirements.
The goals of data security activities include:
Monitoring and logging are fundamental to the exercise of authority, control, and shared decision making (for planning, design, operations monitoring, and enforcement) over the management of all enterprise data assets.
Monitoring and logging are intrinsic to many aspects of data management including:
Operationally, real-time database monitoring tools automate monitoring of key metrics, such as capacity, availability, cache performance, user statistics, etc., and alert administrators of database issues.
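The alerting logic such tools apply can be sketched as a threshold check over sampled metrics. The metric names and limits below are hypothetical; production monitoring tools add sampling, trending, and notification on top of this core idea.

```python
# Alert when a sampled metric exceeds its configured limit.
LIMITS = {"capacity_pct": 90, "cache_miss_pct": 20}
sample = {"capacity_pct": 94, "cache_miss_pct": 12}

alerts = [name for name, limit in LIMITS.items() if sample[name] > limit]
print(alerts)  # → ['capacity_pct']
```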
More broadly, a monitoring strategy will identify what needs to be monitored and how it will be done (e.g., native resource monitoring).
Congratulations, you are done with the assessment questions!
In order to better interpret your responses and further understand your perspective, please answer the following questions.