First it was the enterprise data model, then the data warehouse, enterprise architecture, and finally master data management that became the consultancy mega-implementation of the day. Each has run its initial course, at times with very limited results. Some organizations have abandoned methods that demand high levels of investment with only the promise of future payback. Today, the pace of change requires incremental, quick-paced action to implement something of value.

To this end, Estrada utilizes a backdrop of conceptual materials such as reference architectures (RAs) and builds micro-methods to facilitate and innovate quickly with clients to establish incremental capabilities. An example to consider from the California Enterprise Architecture Framework (CEAF) is below: a Business Intelligence Reference Architecture within the CEAF.

The problem with these RAs is the difficulty of determining what to put in place and why, the difficulty of determining the right order for delivering incremental benefit, and the difficulty of implementing meaningful pieces that can aggregate into something operationally beneficial. Estrada, then, has defined many smaller methods (some of them below) to address this situation. Consider the following as a set of micro-methods.

Domain Analysis: each business has a defined set of required results. By understanding the organization’s required results, and what others in the industry see as the key information needed to achieve similar results, this activity produces comparison benchmarks that help the organization know how far it is from its required achievement (gap analysis) and the rate at which change is likely in its business domain. This method defines organization performance needs, available benchmarks, performance gaps, and a recommended pace for closing the gaps in a priority sequence.
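
As an illustration only, the gap analysis at the heart of this method can be as simple as the following Python sketch; the measures, benchmark values, and priority rule are hypothetical placeholders, not Estrada deliverables.

    # A minimal sketch of a benchmark gap analysis with hypothetical measures.
    measures = [
        {"measure": "permit turnaround (days)",  "current": 18.0, "benchmark": 10.0},
        {"measure": "first-call resolution (%)", "current": 62.0, "benchmark": 80.0},
    ]

    # Gap between current performance and the industry benchmark.
    for m in measures:
        m["gap"] = m["current"] - m["benchmark"]

    # A simple priority rule for illustration: larger relative gaps are closed first.
    for m in sorted(measures, key=lambda m: abs(m["gap"]) / m["benchmark"], reverse=True):
        print(f"{m['measure']:<28} current={m['current']:>5} "
              f"benchmark={m['benchmark']:>5} gap={m['gap']:+.1f}")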

Data Availability Diagnostic: by comparing the responsibilities of the organization with their associated outcome expectations, this activity drills down to the total set of systems that support operations. The activity includes identifying the data entities, owned by the organization or by others, that are needed to perform as required and achieve results. Gaps in system coverage and transaction capability are identified, as are the interfaces with partners required to obtain the necessary data. Data needs can also be prioritized according to the organization strategy. This method defines data gaps, required data sources, and a prioritization for implementation.
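
A minimal sketch of the diagnostic output, assuming hypothetical entities and systems: compare the data the organization needs against what its systems and partners provide today, and rank the gaps by strategic priority.

    # Entity -> priority from the organization strategy (1 = highest).
    required = {
        "customer": 1,
        "permit": 1,
        "payment": 2,
        "inspection_result": 3,
    }
    # Entity -> system or partner interface that supplies it today.
    provided_by = {
        "customer": "CRM",
        "payment": "Finance partner interface",
    }

    # Entities with no current source, ordered by priority.
    gaps = sorted((e for e in required if e not in provided_by), key=required.get)
    print("Data gaps in priority order:", gaps)   # -> ['permit', 'inspection_result']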

Data Integration: integration can be a significant burden if not orchestrated correctly. The right data needs to be obtained at the right time and used in the right way to preserve the integrity of its use. N-squared diagrams were popularized in the federal sector. These are simple diagrams that show data flows between systems, between modules or components of a single system, or both. The most important benefit of such a diagram is that it establishes the dependencies for data. The diagram can also help to identify ambiguity when more than one system claims to be the source for certain data. Once the strategy for data integration is defined and finalized, integration patterns help to identify the few methods acceptable to the parties involved, based on the availability of data and the timing of the need. This method defines the appropriate patterns for data integration for an organization’s situation.
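
For illustration, the following Python sketch builds a small N-squared matrix from a list of data flows and flags the ambiguity described above when more than one system claims to be the source of the same entity; the system names and flows are hypothetical.

    from collections import defaultdict

    # Each flow: (source system, target system, data entity). Hypothetical examples.
    flows = [
        ("CRM",     "Billing",   "customer"),
        ("CRM",     "Analytics", "customer"),
        ("ERP",     "Analytics", "customer"),   # a second claimed source for "customer"
        ("Billing", "Analytics", "invoice"),
    ]

    systems = sorted({s for f in flows for s in f[:2]})

    # Build the N2 matrix: cell [row][col] lists entities flowing row -> col.
    matrix = {src: {dst: [] for dst in systems} for src in systems}
    for src, dst, entity in flows:
        matrix[src][dst].append(entity)

    # Print the matrix with sources as rows and targets as columns.
    width = max(len(s) for s in systems) + 2
    print("".ljust(width) + "".join(s.ljust(width) for s in systems))
    for src in systems:
        row = "".join(",".join(matrix[src][dst]).ljust(width) for dst in systems)
        print(src.ljust(width) + row)

    # Flag ambiguity: more than one system claiming to be the source of an entity.
    sources_by_entity = defaultdict(set)
    for src, _, entity in flows:
        sources_by_entity[entity].add(src)
    for entity, sources in sources_by_entity.items():
        if len(sources) > 1:
            print(f"Ambiguous source for '{entity}': {sorted(sources)}")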

Data Access: access to data needs to be planned. Security should prevent all access except what is explicitly allowed by the business owner of the data. Access to data, then, is a primary responsibility of data governance. Data access is much like a contract. The contract needs to specify the conditions under which the data is created, protected, and made available, as well as a mandatory notice period before changes can be made to any material part of the data contract. Patterns, timing, volume, frequency, security, semantics, and other topics are valid needs for contract specification. This method defines the appropriate patterns of access for a given data situation in an organization.
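
Such a contract can be captured as a structured record. The sketch below shows one possible shape; the field names (access_pattern, change_notice_days, and so on) and values are assumptions for illustration, not a prescribed template.

    from dataclasses import dataclass

    @dataclass
    class DataAccessContract:
        dataset: str                    # what data is covered
        business_owner: str             # who authorizes access
        consumers: list[str]            # parties explicitly allowed access
        access_pattern: str             # e.g. "API", "batch extract", "replication"
        frequency: str                  # how often data is made available
        expected_volume: str            # rough size per delivery
        security_classification: str    # drives the required controls
        semantics_reference: str        # where field definitions are documented
        change_notice_days: int = 30    # mandatory notice before material changes

    contract = DataAccessContract(
        dataset="customer_master",
        business_owner="Customer Operations",
        consumers=["Billing", "Analytics"],
        access_pattern="API",
        frequency="near real time",
        expected_volume="~50k records/day",
        security_classification="moderate",
        semantics_reference="data-dictionary/customer_master",
    )
    print(contract)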

Data Migration: once an organization has an agreement to obtain data, migration is the physical implementation of the contract to move data from the source to the target, with designed changes applied to the data as it is used in the target system. This method defines approaches and frameworks for use in data migration based on an accepted strategy in given situations.
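
A minimal sketch of one migration step, assuming hypothetical field names and mapping rules: read the source shape, apply the transformations designed and agreed with the owner, and produce the target shape.

    # Hypothetical source records.
    source_rows = [
        {"cust_id": "001", "name": "ACME CORP ", "status": "A"},
        {"cust_id": "002", "name": "globex",     "status": "I"},
    ]

    # A designed change agreed with the data owner.
    STATUS_MAP = {"A": "active", "I": "inactive"}

    def transform(row: dict) -> dict:
        """Apply the transformations designed for the target system."""
        return {
            "customer_id": row["cust_id"].lstrip("0") or "0",
            "customer_name": row["name"].strip().title(),
            "lifecycle_status": STATUS_MAP[row["status"]],
        }

    target_rows = [transform(r) for r in source_rows]
    for r in target_rows:
        print(r)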

Data Quality: data quality is often a struggle for organizations. There are principles involved, such as: only the person who knows what the data is supposed to be at the time of its creation should have the ability to change it. When system issues leave data inaccurate, incorrect, or imprecise and correction at the source is not possible, downstream corrections need to be designed and approved by the owners to ensure the data still reflects what was intended. An approved operational process is needed to ensure quality is established and maintained. Refer to the operational diagram below. This method defines the process and recommends methods for data quality based on organization dynamics.
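
A minimal sketch of an approved downstream correction, assuming hypothetical records and rules: only corrections approved by the business owner are applied, and every change is logged so the data still reflects what was intended.

    # Hypothetical records with a source-system coding issue.
    records = [
        {"id": 1, "country": "USA"},
        {"id": 2, "country": "U.S."},
    ]

    # Corrections approved by the business owner of the data.
    approved_corrections = {"country": {"U.S.": "USA", "United States": "USA"}}

    audit_log = []
    for record in records:
        for field_name, mapping in approved_corrections.items():
            old = record.get(field_name)
            if old in mapping:
                record[field_name] = mapping[old]
                audit_log.append({"id": record["id"], "field": field_name,
                                  "from": old, "to": mapping[old]})

    print(records)
    print(audit_log)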

Data Services: services have evolved significantly, yet they still require appropriate design to be useful across an organization. The key is to avoid coupling with unique application functionality. Any store can serve the enterprise if its services remain generic and access to them is appropriately controlled. This method defines appropriate data service standards and techniques for organization implementation.
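
A minimal sketch of a generic data service, with hypothetical entities and consumers: the interface exposes entities and filters rather than application-specific operations, and access is checked against what the business owner has allowed.

    from typing import Any

    # Entity -> consumers explicitly allowed by the business owner.
    ACCESS_POLICY = {"customer": {"Billing", "Analytics"}}

    # A stand-in for any store behind the service.
    STORE = {"customer": [{"id": 1, "segment": "enterprise"},
                          {"id": 2, "segment": "smb"}]}

    def get_entities(consumer: str, entity: str, **filters: Any) -> list[dict]:
        """Generic read: any entity in the store, any consumer allowed by policy."""
        if consumer not in ACCESS_POLICY.get(entity, set()):
            raise PermissionError(f"{consumer} is not authorized for '{entity}'")
        rows = STORE.get(entity, [])
        return [r for r in rows if all(r.get(k) == v for k, v in filters.items())]

    print(get_entities("Analytics", "customer", segment="smb"))

Because the service knows nothing about the applications that call it, the same interface can sit in front of any store without coupling to unique application functionality.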

Information Strategy: there are many occasions where the organization doesn’t know what information is needed to help a process succeed in achieving results. This method has been used in many industries. It is applied with a simple spreadsheet that lists process steps and the roles that perform them. The information necessary for each step is then defined, along with the frequency of the need. Each step and its set of information is also mapped to the system from which the information can be sourced. At times, areas are not supported by systems, and the organization needs to decide on the best strategy to get the information it needs. This method gathers the process and role information, the information needed in the process, the frequency of the need, and the sources for the information.
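
A minimal sketch of the worksheet as data, with hypothetical process content; a spreadsheet serves exactly the same purpose. Steps with no source system surface immediately as items needing a sourcing strategy.

    # One row per (process step, role, information needed, frequency, source system).
    rows = [
        {"step": "Take order",     "role": "Sales rep",  "information": "customer credit limit",
         "frequency": "per order", "source": "ERP"},
        {"step": "Schedule visit", "role": "Dispatcher", "information": "technician availability",
         "frequency": "daily",     "source": None},      # not supported by any system yet
    ]

    for row in rows:
        print(f"{row['step']:<15} {row['role']:<12} {row['information']:<28} "
              f"{row['frequency']:<10} {row['source'] or 'NO SOURCE'}")

    gaps = [r for r in rows if r["source"] is None]
    print("Steps needing a sourcing strategy:", [g["step"] for g in gaps])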

Data Visualization: each type of business will likely have highly variable needs in how data is depicted to users to support rapid analysis. Consider that a set of capabilities might need to be in place concurrently. Dashboards, ad-hoc analytical reporting, published reports, data mining options, highly complex graphics, and other options might need to be demonstrated to help the business decide what best fits its needs. This method provides examples interactively with users to adjust to their needs before final solutions are built.

User-Positioned Data: unique data stores are, at times, needed to serve business objectives. The type and format of data positioned for business use should be designed for appropriate quality attributes, such as availability, security, accuracy, timeliness, accessibility, and fitness for purpose, among others. This method defines the appropriate stores based on organization criteria to meet business needs.

Data Security: once data is categorized, a system of controls needs to be in place to protect it wherever it exists. Data security includes data loss protection that ensures the business Recovery Point Objective (RPO) can be met. This method identifies protection methods and defines a recommended sequence for implementation based on the required level of protection.
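
A minimal sketch of an RPO check, assuming hypothetical timestamps: the age of the most recent recovery point is compared with the business RPO.

    from datetime import datetime, timedelta, timezone

    rpo = timedelta(hours=4)   # business Recovery Point Objective
    last_backup = datetime(2024, 1, 15, 8, 0, tzinfo=timezone.utc)
    now = datetime(2024, 1, 15, 13, 30, tzinfo=timezone.utc)

    exposure = now - last_backup   # data currently at risk of loss
    print(f"Current exposure: {exposure}, RPO: {rpo}, "
          f"{'OK' if exposure <= rpo else 'RPO BREACH'}")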

Data Categorization: categorization is the first step in data security. By following standards like Federal Information Processing Standard (FIPS) Publication 199, the data is evaluated for its potential impact if compromised in any way. The assessed impact allows the organization to establish the appropriate controls for the use of the data and lower operational risk. This method applies categorization to an identified data set.
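
A minimal sketch of FIPS 199-style categorization: each security objective (confidentiality, integrity, availability) is rated LOW, MODERATE, or HIGH, and the overall category is commonly taken as the high-water mark across the three. The data set and ratings shown are hypothetical.

    LEVELS = ["LOW", "MODERATE", "HIGH"]

    def overall_category(confidentiality: str, integrity: str, availability: str) -> str:
        """Return the high-water mark across the three security objectives."""
        return max((confidentiality, integrity, availability), key=LEVELS.index)

    data_set = {
        "name": "customer_master",
        "confidentiality": "MODERATE",
        "integrity": "MODERATE",
        "availability": "LOW",
    }
    print(overall_category(data_set["confidentiality"],
                           data_set["integrity"],
                           data_set["availability"]))   # -> MODERATE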

System Security Designation: once data is categorized, the organization should establish controls using standards like National Institute of Standards and Technology (NIST) Special Publication 800-53. This standard lists many controls helpful in reducing system risk and includes controls for business as well as technology practices. The activity results in a set of recommended controls for the intended system. This method includes a process for iteratively implementing controls and assessing system security designations based on data owner tolerance.
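
A minimal sketch of baseline selection and iteration: the data categorization drives which set of controls is recommended, and each pass reports what remains to be implemented. The control lists are a tiny illustrative subset labeled with real NIST SP 800-53 identifiers, not an authoritative baseline.

    # Illustrative subset only; an actual baseline contains many more controls.
    BASELINES = {
        "LOW":      ["AC-2 Account Management", "AU-2 Event Logging"],
        "MODERATE": ["AC-2 Account Management", "AU-2 Event Logging",
                     "CP-9 System Backup"],
        "HIGH":     ["AC-2 Account Management", "AU-2 Event Logging",
                     "CP-9 System Backup",
                     "SC-8 Transmission Confidentiality and Integrity"],
    }

    def recommended_controls(categorization: str, already_implemented: set[str]) -> list[str]:
        """Controls still to implement for the system's categorization."""
        return [c for c in BASELINES[categorization] if c not in already_implemented]

    # One iteration: the next increment of controls, given what is in place today.
    print(recommended_controls("MODERATE", {"AC-2 Account Management"}))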

Data Hub Strategy: often, organizations implement a solution that acts like a hub to feed data for any need without realizing the drawbacks associated with the structure or operation of the implementation. This method reviews the benefits of using a data hub and outlines appropriate implementations to meet objectives with planned fidelity for data.

Idea Bank: every organization has individuals with ideas that are often very good; however, these ideas might not receive the attention they deserve because of the way they are expressed or explained. The idea bank is the place to capture ideas so that strong ideas from inside the organization are learned and then expanded upon in ways that help them become effective, incremental capabilities. This method initiates the capture of client ideas to feed governance decisions.

Performance Logic for Outcomes: much has been demanded of business outcomes, yet the results specified still seem elusive. Since outcomes are often long-term goals, the logic for achieving them is most important, so that interim measures can demonstrate that the building blocks necessary to reach desired outcomes are incrementally becoming effective. This method defines logic models for outcomes and interim measures to support the logic and demonstrate progress.
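
A minimal sketch of a logic model as data, with a hypothetical outcome and building blocks, each carrying the interim measure used to demonstrate progress.

    logic_model = {
        "outcome": "Reduced permit backlog",
        "building_blocks": [
            {"block": "Online permit intake",
             "interim_measure": "% of applications submitted online"},
            {"block": "Automated completeness check",
             "interim_measure": "% of applications returned for rework"},
            {"block": "Workload-balanced routing",
             "interim_measure": "median days in queue"},
        ],
    }

    for b in logic_model["building_blocks"]:
        print(f"{b['block']:<30} -> {b['interim_measure']}")
    print("Long-term outcome:", logic_model["outcome"])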

Data Ownership Made Easy: most business leaders don’t realize they are responsible for the data within their business operations. Often, they try to place total responsibility for data on the information technology (IT) organization. IT, however, should never take action on data without direction to do so. This method works with business leaders to demonstrate how simple techniques can establish business standards for data, with the IT organization designated as custodian to implement them and data stewards on the business side helping to manage the integrity of the data for business purposes.

Minimum Concepts for Data Governance: governance is nothing more than a system of controls. Today, control is no longer relegated to the highest-level authority dictating what everyone should do and how they should act. A system of governance needs to be transparent and logical, with the ability to use its outcomes as information and data to feed improvement in decision-making capability. This method identifies a few minimum concepts for data governance to establish initially as the baseline for operating in an improved environment for the use of data. Policies, standards, and controls will be defined for incremental implementation according to priorities.

What are your needs? As you define them, we likely have other micro-methods to help you be successful in rapid, small increments.