Target Information Model: Mastering data architecture to enable digital transformation
Organizations across industries are accelerating their implementations of digital technologies as internal and external demands place greater stress on information pipelines. Customers want more access to information, regulators require greater transparency, and executives need better analytics to run their businesses. But before organizations can actually begin to digitally transform to meet these demands, they first must understand what needs to change. In this article, Nicolas Papadakos highlights the crucial role of a target information model (TIM), which serves not only as a foundational guideline for data management and storage, but also as the underlying communication and collaboration structure of the entire organization.
Whether focused on finance, commodities or real estate, businesses are seeking to transform their infrastructures, processes, reporting and customer interactions through digital technologies. Risk mitigation, the impact of hiring millennials, internal forecasting and reporting, investor and client demands, along with the desire to capture efficiencies while minimizing headcount, are crucial factors in driving adoption and testing of digital technologies. However, with challenges on several levels, many of these digital endeavors fail to get off the ground or achieve what they set out to accomplish.
A big driver of this phenomenon is the ability, or lack thereof, to master data management. Without the proper information models in place, organizations find it challenging to define or develop an information architecture (centralized or distributed) that allows for more effective management, storage, reporting and reconciliation of data. What may be even more problematic is articulating an enterprise-level plan for achieving related progress.
Unlocking the Power of Information
The information model is not simply data management. It is the gas powering the operating model's engine, enabling organizations to communicate more effectively and reach their specific goals. When an information model is properly established, it provides organizations with two distinct benefits. First, it defines how, where, in what format and for what usage an organization manages and stores data from a technical and systemic point of view, as well as how that translates into human understanding and use. That means it solves not just how internal computers store data, but how internal staff and end consumers access, visualize and comprehend data.
Historically, organizations have relied on static information models that tell the enterprise where the information lies, how it is structured and where it lives (i.e., a system of record). Today, a dynamic information model is more desired because it offers greater business value.
A dynamic information model shows where data is generated inside and outside the company, how it moves through integration layers, and where it is disseminated across the enterprise. If a trade is initiated with basic information, different parts of the organization need to understand who initiated the trade, where it is going, whether any calculations were updated so that reporting remains accurate, and whether the information is easily accessible for future auditing.
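The trade example above can be sketched as a simple lineage log, where every hop through the organization records which system touched the data and what changed, so the trail is auditable later. This is a minimal illustration, not a prescribed implementation; all system and action names are hypothetical.

```python
# Illustrative sketch of a dynamic information model for a single trade:
# each hop records who touched the data and what changed, so lineage is
# auditable later. System and action names are hypothetical.
from datetime import datetime, timezone

def record_hop(lineage: list[dict], system: str, action: str, detail: str) -> None:
    """Append one lineage entry with a UTC timestamp."""
    lineage.append({
        "system": system,
        "action": action,
        "detail": detail,
        "at": datetime.now(timezone.utc).isoformat(),
    })

trade_lineage: list[dict] = []
record_hop(trade_lineage, "trade_capture", "initiated", "trader books swap")
record_hop(trade_lineage, "integration_layer", "enriched", "settlement instructions added")
record_hop(trade_lineage, "risk_engine", "recalculated", "PV updated after curve refresh")

# Which systems updated calculations? Useful when verifying reporting accuracy.
updates = [h["system"] for h in trade_lineage if h["action"] == "recalculated"]
print(updates)  # ['risk_engine']
```

A real implementation would persist these entries in a lineage store rather than an in-memory list, but the principle is the same: the model records not just where data lives, but how it moves.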
Mismanaged or poorly implemented information models not only fail to provide value, they also slow down the overall system or process in place. Latent information can be almost as risky as inaccurate or missing information. The right information must be available, unnecessary data must be purged, and cross-system taxonomy translations must be actively maintained to eliminate duplicate information created in parallel. With standards in place, organizations can achieve cross-firm optimization, while provisioning existing data as a value-add to business areas at a lower cost.
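The cross-system taxonomy translation and de-duplication described above might look like the following sketch, assuming two systems that name the same trade fields differently. All field names and mappings are invented for illustration, not drawn from any real vendor schema.

```python
# Sketch of cross-system taxonomy translation: map each system's field
# names onto one firm-wide canonical taxonomy, then drop duplicates
# created in parallel. All names here are illustrative assumptions.

# Mapping from System A's field names to the canonical taxonomy.
SYSTEM_A_TO_CANONICAL = {
    "cpty": "counterparty",
    "trd_dt": "trade_date",
    "notional_amt": "notional",
}

# Mapping from System B's field names to the same canonical taxonomy.
SYSTEM_B_TO_CANONICAL = {
    "counterpartyName": "counterparty",
    "tradeDate": "trade_date",
    "notionalAmount": "notional",
}

def to_canonical(record: dict, mapping: dict) -> dict:
    """Translate a system-specific record into canonical field names."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

def deduplicate(records: list[dict]) -> list[dict]:
    """Drop duplicate records created in parallel by different systems."""
    seen, unique = set(), []
    for r in records:
        key = (r["counterparty"], r["trade_date"], r["notional"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

a = to_canonical({"cpty": "ACME", "trd_dt": "2024-01-02", "notional_amt": 1000000},
                 SYSTEM_A_TO_CANONICAL)
b = to_canonical({"counterpartyName": "ACME", "tradeDate": "2024-01-02",
                  "notionalAmount": 1000000}, SYSTEM_B_TO_CANONICAL)
print(deduplicate([a, b]))  # one canonical record survives
```

The design point is that the mappings themselves are the part of the information model that "must be actively maintained": when either system changes its schema, only its mapping changes, not every downstream consumer.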
The second benefit of an information model relates to permissions to view, write over and read the information. The information model should determine who must provide information, to whom and when, and whether that information gates someone else's ability to access or use it. This process includes verifications such as technical reconciliations, which ensure certain batch processes are completed, and operational reconciliations, which confirm the accuracy of data elements.
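The two verification types mentioned above can be sketched minimally: a technical reconciliation confirms that expected batch processes actually completed, while an operational reconciliation compares data element values between two systems. The batch names, fields and tolerance are hypothetical.

```python
# Minimal sketch of the two reconciliation types, with hypothetical data:
# technical = did the batch processes complete; operational = do the data
# elements agree between two systems within a tolerance.

def technical_reconciliation(expected: set[str], completed: set[str]) -> set[str]:
    """Return batch processes that were expected but never completed."""
    return expected - completed

def operational_reconciliation(source: dict, target: dict,
                               tolerance: float = 0.01) -> list[str]:
    """Return data elements whose values disagree beyond a tolerance."""
    return [k for k in source if abs(source[k] - target.get(k, 0.0)) > tolerance]

missing = technical_reconciliation({"eod_prices", "positions"}, {"eod_prices"})
breaks = operational_reconciliation({"pnl": 100.0, "notional": 5000.0},
                                    {"pnl": 100.0, "notional": 4990.0})
print(missing)  # {'positions'}
print(breaks)   # ['notional']
```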
The information model further informs what each area does and how they all relate to each other. Without it, organizations can run into several issues, including not knowing whether a specific piece of data is even captured or available, whether the person accessing the information can understand it, whether people know where to access the data, or whether those accessing the information should actually have permission to do so. Inefficient data access becomes a major issue for organizations that are overhauling their data taxonomies and information model to meet new regulations (e.g., MiFID II or FRTB), or for those relying on real-time sensor data to augment control center operations at power plants, pipelines or air traffic facilities.
Aligning with a Business Strategy Framework
A target information model (TIM) defines an organization's data architecture by considering it from different viewpoints, such as:
- Operations—How, where and when data is used and propagated, and by whom
- Governance—Which staff members should have access to specific data and who is responsible for input, verification and oversight
- Content—Data that must be included, tested, checked, scrubbed and transformed
- Quality—Accuracy and minimum viability for usage and requirements for validity
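One way to make these four viewpoints concrete is to attach them as metadata to each data element in a catalog. The sketch below is an illustrative assumption about how such a catalog entry might be structured, not a prescribed schema; all field and role names are invented.

```python
# Hypothetical catalog entry carrying the four TIM viewpoints as metadata.
from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str
    # Operations: where the element is produced and consumed
    produced_by: str
    consumed_by: list[str]
    # Governance: who may read or write, and who is accountable
    readers: set[str]
    writers: set[str]
    owner: str
    # Content: transformations the data must undergo before use
    transformations: list[str] = field(default_factory=list)
    # Quality: minimum viability rule for validity
    validity_rule: str = "not null"

trade_notional = DataElement(
    name="notional",
    produced_by="trade_capture",
    consumed_by=["risk", "finance"],
    readers={"risk", "finance", "audit"},
    writers={"trade_capture"},
    owner="front_office_data_steward",
    transformations=["convert to USD"],
    validity_rule="positive number",
)

def can_read(element: DataElement, role: str) -> bool:
    """Governance check: may this role view the element?"""
    return role in element.readers

print(can_read(trade_notional, "audit"))  # True
```

Encoding the viewpoints this way means governance questions ("who should see this?") and operations questions ("who produces this?") are answered from the same record, rather than from tribal knowledge.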
To address the underlying data management, storage and communication issues, it is important to embrace an overall business strategy framework as a first step. The framework is a set of defined architectural principles that enable change, transformation and growth. These principles include the target operating model (TOM), which illustrates how a business is organized and performs tasks; the target architecture model (TAM), or the systemic architecture that underlies the technology and systems a firm uses to complete tasks; and the TIM, which shows how a firm stores, manages, comprehends and views the flows of data across the organization.
Not meant to exist in isolation, the TIM prescribes guidelines and principles for an overarching data architecture, and is interconnected with the TOM and TAM. Without these principles and a clear line of sight into the current-state operating model, it is difficult to create the information model. And without a TIM, an organization cannot define its TAM.
A joint process across business lines can create an information hub and eliminate fragmented systems and processes. The better organizations can master these principles and models, the better they can optimize the speed and sharing of information without sacrificing security, privacy or controls.
Different Pressures, Same Need
Outside of the internal reporting and external regulatory pressures, there are other challenges affecting data flows.
One big pain point for financial institutions and commodities companies is their clients' desire to go digital. Clients want instant access to information, portfolio performance or home energy consumption on their smartphones, on a website and in an email. They want to understand what the information means to them both now and in the future, making the manner in which this information is displayed a vital component. But how can a company deliver information that is both timely and personally meaningful if many of its current processes for distributing insights are batch-driven? How can an institution deliver predictive information when it is unsure of the state of the data it relies on?
Another key challenge is simply the sheer breadth of operations of banks, asset managers, and energy and commodities companies. Most of these organizations span many different lines of business, often with multiple subsidiaries that carry out similar work across geographies. To support that structure, organizations typically have canonical data models backing each business line or subsidiary. For example, if an organization has one ledger or system, they may have different views of that system to support different areas of the company (e.g., institutional, retail, mortgage lending) across geographies.
Similarly, a commodities trading firm may use multiple trade capture systems for various commodities, such as power and gas in one system, crude oil in another, and financial interest rates in a third system. Or it may have different trading systems for North American and European power that require a comprehensive view of global exposure. How can an organization compile that information for internal analytics (e.g., assets, liabilities, P&L), compliance and auditing if there is not a unified or common view? This becomes an issue of translation and concatenation, leading to a miscomprehension of a company’s overall risk.
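The unified-view problem described above amounts to normalizing records from each trade capture system into a common shape before aggregating. The sketch below assumes two invented system formats; every field name and figure is illustrative, not a real exposure methodology.

```python
# Hedged sketch: compiling a global exposure view from several trade
# capture systems that each report positions in their own shape.
# System names, record formats and values are invented for illustration.
from typing import Callable

def normalize_power_gas(rec: dict) -> dict:
    """System 1: power and gas positions keyed by hub region."""
    return {"commodity": rec["cmdty"], "region": rec["hub_region"],
            "exposure": rec["mw_value"]}

def normalize_crude(rec: dict) -> dict:
    """System 2: crude oil positions keyed by basin."""
    return {"commodity": "crude_oil", "region": rec["basin"],
            "exposure": rec["bbl_value"]}

def global_exposure(feeds: list[tuple[Callable, list[dict]]]) -> dict:
    """Aggregate exposure per commodity across all source systems."""
    totals: dict[str, float] = {}
    for normalize, records in feeds:
        for rec in records:
            n = normalize(rec)
            totals[n["commodity"]] = totals.get(n["commodity"], 0.0) + n["exposure"]
    return totals

exposure = global_exposure([
    (normalize_power_gas, [
        {"cmdty": "power", "hub_region": "ERCOT", "mw_value": 120.0},
        {"cmdty": "gas", "hub_region": "Henry Hub", "mw_value": 80.0},
    ]),
    (normalize_crude, [{"basin": "Permian", "bbl_value": 50.0}]),
])
print(exposure)  # {'power': 120.0, 'gas': 80.0, 'crude_oil': 50.0}
```

The per-system normalizers are exactly the "translation" layer the article refers to; without an agreed common view, each consumer would invent its own concatenation and the firm's aggregate risk would be computed inconsistently.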
Furthermore, areas like over-the-counter (OTC) derivatives, securities clearing and collateral, and industry bodies such as SWIFT and FpML are introducing new data taxonomies, languages and standards. This requires organizations not only to normalize their current proprietary information models, but also to adhere to industry-wide standards. It also requires that organizations be able to define what information should be shared, how and when, while avoiding sharing it with the wrong person, at the wrong time or with the wrong data.
Defining a TIM requires an understanding of the organization’s specific business needs and challenges. Whether data within the organization is changing frequently, or that data is trapped within multiple underlying vendor systems or off-the-shelf solutions, there are options that make sense depending on the organization’s specific structure and business requirements. These factors will ultimately determine how to implement the information model from both a technical and business requirements perspective to establish a common view of information across the enterprise.
Nicolas Papadakos is a Director at Sapient Global Markets in Houston, focusing on data. He has more than 20 years of experience across financial services, energy and transportation. Niko joined Sapient Global Markets in 2004 and has led project engagements in key accounts involving data modeling, reference and market data strategy and implementation, information architecture, data governance and data quality.