A call for open-source data standards
Collaboration a must for financial market players to make better use of information deluge
The Asset 19 Jan 2023

Financial industry players must step up collaboration towards a more efficient system of data exchange and management to exploit and connect massive amounts of information to produce new insights across firms and markets, a new report says.

The Depository Trust & Clearing Corporation (DTCC), a New York-based post-trade services provider, calls for open-source data standards, which would allow more users to access broader data sets more easily and put them to use much faster.

In a white paper, Data Strategy & Management in Financial Markets, DTCC notes that current data exchange standards typically assume point-to-point communication with asset class-specific and inflexible formats, as well as bespoke data models.

This has limited the ability of firms to explore the interlinkages of data. Meanwhile, upgrades, including expansion or harmonization of data fields, require lengthy consultation processes, industry consensus, and costly implementation.

Thus, there is a need for more flexible data sharing, enabling data producers to send data to many users, or users to retrieve data at their convenience, in a standard format. This flexibility will let users create simpler workflows and lower their technology spend, which in turn will foster advanced analytics and innovation, the report says.
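As a rough illustration of the one-to-many pattern the report describes, the sketch below assumes a hypothetical in-memory "topic" to which a producer publishes records once in a single standard format, and from which any number of consumers pull at their own pace; the class and field names are illustrative and not taken from the DTCC paper.

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class Topic:
    """A hypothetical shared channel: one producer, many independent consumers."""
    records: list[dict[str, Any]] = field(default_factory=list)

    def publish(self, record: dict[str, Any]) -> None:
        # The producer publishes once, in one standard format,
        # instead of sending bespoke files to each counterparty.
        self.records.append(record)

    def fetch(self, since: int = 0) -> list[dict[str, Any]]:
        # Each consumer retrieves data at its own convenience,
        # tracking its own position in the stream.
        return self.records[since:]


# Illustrative standard record; the fields are assumptions, not a real standard.
trades = Topic()
trades.publish({"trade_id": "T-001", "isin": "US0378331005", "qty": 100, "price": 172.5})

risk_team = trades.fetch()       # one consumer
reporting_team = trades.fetch()  # another consumer, same data, no bespoke feed
print(len(risk_team), len(reporting_team))
```

The contrast with the point-to-point formats criticized above is that nothing in this pattern is specific to a single recipient or asset class: new consumers can be added without the producer changing anything.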

Data silos

“As new technological advancements, including broad adoption of cloud technology, spark an evolution of global markets, the financial services industry has an opportunity to reimagine how data is exchanged and managed across financial markets and firms,” says Kapil Bansal, managing director, head of business architecture, data strategy and analytics at DTCC.

“For many years, companies have collected massive stores of data, but the siloed nature of data and the need for better data quality limited the ability of market participants to extract strategic insights to support more effective decision-making. We’re now at a unique moment where we can make this vision a reality, but long-term success hinges on market participants taking action to ensure their data strategy can meet the demands of a highly digitalized and interconnected marketplace.”

Currently, information in the form of metadata (i.e., descriptive data that captures attributes of the underlying data) is often missing or embedded in the specific data stores of applications, which significantly limits how broadly the data can be used and re-used in new ways.

The emerging best practice is to store data, including metadata, separately, in dedicated locations, akin to a virtual library, so it can be accessed by many applications for many purposes, the white paper says.

In addition, “data tagging or cataloguing” can be applied to provide additional context to data items (e.g., privacy attributes). This would let users within an organization innovate without having to search for data the organization already possesses. These enhanced data tags can also be used to allow external parties to discover the properties of proprietary data sets without the need to “see” the actual data.
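A minimal sketch of that cataloguing idea follows, assuming a simple in-memory catalogue; the tag names (such as "contains_pii") and dataset entries are illustrative only. The point is that the catalogue exposes descriptive metadata separately from the data itself, so a dataset can be discovered by its properties without the searcher ever touching the underlying records.

```python
from dataclasses import dataclass


@dataclass
class CatalogueEntry:
    """Descriptive metadata held separately from the data it describes."""
    name: str
    owner: str
    tags: set[str]   # e.g. privacy or asset-class attributes
    location: str    # pointer to where the actual data lives


# Hypothetical catalogue entries; no underlying data is stored or exposed here.
catalogue = [
    CatalogueEntry("settlement_fails_2022", "ops", {"transactions", "contains_pii"}, "s3://bucket/fails/"),
    CatalogueEntry("corporate_actions_ref", "ref-data", {"reference", "public"}, "s3://bucket/corp-actions/"),
]


def discover(required_tags: set[str]) -> list[str]:
    # Internal or external users query the metadata alone, learning what
    # exists and its properties without "seeing" the records themselves.
    return [entry.name for entry in catalogue if required_tags <= entry.tags]


print(discover({"reference"}))  # -> ['corporate_actions_ref']
```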

Data quality

At present, many data sets that financial institutions rely upon are not of the quality needed to support decision-making, let alone automated decision-making, DTCC says.

Data quality is often difficult to ascertain – more so in the case of commercial data, which, to date, have not undergone as much scrutiny as risk and regulatory data.

At worst, incomplete or unvalidated data sets are at times used to support business decision-making. At best, an organization can expect lengthy and costly analytics development lifecycles, which often result in standard aggregated reports – not dynamic and predictive analytics tools.
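The kind of automated quality gate this implies can be as simple as the sketch below, written against a hypothetical list-of-dicts data set; the completeness and validity rules shown are illustrative assumptions, not rules from the white paper.

```python
from typing import Any

REQUIRED_FIELDS = ("trade_id", "isin", "qty", "price")  # assumed schema


def quality_report(rows: list[dict[str, Any]]) -> dict[str, float]:
    """Flag incomplete or implausible rows before they feed decision-making."""
    total = len(rows)
    incomplete = sum(1 for r in rows if any(r.get(f) is None for f in REQUIRED_FIELDS))
    invalid = sum(1 for r in rows if (r.get("qty") or 0) <= 0 or (r.get("price") or 0) <= 0)
    return {
        "rows": float(total),
        "incomplete_pct": 100 * incomplete / total if total else 0.0,
        "invalid_pct": 100 * invalid / total if total else 0.0,
    }


sample = [
    {"trade_id": "T-001", "isin": "US0378331005", "qty": 100, "price": 172.5},
    {"trade_id": "T-002", "isin": None, "qty": -5, "price": 98.0},  # fails both checks
]
print(quality_report(sample))
```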

For financial institutions to become faster moving and more agile, they will need the ability to perform larger volume and more flexible ad-hoc analyses, at lower cost.

Additionally, sharing information across firms can support new products and greater resilience – for example, more robust risk analyses and models. This need has in the past often been addressed by data brokers and vendors, who, for a large premium, turn consumers’ own information into usable analytics.

Increased flexibility

DTCC’s white paper details four scenarios that will drive how data are used in financial markets in the future:

  • Data will be more accessible and secure. Data users will have increased flexibility in determining how and what data are received at desired times. To enable this, data governance, privacy and security will need to be prioritized.
  • Interconnected data ecosystems as a new infrastructure layer for the financial industry. Industry participants will free their own data from legacy systems and be able to pool them into data ecosystems and connect those ecosystems to others. This will reduce duplication of data across the industry and allow for the co-development of innovative data insights.
  • Increased capacity to focus on data insights. More efficient data management, cloud-enabled capabilities, and further automation of routine data management tasks will free up capacity and accelerate time to market for new product development, allowing specialized data analysts and data operations teams to focus on deriving insights from vast stores of data.
  • Ubiquity of open-source data standards. It is anticipated that the industry will continue to adopt more standards around data models, with the most viable use cases being reference and transaction reporting data. This will result in increased operational efficiency and better data quality.

To enable these changes, the white paper suggests that institutions which produce and consume significant amounts of data embed key principles into their data operating models, including:

  • Establishing robust foundational data management capabilities, including having a thorough understanding and catalogue of data, breaking down data silos and implementing robust data quality practices.
  • Supporting strong data governance, including the right set of data privacy and security standards to enable data collaboration with partners.
  • Exploring where there is mutual benefit from collaborative data environments across firms and the industry to advance interoperability.

Applying these principles will help market participants gain access to data that are trapped or underutilized today and allow for new and faster insights, DTCC says.

“Building the future of data exchange and management will require close consultation and coordination among industry participants and service providers, including standardization in how data is managed and shared,” Bansal stresses.
