Research & Innovation

Stevens Professor’s Research on Financial Data Standards Spotlighted in Leading Financial Markets News Site

New position paper is part of a prestigious collaborative research grant from the SWIFT Institute

In 2012, the trader known as the “London Whale” accumulated outsized credit default swap (CDS) positions at a global financial firm, the result of a trading strategy gone awry that led to a reported $6.2 billion in losses.

The episode was one of the most notable miscues in the financial industry following the 2008 financial crisis, and it triggered investigations by multiple U.S. agencies, including the Commodity Futures Trading Commission (CFTC).

But there was a big problem: the regulatory agency struggled to sift through the sheer volume of swaps trading data.

The agency had to work “through a morass of data to work out what happened,” notes Dr. Steven Yang, assistant professor of financial engineering at Stevens Institute of Technology, in a recent article in FTSE Global Markets, a leading publisher and news provider in the capital markets industry. Throughout the article, he explains the dynamics of data standards in the financial industry.

The article, co-authored by Dr. Suzanne Morsfield of Columbia Business School, highlights findings from a recently published position paper, “A Critical and Empirical Examination of Currently-used Financial Data Collection Processes and Standards,” which examines how current standards are used and where financial data standards may be headed.

Their findings are part of a collaborative research effort. In 2014, both were awarded a grant from the SWIFT Institute, the research institute of the Society for Worldwide Interbank Financial Telecommunication (SWIFT), to study financial standards and big data issues related to financial information quality. The SWIFT Institute brings together members of industry and academia to foster independent research addressing future needs in global financial services.

Searching for answers in financial data

The ability to analyze voluminous data sets quickly and extract value from them is critical to regulators: it helps them identify anomalies in financial disclosures and protect markets and investors.

Yet the financial data standards in use today, among them FIX, FpML, ISO 15022/20022, eXtensible Business Reporting Language (XBRL) and swaps data repositories, do not share a common nomenclature, asserts Dr. Yang, whose research sits at the intersection of XBRL technology, text analytics and finance.

Messages exchanged between financial institutions to execute transactions “need to be in the same language”; otherwise, there is room for error. And because big data is as ubiquitous in banking and financial markets as it is in information technology, more robust financial data standards are increasingly relevant to an evolving industry.
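To illustrate the nomenclature gap Dr. Yang describes, consider how the same trade details can be encoded under different field names in different standards. The sketch below is a hypothetical Python example, not taken from the paper or the article: the FIX tags shown (55 for symbol, 54 for side, 38 for quantity) are standard, but the FpML-style element names are simplified for illustration, and the common model is an invented one.

```python
# Minimal sketch of the "common language" problem: the same trade
# details appear under different field names in different standards.
# All sample data below is invented.

# A FIX message uses numeric tags: 55 = Symbol, 54 = Side (1 = Buy),
# 38 = OrderQty.
fix_message = {"55": "XYZ-CDS", "54": "1", "38": "5000000"}

# An FpML-style record uses descriptive XML element names (shown here
# as a dict; the names are simplified, not exact FpML schema elements).
fpml_record = {
    "referenceEntity": "XYZ-CDS",
    "buyerPartyReference": "BANK_A",  # protection buyer ~ FIX side "1"
    "notionalAmount": "5000000",
}

def to_common(record: dict, standard: str) -> dict:
    """Map a standard-specific record into one hypothetical common model."""
    if standard == "FIX":
        return {
            "instrument": record["55"],
            "direction": "BUY" if record["54"] == "1" else "SELL",
            "quantity": float(record["38"]),
        }
    if standard == "FpML":
        return {
            "instrument": record["referenceEntity"],
            "direction": "BUY",  # simplified: a buyer reference is present
            "quantity": float(record["notionalAmount"]),
        }
    raise ValueError(f"no mapping defined for standard: {standard}")

# Only after mapping do the two records become directly comparable.
assert to_common(fix_message, "FIX") == to_common(fpml_record, "FpML")
```

In practice each standard needs its own parser and mapping tables; without an agreed nomenclature, every regulator or firm consuming the data must build and maintain translations like this one.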

“Through our research, we seek to enhance and develop financial standards to help ensure more timely analysis of financial data, and understand and design systems solutions to improve decision-making for financial centers and solve data analysis problems,” says Dr. Yang.

In the article, Dr. Yang notes that the standards of the future may involve more than a traditional data model; they will also require “high caliber analytical tools that can be used to quickly analyze vast quantities of data.” At the same time, future financial data standards should be flexible enough to accommodate varied technological approaches.

The ultimate aim of the research effort is to develop a common model for understanding and evaluating financial data standards.

“Improved financial data standards and data analytics would go a long way in searching for the London Whale, or any other big rogue fish for that matter,” notes Dr. Yang.