Summit Unites Supporters of Modernized Financial Regulatory Data


Last week, the Data Coalition hosted a record-setting Financial Data Summit in Washington. We brought together over 300 supporters of transforming U.S. financial regulatory reporting from disconnected documents into standardized, open data.

From left to right: SIIA Policy Director David LeDuc, Workiva Vice President Mike Starr, and Data Coalition Executive Director Hudson Hollister.

The Coalition’s founder and executive director, Hudson Hollister, set the stage by reminding everyone that the campaign to standardize financial regulatory reports is a long-term one. For Hollister, it began eight years ago, when he worked to promote data standards from within the Securities and Exchange Commission. “[Today’s event] demonstrates a growing consensus in favor of that transformation,” he said. “We are seeing the apostles of [open] data spring up amongst financial regulators!”

Workiva vice president Mike Starr, a former SEC deputy chief accountant, asked the audience to “walk away with an appreciation that this transformation is not about tech. We have the tech. The real challenge is changing the cultures in the agencies.” Starr shared three components of monumental policy change: years of perseverance, the fostering of cross-sector partnerships, and real participation by all in the policy making process.

Starr’s three components were abundantly in evidence at the Financial Data Summit.

Through appearances by over 30 speakers, demonstrations by dozens of tech companies, and participation by over 20 federal agencies, four themes emerged.

  1. A Collective Action Problem

Transforming financial regulatory reporting from documents into standardized data promises huge potential gains – but also raises a collective action problem.

Allan Mendelowitz, president of the ACTUS Financial Research Foundation, and Richard Berner, director of the Treasury Department’s Office of Financial Research, opened the Summit with rousing speeches on the huge potential benefits of data standards in financial regulation.

The 2008 collapse of the U.S. housing market is still fresh in the minds of policymakers and a driving force behind policy decisions. Mendelowitz, who helped lead the Dodd-Frank reform effort, lamented that the collapse “was due to practices based on assumptions, not based on facts and analytics.” The one industry where technological advances have been “applied most unevenly is finance – specifically in [measuring] financial risk.”

Assumption-based risk assessment completely lacked insights into the interconnectedness of financial institutions. “Just collecting [financial information] is not sufficient for effective regulatory oversight,” said Mendelowitz. That information must be analyzed.

If financial information isn’t expressed as standardized data, it cannot be analyzed.

Allan Mendelowitz, president of the ACTUS Financial Research Foundation, delivers the plenary address.

Financial contracts are expressed almost entirely in words, not data. According to Mendelowitz, what’s needed is “a way to represent all financial contracts in a computable, mathematical form,” so that “regulators and risk managers can collect data that gives real insights into risks” when analyzed. Mendelowitz’s ACTUS project (see his presentation) attempts to do just that: interpret all financial obligations as standardized data.
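
To make that concrete, here is a deliberately toy sketch of one contract expressed as computable data rather than words. It is not the actual ACTUS schema – real ACTUS contract types carry far richer terms – and every field name below is illustrative:

```python
from dataclasses import dataclass


@dataclass
class LoanContract:
    """A toy, ACTUS-inspired representation of a fixed-rate bullet loan.

    Real ACTUS contract types define many more terms; every field here
    is illustrative only.
    """
    contract_id: str
    counterparty_lei: str    # who owes whom (see the LEI discussion below)
    notional: float          # principal, in the contract currency
    annual_rate: float       # fixed annual interest rate, e.g. 0.04
    start_year: int
    term_years: int

    def cash_flows(self):
        """Project (year, amount) pairs: annual interest payments,
        with the principal repaid in the final year."""
        flows = [(self.start_year + y, self.notional * self.annual_rate)
                 for y in range(1, self.term_years + 1)]
        last_year, last_interest = flows[-1]
        flows[-1] = (last_year, last_interest + self.notional)
        return flows


loan = LoanContract("C-001", "529900EXAMPLE0000000", 1_000_000, 0.04, 2017, 3)
for year, amount in loan.cash_flows():
    print(year, amount)   # 2018/2019: interest; 2020: interest + principal
# Because the terms are data, a regulator could run the same projection
# across every institution that reports contracts in a shared format.
```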

But unless they are adopted across the financial industry, ACTUS’ data structures won’t do any good. And the markets won’t adopt them on their own, without leadership from government. “Data standards are public goods. [But] … the public sector doesn’t produce them.”

The good news? If the government acts first to embrace standards like ACTUS’, the private sector will follow.

So is the proposed Financial Transparency Act – which would direct regulators to adopt data standards for the information they collect – the solution? Mendelowitz hinted as much, but stressed the importance of getting the legislation right, since policy change happens so infrequently.

Dick Berner, Director of the Office of Financial Research, delivers the executive keynote, moderated by Booz Allen Hamilton Principal Bryce Pippert.

Director Berner also drove home the need for public sector action to overcome the collective action problem. “[T]o overcome the hurdles of up-front costs, we as policymakers have to mandate their use or make the case for adoption so compelling that adoption makes sense.” Berner also shared how the application of standards enables “precise conversations,” a key component of any regulatory regime (see his presentation).

But do supporters of the Financial Transparency Act offer a sufficiently compelling case?

In his opening remarks, David LeDuc, policy director of the Software and Information Industry Association, observed that “some of the most valuable data in the whole ecosystem is not organized.” LeDuc offered: “the [Financial Transparency Act] is the obvious vehicle for [fixing] that on Capitol Hill.”

Tim Lind, global head of financial regulatory solutions for Thomson Reuters, added, “Policies and regulations are typically made by lawyers…[who] can deal with a certain level of ambiguity. Financial data scientists cannot.”

According to Lind, without predictable data standards for financial regulatory filings, analysts can’t conduct the level of analysis that allows them to be proactive. “Forensic analysis is not good enough.”

To protect against risk, financial regulatory filings need to be organized into open data that can be used for proactive – rather than forensic – analysis.

Only a mandate from government – the Financial Transparency Act – can do that.

  2. At the SEC: Incremental Progress

Most of the Securities and Exchange Commission’s filings are still expressed as documents, not as standardized data. But the agency is slowly improving the quality of the standardized data that it does collect.

Before Congress is able to act on a legislative solution, advocates for standardized data must support existing efforts within the SEC to improve the standardized data it is already collecting.

In 2009, the SEC adopted the XBRL open data format for corporate financial statements. But the agency never stopped collecting financial statements in the old-fashioned document form. Public companies must submit every financial statement twice: once as a document, and again as open data. (Beyond the financial statements, most of the SEC’s filings are still expressed solely in document form.)

Mike Willis, Assistant Director at the Office of Structured Disclosure, SEC, discusses structured data analytics at his agency.

With two versions of every financial statement available, the SEC has rarely enforced the quality of the open data version. Andres Gil, director of the U.S. Chamber of Commerce’s Center for Capital Markets Competitiveness, warned that poor data quality presents a real challenge to wider use of the SEC’s financial data.

But Mike Willis, assistant director of the Office of Structured Disclosure in the SEC’s Division of Economic and Risk Analysis, described a renewed commitment to improving data quality. Willis acknowledged the challenges presented by numerous data quality errors: “[I]f we are trying to put this data into analytical engines, this is the equivalent of sand in the gas.”

Willis hinted that the SEC will soon move away from its duplicative documents-plus-data regime by adopting the inline XBRL format, which is both human-readable and machine-readable. He offered the Summit audience a peek at the agency’s new text analytics initiative, which illuminates the worst quality problems. And he rolled out his division’s new website, which offers more quality guidance to corporate filers.
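
For readers who haven’t seen inline XBRL, here is a minimal sketch, using only Python’s standard library, of how a tagged fact can be pulled out of the same page a human reads. The filing fragment is invented, and real iXBRL documents also carry contexts, units, and schema references that this sketch omits:

```python
import xml.etree.ElementTree as ET

IX = "{http://www.xbrl.org/2013/inlineXBRL}"

# An invented, radically simplified inline XBRL fragment: the XHTML is
# what a person reads; the ix:nonFraction element is what a machine reads.
FILING = b"""<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:ix="http://www.xbrl.org/2013/inlineXBRL">
  <body>
    <p>Revenues for the year were
      $<ix:nonFraction name="us-gaap:Revenues" contextRef="FY2016"
        unitRef="USD" scale="6" decimals="0">1,234</ix:nonFraction> million.
    </p>
  </body>
</html>"""

root = ET.fromstring(FILING)
for fact in root.iter(IX + "nonFraction"):
    raw = fact.text.replace(",", "")                 # strip human formatting
    value = float(raw) * 10 ** int(fact.get("scale", "0"))
    print(fact.get("name"), fact.get("contextRef"), value)
# -> us-gaap:Revenues FY2016 1234000000.0
```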

  3. Lots of LEI Love

Everybody – including the financial industry – loves the Legal Entity Identifier (LEI).

Identifiers Breakout (from left to right): Rich Robinson, Bloomberg; Daniel Meisner, Thomson Reuters; Brian Williams, Dun & Bradstreet; Roger Fahy, CUSIP Global Services; David Blaszkowsky, former State Street Bank chief data officer.

The Financial Data Summit featured universal acclaim for the LEI, an electronic identifier being adopted by dozens of regulators around the world to identify their regulated entities. (The Office of Financial Research is the lead U.S. agency in a global committee that supports its adoption.)
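
A quick note on mechanics for the curious: an LEI is a 20-character alphanumeric code defined by ISO 17442, and its last two characters are ISO 7064 MOD 97-10 check digits – the same checksum scheme IBANs use. A minimal Python sketch of that arithmetic (the 18-character base code below is made up):

```python
def _iso7064_value(s: str) -> int:
    """Concatenate each character's ISO 7064 value (0-9 -> 0-9,
    A-Z -> 10-35) and read the result as one big integer."""
    return int("".join(str(int(c, 36)) for c in s))


def lei_check_digits(base: str) -> str:
    """Compute the two check digits for an 18-character LEI base code."""
    assert len(base) == 18
    return f"{98 - _iso7064_value(base.upper() + '00') % 97:02d}"


def is_valid_lei(lei: str) -> bool:
    """A well-formed LEI is 20 alphanumeric characters whose
    ISO 7064 MOD 97-10 checksum equals 1."""
    return (len(lei) == 20 and lei.isalnum()
            and _iso7064_value(lei.upper()) % 97 == 1)


base = "529900ABCDEFGHIJKL"            # made-up 18-character base code
lei = base + lei_check_digits(base)    # append the computed check digits
print(lei, is_valid_lei(lei))          # -> True
```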

The LEI allows automatic aggregation of everything a particular company files with different regulators – if the regulators have adopted it.

Joining via video conference, Lewis Alexander, chief U.S. economist at Nomura Securities, stated bluntly that the LEI “has the potential to create order out of chaos” and urged agencies to “take advantage” by making the LEI mandatory for entities they regulate.

The Office of Financial Research’s Berner gave an LEI endorsement during his keynote, describing it as “the best [identification] tool” for making government data useful for analysis. Berner even hinted that he believes the U.S. government should use the LEI outside financial regulation – to identify federal grantees and contractors, for instance.

Berner pointed out that his Treasury Department is “not in the business of proliferating standards for the sake of standards…our interest derives from need…[and] the process we used [to help create] the LEI is a good process since it aligns the interests of both [the financial] industry and policy makers.”

And what of the LEI’s benefits to risk analysis in this post-2008 economic world?

Srinivas Bangarbale, chief data officer at the Commodity Futures Trading Commission, said the “LEI has been the leading identifier [for derivatives market participants], and it’s the one we support.” And in his remarks, Allan Mendelowitz noted that wider adoption of the LEI would allow more accurate aggregation of counterparty risks and exposures.

The LEI isn’t just a regulator tool. It also aids data management in the banking sector.

Citigroup managing director Ari Marcus relayed how his bank mapped the LEI to its internal customer and counterparty codes. By resolving internal ambiguities, the LEI has allowed traders in Citigroup’s back office to communicate with their colleagues in the front office. Citigroup’s use of the LEI has allowed the bank “[to] identify who we are doing business with [and] allows adoption of standardized [internal] processes since we are working with standard information.”
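
Here is a toy illustration of the kind of mapping Marcus described – the internal codes, LEIs, and trade amounts are all invented:

```python
from collections import defaultdict

# Hypothetical internal codes: different desks recorded the same legal
# entity under different aliases before the LEI mapping was built.
internal_to_lei = {
    "FX-CUST-0042":  "529900ABCDEFGHIJKL21",
    "BOND-CPTY-917": "529900ABCDEFGHIJKL21",  # same entity, different alias
    "FX-CUST-0077":  "391200MNOPQRSTUVWX45",
}

trades = [
    ("FX-CUST-0042", 1_500_000),
    ("BOND-CPTY-917", 2_250_000),
    ("FX-CUST-0077", 800_000),
]

# Resolving every alias to one LEI lets front and back office total
# their exposure per legal entity rather than per internal code.
exposure = defaultdict(float)
for code, notional in trades:
    exposure[internal_to_lei[code]] += notional

for lei, total in sorted(exposure.items()):
    print(lei, total)   # two entities, not three internal codes
```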

With wide acclaim from regulators and the financial industry, it’s surprising that most U.S. agencies have made the LEI voluntary, not mandatory (see table). This failure to adopt the LEI for all their reporting requirements is further evidence of a collective action problem.

The Financial Transparency Act would provide a strong, explicit mandate for every financial agency to adopt the LEI for the entities it regulates.

  4. Modernization Needs a Legislative Mandate

Agencies are trying to align their data formats, but Congress needs to clarify the mandate to do so.

By the end of the day, Summit participants were heartened to see a display of cohesion amongst the closing panel, which featured chief data officers who all endorsed the notion of common data standards.

The CFTC’s Bangarbale confirmed that regulators are trying to coordinate standards across their different reporting regimes. “[A]pplications of systems come and go, data remains the same,” he concluded.

Data Management Panel (from left to right): Srinivas Bangarbale, CFTC; Gwendolyn Mitchell, Federal Reserve; Justin Stekervetz, Office of Financial Research, Treasury Department; Linda Powell, CFPB.

Federal Reserve metadata manager Gwendolyn Mitchell spoke of her work to build out a comprehensive data catalogue for the Fed’s voluminous, complex collections of banks’ information. A comprehensive data catalogue would make these resources internally searchable and computable – and also open up the Fed filings that are public. Such a catalogue doesn’t exist yet, but the Fed’s data office is trying.
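
As a rough sketch of what such a catalogue entry might hold – the form names and fields below are invented, not the Fed’s actual collections:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One entry in a hypothetical data catalogue: just enough metadata
    to make a collection searchable and its fields computable."""
    collection: str                       # e.g. a reporting form identifier
    description: str
    frequency: str                        # how often it is collected
    fields: list = field(default_factory=list)  # (name, definition) pairs
    public: bool = False

catalog = [
    CatalogEntry("FORM-A", "Quarterly balance-sheet report", "quarterly",
                 [("total_assets", "Total consolidated assets, USD")], True),
    CatalogEntry("FORM-B", "Daily liquidity positions", "daily",
                 [("hqla", "High-quality liquid assets, USD")]),
]

def search(entries, term):
    """Naive full-text search over descriptions and field definitions."""
    term = term.lower()
    return [e for e in entries
            if term in e.description.lower()
            or any(term in definition.lower() for _, definition in e.fields)]

for entry in search(catalog, "assets"):
    print(entry.collection, "public" if entry.public else "internal")
```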


Consumer Financial Protection Bureau chief data officer Linda Powell offered that “[what] helps [protect] consumers the most…involves data.” But mining the text of consumer-filed complaints presents challenges. Only standardized “definitions enable clear communication.” Powell’s CFPB has developed a new loan identifier to monitor every mortgage throughout its life.
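
A persistent identifier makes that kind of monitoring straightforward; here is a toy sketch in which the loan IDs, events, and parties are all invented (the CFPB’s actual identifier design may differ):

```python
from collections import defaultdict

# Hypothetical event log keyed by a persistent loan identifier: even as
# a mortgage is sold or its servicer changes, the identifier stays
# fixed, so the loan's whole history can be reassembled.
events = [
    ("LOAN-000001", "2015-03-01", "originated", "Lender A"),
    ("LOAN-000001", "2016-06-15", "servicing_transferred", "Servicer B"),
    ("LOAN-000002", "2015-09-20", "originated", "Lender C"),
    ("LOAN-000001", "2017-01-10", "modified", "Servicer B"),
]

history = defaultdict(list)
for loan_id, when, event, party in events:
    history[loan_id].append((when, event, party))

# Replay one mortgage's life, in date order, across every servicer.
for when, event, party in sorted(history["LOAN-000001"]):
    print(when, event, party)
```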

But the laws governing financial regulators presume that they collect information as documents, not as open data.

A legislative mandate would provide data leaders like Bangarbale, Mitchell, and Powell with the wherewithal to achieve modernization faster. The Financial Transparency Act will provide that mandate.

The Coalition’s Hollister closed the Summit with three predictions:

  • This spring, the SEC seems poised to move away from its current duplicative documents-plus-data reporting requirement for corporate financial statements and embrace a single combined open data format – iXBRL – that is both human- and machine-readable.
  • This year, support for open data in financial regulation will grow in Congress, and the Financial Transparency Act will become law in the early days of the next administration.
  • Within five years we will see a comprehensive data structure that unifies all the information collected by U.S. financial regulators.

The Data Coalition will work tirelessly to make these predictions real.

This video explains how the Financial Transparency Act (H.R. 2477) will transform financial regulation – and bring powerful new tools for the investors, companies, and researchers who use the valuable data collected by financial agencies!