RegTech22 Data Summit: Leaders Discuss Efforts to Increase Engagement with the Federal Government

March 30, 2022 9:00 AM | Anonymous member (Administrator)

After a nearly two-year pause on in-person convenings, the Data Coalition Initiative hosted the AI Public Forum x RegTech22 Data Summit, sponsored by Donnelley Financial Solutions (DFIN), as a hybrid event, with participants both in Washington, D.C. and online. The event aimed to foster a strong national data community and to advance responsible policies that make government data high-quality, accessible, and usable – a central goal of the Data Coalition. 

The day began with a public forum on “Accelerating AI in the Public Sector,” where over 20 representatives from industry, academia, non-profits, state and local governments, and the general public shared perspectives and recommendations for how to best use AI in public sector regulation. The afternoon program, RegTech22 Data Summit, featured discussions focused on the intersection of regulatory technology (RegTech) and improving customer experience (CX). 

The RegTech22 Data Summit was designed with two goals in mind:

  1. Highlight areas where regulatory technologies have improved processes and reduced the burden of data and information collection in regulatory processes; and 
  2. Showcase how innovations in data management and information usability have improved customer experiences. 

The Summit featured a wide range of experts from agencies and organizations, representing different “customers” with varying missions and needs. Innovative solutions – from data visualizations and collaborative databases to standardized data and entity identifiers – demonstrated that there are successes to celebrate and clear avenues where agencies can refine their efforts.

The Summit program kicked off with a panel featuring Tammy Roust, Chief Data Officer (CDO) of the Commodity Futures Trading Commission (CFTC). Roust illustrated the breadth of the commission’s customers by noting that we cannot buy a box of cereal without interacting with the work of the CFTC. Ultimately, she noted, “we serve [American taxpayers] by ensuring that the futures and derivatives markets are fair and transparent and have integrity.” Responding to the White House executive order to prioritize CX, the CFTC is building data visualization tools and open data portals under the guidance of the CDO office, making publicly available data more accessible while maintaining the privacy protections specific to the commission’s data feeds. 

Considering different “customers,” co-panelist Adam Scott, Director of Design and Development at the Consumer Financial Protection Bureau (CFPB), worked with the bureau’s Chief Technologist, Erie Meyer, to identify what information might be most valuable to the general public visiting the CFPB website. To respond quickly, Adam and Erie used design sprints: they looked at broad problems, worked with internal experts to understand the problem space, hypothesized solutions, designed prototypes, and tested with real users in a short timeframe. The new approach allows the CFPB to respond rapidly and, as Adam summarized, “mak[e] sure that we’re constantly evolving and updating our site to meet the most critical needs of our users.” Highlighting how data access is also important for effective CX, Adam discussed how his team maintains the CFPB public data inventory. After identifying a distinct pain point – users had no means to discover all of the CFPB’s public data sets – the CFPB built a simple webpage to do just that. 

In the Summit’s second panel on improving climate data, CDO of the Federal Energy Regulatory Commission (FERC) Kirsten Dalboe emphasized the role of agency CDOs as the ones responsible for overseeing proper data governance and use throughout the data lifecycle. “Effective data governance and a managed data lifecycle designates trusted data sources. It maintains a comprehensive data inventory. It defines data standards, determines policies and strategies for promoting enterprise data management activities, promotes data stewardship to facilitate data sharing, collaboration, and data quality,” explained Dalboe. Proper data governance allows agencies and organizations to know what they have and how it can be used to carry out their missions. 

Correctly identifying data needs, uses, and gaps is fundamental to building and sustaining the proper use of data, whether an agency or organization interfaces with the general public, regulators, academics, international organizations, investors, or utilities. Panelist Kristy Howell, Senior Economist at the International Monetary Fund (IMF), discussed the G20 Data Gaps Initiative, an IMF project started after the global financial crisis to identify data gaps and determine how the IMF can work with the G20 economies to improve data. According to Howell, the data gaps project is expanding to include reviews of climate impact and macroeconomic data. Additionally, the IMF developed the Climate Change Indicators Dashboard, a truly collaborative database created to address the growing need for more information on climate and to assess how climate affects the macroeconomy and how the economy affects the environment. As Howell and SAP Vice President and Head of U.S. Government Relations Kevin Richards stated multiple times – “you can’t manage what you can’t measure.” 

In a theme seen throughout the Summit, Howell continued, “data access and data sharing are important pillars” to collaborative problem solving efforts. Similar to remarks made by Dalboe, a common understanding of data elements and vocabulary – data standards – can help drive innovations and will support data-informed decision making. “Standards facilitate automation and streamlining of processes, which results in lower costs and more effective analysis,” said Mike Willis, Chairperson of the Regulatory Oversight Council (ROC) and Associate Director in Division of Economic and Risk Analysis at the Securities and Exchange Commission (SEC). 

Highlighting one area for improvement in data standards, Willis noted that federal agencies use over 50 legal entity identifiers, each unique to its agency. According to Willis, standardized identifiers – such as the Legal Entity Identifier (LEI) for legal entities, derivative identifiers, and the Financial Instrument Global Identifier (FIGI) for financial instruments – could connect information across regulatory agencies and international borders, simplifying compliance processes, bolstering Know-Your-Customer (KYC) requirements, and even enhancing Zero Trust Architecture, which relies on digital identities. In the case of Zero Trust Architecture – which federal agencies are required to adopt following an executive order on cybersecurity issued in 2021 – the LEI offers an internationally standardized, unambiguous identifier for legal entities issuing digital credentials, validating the digital credential and expanding traceability. Both regulators and regulated entities would benefit from standardized data and standardized entity identifiers through reduced costs and compliance burdens.
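Willis’s point about standardized identifiers can be made concrete. The LEI, defined in ISO 17442, is a 20-character alphanumeric code whose last two characters are check digits computed with the ISO 7064 MOD 97-10 scheme (the same scheme used for IBANs), which is what makes the identifier machine-verifiable across systems. A minimal sketch of that validation follows; the example prefix in the usage note is made up for illustration, not a real registered LEI.

```python
def _as_number(s: str) -> int:
    # Map each character to its base-36 value (0-9 -> 0-9, A-Z -> 10-35)
    # and concatenate the decimal digits, per ISO 7064 MOD 97-10.
    return int("".join(str(int(ch, 36)) for ch in s))

def lei_check_digits(prefix: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    if len(prefix) != 18 or not prefix.isalnum():
        raise ValueError("LEI prefix must be 18 alphanumeric characters")
    return f"{98 - _as_number(prefix.upper() + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """Validate a 20-character LEI against its MOD 97-10 check digits."""
    return (
        len(lei) == 20
        and lei.isalnum()
        and _as_number(lei.upper()) % 97 == 1
    )
```

For example, `lei_check_digits("5493001KJTIIGC8Y1R")` (a hypothetical prefix) yields the two digits that make the full 20-character string pass `is_valid_lei`; changing any single character breaks the check, which is how downstream systems catch transcription errors.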

Expounding on how government is using technology to improve CX, Hugh Halpern, Director of the Government Publishing Office (GPO), began his keynote address by reminding the audience that GPO is responsible for producing and providing print and publishing services to all three branches of government, manufacturing everything from the Federal Register, Congressional Record, and congressional bills to the U.S. passport. Focusing on congressional documents and the GPO’s design evolution over the past 200 years, Director Halpern explained how the technology morphed from handset type to digital text editing, and while the design and accessibility of the documents adapted with available innovations, they were not always user friendly. The introduction of the United States Legislative Markup (USLM) XML schema laid the digital foundation for exponential improvements to legislative documents. USLM feeds XML Publishing (XPub) and allows legislators and staff to more seamlessly track changes, compare document versions, and reuse the document for additional content development, “just like a normal text file!” While there are still areas for improvement, solutions like USLM have the potential to smooth out the issues with unstructured data and text files that gum up communication across the legislative branch. Just like the work of the GPO, the future use cases for such technologies are not limited to one branch of government. 
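To illustrate why a structured schema like USLM makes legislative text machine-readable, the sketch below parses a heavily simplified USLM-style fragment using only Python’s standard library and pulls out each section’s number and heading – the kind of extraction that enables version comparison and content reuse. The fragment is illustrative only; the real USLM schema (namespace `http://xml.house.gov/schemas/uslm/1.0`) is far richer.

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative fragment loosely modeled on USLM markup.
USLM_SAMPLE = """\
<bill xmlns="http://xml.house.gov/schemas/uslm/1.0">
  <section>
    <num value="1">SEC. 1.</num>
    <heading>Short title</heading>
    <content>This Act may be cited as the "Example Act".</content>
  </section>
</bill>
"""

NS = {"uslm": "http://xml.house.gov/schemas/uslm/1.0"}

def section_headings(xml_text):
    """Return (number, heading) pairs for every section in the document."""
    root = ET.fromstring(xml_text)
    return [
        (sec.findtext("uslm:num", namespaces=NS).strip(),
         sec.findtext("uslm:heading", namespaces=NS).strip())
        for sec in root.iterfind(".//uslm:section", NS)
    ]
```

Because every section, number, and heading is an explicit element rather than a pattern to be guessed at in flat text, tooling can diff, index, and repurpose the document reliably.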

The White House executive order on CX – Transforming Federal Customer Experience and Service to Rebuild Trust in Government – prioritizes how agencies engage with the American people, including lines of feedback. Speaking on the panel topic of “Improving Citizen Engagement and Transparency in Rulemaking Using RegTech Solutions,” Reeve Bull, Research Director at the Administrative Conference of the United States (ACUS); Virginia Huth, Deputy Assistant Commissioner, Federal Acquisition Service Office of Technology Transformation, General Services Administration (GSA); Kirsten Gullikson, Director, Systems Analysis and Quality Assurance at Office of the Clerk, U.S. House of Representatives; and Susan Dudley, Former Administrator of the Office of Information and Regulatory Affairs, White House Office of Management and Budget, each emphasized the importance of engaging with the general public with an effort to support transparency and to expand trust. 

The panelists explained how technology has the potential to help, but in the rulemaking process it must be applied in the right context. Dudley and Bull looked at potential areas of improvement in the e-rulemaking process, including the application of agile methods that would incorporate constant feedback loops. While not yet widely used, this approach has the potential to increase public engagement and improve rollouts. To elaborate further, Huth offered a few examples of innovations she managed at GSA’s eRulemaking Program Management Office, referring to a proof of concept that combines machine-readable language, no-code software, and natural language processing. Bringing together this “trifecta,” Huth explained, allowed GSA to build a technological approach that “enables machine-readable tasks that give us the ability to provide context” – something that is particularly useful when helping regulators sift through public comments and review regulations, identifying out-of-date topics and contradictory regulations.  
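One baseline technique for the comment-sifting problem Huth describes is flagging near-duplicate public comments (for example, mass-mail campaign submissions) by word-set overlap. The toy sketch below, using only the standard library, is an illustration of the general idea under assumed inputs – not GSA’s actual pipeline, which the panel described as combining machine-readable language, no-code software, and natural language processing.

```python
import re

def _words(text: str) -> frozenset:
    """Lowercase word set for a comment (crude tokenization)."""
    return frozenset(re.findall(r"[a-z']+", text.lower()))

def jaccard(a: frozenset, b: frozenset) -> float:
    """Jaccard similarity: |A & B| / |A | B|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def near_duplicates(comments, threshold=0.8):
    """Return index pairs of comments whose word sets overlap heavily."""
    sets = [_words(c) for c in comments]
    return [
        (i, j)
        for i in range(len(sets))
        for j in range(i + 1, len(sets))
        if jaccard(sets[i], sets[j]) >= threshold
    ]
```

Clustering form-letter comments this way lets reviewers spend their time on the substantive, unique submissions instead of reading the same text thousands of times.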

Changes in government processes, data governance, and the strategic use of technologies will take time. As CFTC CDO Tammy Roust remarked, the fact that government does not match the private sector’s pace of change is not necessarily a bad thing. Fellow panelist Darren Wray, Co-founder of Guardum, a DFIN company, expanded on the point by noting that if something breaks in government transformation, the consequences can be catastrophic. Nonetheless, public-private collaborations and conversations like the ones hosted during the RegTech22 Data Summit provide a space for the exchange of ideas and insight into how available management approaches and technologies can help government interact more effectively with the public.  

The data, information, programs, and regulatory technologies discussed during the RegTech22 Data Summit are but a few examples of the efforts in motion to increase transparency, accountability, and public trust in government. No single approach will guarantee that public trust in government improves, but the examples discussed throughout the day shed light on efforts across government and the private sector to improve data availability, usability, readability, and accessibility.  

The Data Coalition was founded 10 years ago with the mission to improve government transparency through the passage of the Digital Accountability and Transparency (DATA) Act, which became law in 2014. Since then, the Data Coalition Initiative and allies have advocated for data laws that change the way government tracks, manages, uses, and shares data. Reliable, high-quality, and accessible data feed the technologies discussed during the RegTech22 Data Summit. Learning how to work with the data produced, ingested, and circulated by the government remains an ongoing challenge, and the Data Coalition Initiative will continue to advocate for responsible policies to make government data high-quality, accessible, and usable.
