This week, Representatives Derek Kilmer (D-WA), William Timmons (R-SC), and Dean Phillips (D-MN) re-introduced a resolution (H.Con.Res.49) to establish a Commission on Evidence-Based Policymaking that would convene experts to review, analyze, and make recommendations to Congress on how to better incorporate federal data and evidence-based policymaking in the legislative process.
This builds on the previous, successful bipartisan U.S. Commission on Evidence-Based Policymaking from 2016, which focused on data and evidence in the executive branch. That Commission resulted in unanimous, bipartisan recommendations that informed new laws, such as the Foundations for Evidence-Based Policymaking Act and the establishment of the National Secure Data Service demonstration project.
This resolution establishes a Congressional Commission on Evidence-Based Policymaking to review, analyze, and make recommendations to Congress to promote the use of Federal data for evidence-building and evidence-based policymaking. Overall, a commission can be an effective tool for improving data use in Congress by leveraging outside expertise, promoting cost effectiveness, and generating evidence-based policy recommendations that can lead to positive outcomes for the government and the public alike.
One of the main challenges in using data in Congress is obtaining access to relevant data. Often, data are not easily accessible or are siloed within various government agencies, making it difficult for lawmakers and their staff to access and use. Another challenge is ensuring the quality of the data being used. Data may be incomplete, inaccurate, or outdated, which can lead to incorrect conclusions or decisions.
Finally, using data effectively in Congress requires adequate funding and resources, including staffing and technology. Without these resources, it may be difficult for lawmakers and their staff to effectively access and utilize data to inform their decision-making.
A commission can play a critical role in helping Congress improve its use of data for evidence-building. By convening a group of outside experts and stakeholders, a commission can provide a neutral, independent forum for examining current data practices, identifying areas for improvement, and developing evidence-based policy recommendations.
A Wide Range of Expertise
Commissions can be composed of individuals with diverse backgrounds and perspectives, including data scientists, academics, industry leaders, and other stakeholders with expertise in data management, analysis, and utilization. This can help ensure that the commission's recommendations are grounded in the latest research and best practices, and are tailored to the unique needs and challenges of the congressional context.
Compared to other approaches to improving data use, such as hiring additional staff or building new data infrastructure, a commission can be a relatively low-cost and efficient way to leverage outside expertise and generate new ideas. Additionally, by providing a neutral, independent forum for exploring data issues, a commission can help identify ways to streamline existing data practices and reduce duplication and inefficiency.
Focus on the Legislative Branch
The Ryan-Murray Commission on Evidence-Based Policymaking focused on providing evidence to the Executive Branch to inform its policymaking. The recommendations from that Commission resulted in the passage of the Foundations for Evidence-Based Policymaking Act, which in turn helped create the National Secure Data Service demonstration project. A new commission focused on the legislative branch can generate similar insights tailored to the specific needs of congressional policymakers.
Improving Trust in Congress
By expanding its capacity to make better informed decisions, Congress can allocate resources more effectively, improve the delivery of government services, and improve the efficiency of its own operations. Improving Congress’ capacity for using data to inform decisions will increase accountability and promote transparency – as such, this resolution can help build trust between Congress and the public, and foster a culture of evidence-based decision-making.
By Katie Howell
Artificial Intelligence has garnered much attention in recent months, including across the federal government. Executive orders and guidance over the past two Administrations signal bipartisan interest in developing and adopting AI technology at the federal level to ensure competitive advantage as well as mitigate certain issues with AI ethics and accountability.
The Data Foundation released “Eager for AI: Data Needs to Ensure Artificial Intelligence Readiness in the Federal Government,” a white paper developed in collaboration with the Data Coalition's Artificial Intelligence (AI) Task Force. Published in April, the paper is intended to serve as a resource for policymakers interested in AI oversight or funding needs, agencies looking to begin developing and implementing AI plans, or industry groups looking for ways to support the government.
High-quality, reliable, and accessible data feed AI technologies. As agencies work to address the mounting pressure to realize the benefits of AI, they must first have a substantial amount of high-quality data, governed by effective principles, before deploying AI technologies and tools. Many federal agencies are taking steps to leverage AI technologies to improve internal processes and program operations; however, agencies’ strategies vary widely in maturity and in alignment with their data strategies. “Eager for AI” consolidates existing federal agency strategies centered on strong data management, governance, and technical capacity, and recommends opportunities for agencies and interagency bodies to support data needs for AI development and adoption.
The paper acknowledges key challenges that exist and identifies three key components of safe, effective adoption of AI: high-quality data, effective governance principles, and technical capacity. Numerous agencies are highlighted for their progress, with the aim to encourage collaboration and information sharing among agencies. However, agencies may lack transparency around how data are managed and used for AI implementation and the progress they are making in their own AI strategic plans. More transparency around certain data needs can similarly spark such collaboration, bringing in more perspectives to help solve challenges and help move agencies forward as they integrate AI into operations beyond pilot phases.
To facilitate collaboration and develop an approach that prioritizes data needs in federal agencies so that AI can be effective, equitable, and ethical, the Data Foundation presents various recommendations for agencies and cross-agency bodies, such as the Chief Data Officers Council. Among the recommendations, the paper suggests agencies should evaluate “AI readiness” by identifying specific steps, responsible oversight bodies, and performance metrics in AI plans; maintain an inventory of implementation progress on a single, publicly accessible website; and continue implementation of the Foundations for Evidence-Based Policymaking Act. The Chief Data Officers Council can help coordinate agencies’ AI planning and implementation efforts by publishing guidance on data standards and management practices that facilitate agencies’ ability to adopt AI technologies and developing a government-wide AI workforce training program, among other things.
Use of AI has the potential to reduce costs, ease compliance burdens, and improve the effectiveness of operations and service delivery. Agencies need not wait until their data strategies are fully implemented before considering what else is needed to prepare for larger-scale AI adoption, but data management and governance must be prioritized. By highlighting agencies that are taking encouraging steps to ensure data readiness, “Eager for AI” aims to bring data to the forefront so the federal government can leverage the exciting potential of these emerging technologies.
Read the Report
Today, Representative Derek Kilmer introduced a resolution to establish a congressional Commission on Evidence-Based Policymaking. Representative Kilmer, who chairs the House Select Committee on the Modernization of Congress, with support from ranking member William Timmons and fellow Committee members David Joyce and Dean Phillips, is proposing the Commission to “review, analyze, and make recommendations to Congress in an effort to better incorporate federal data and evidence-based policymaking throughout the legislative process.”
The resolution proposes a congressional Commission on Evidence-Based Policymaking modeled after the previous, successful bipartisan U.S. Commission on Evidence-Based Policymaking, which focused on data and evidence in the executive branch. This bipartisan resolution is just one of the many positive changes that the Select Committee has advanced since it was formed in January 2019. The Data Foundation’s President Nick Hart testified at a hearing last October, providing recommendations on how to strengthen the lawmaking process by using data and evidence.
Read the full testimony.
The executive-branch focused Commission studied the challenges in our country’s data infrastructure as well as potential research and evaluation capabilities, issuing a unanimous set of findings and recommendations for improving agencies’ operations in 2017. As a result of that Commission, Congress passed the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act), which created Chief Data Officers and the presumption of accessibility for federal data sets, among other innovative data provisions. It also paved the way for the National Secure Data Service, which was recently authorized in the CHIPS and Science Act of 2022.
The new effort will focus on building Congress’s ability to incorporate data and evidence into the legislative process, including exploring the Data Foundation’s recommendations to create a congressional Chief Data Officer, improve data expertise in Congress, and explore methods for supporting and implementing data-centered legislation such as the Evidence Act.
This resolution is an important step towards improving Congressional capacity to use data and evidence in the legislative process. Additionally, this thoughtful, deliberate approach to increasing evidence use can help establish greater trust between the American people and their government.
The Data Coalition Initiative looks forward to working with Congress on this important effort.
This week, the Data Coalition and other transparency groups sent a letter to the Hill outlining the benefits of the adoption of financial data standards, as required by the Financial Data Transparency Act (S. 4295).
This legislation, and its bipartisan companion bill, the Financial Transparency Act (H.R. 2989), which passed the House of Representatives earlier this Congress with 400 votes, will enable policymakers and the American public to have access to reliable information about financial markets, building on recent data reforms advanced in Congress like the Digital Accountability and Transparency Act (P.L. 113-101) and the Foundations for Evidence-Based Policymaking Act (P.L. 115-435).
The FDTA requires seven of the financial regulatory member-agencies of the U.S. Financial Stability Oversight Council to adopt and apply uniform data standards, including a non-proprietary legal entity identifier, for the regulatory information they collect. This will make regulatory information searchable and machine-readable. By adopting the standards in this bill, reporting processes can be streamlined for businesses, reducing long-term regulatory and compliance costs. Over time, regulated entities will benefit from more efficient regulatory compliance that stems from better information processing, software-enabled filing preparation, and data reconciliation.
The required data standards will also reduce errors in information reported to financial regulators. Improved data accuracy means regulatory oversight will be more efficient, increasing the number of entities regulators can meaningfully review in efforts to prevent fraud, rather than focusing on data quality issues. This will provide financial regulatory agencies more complete, real-time insights about regulated entities’ exposures and relationships to one another. Such information will prove invaluable during times of stress and market uncertainty, allowing regulators to more efficiently contain that volatility, preventing spill-over market events that may be detrimental to the American or global economy.
The benefits of applying data standards to financial regulatory information are clear. Open data standards can accelerate policy innovation while improving transparency and accountability for federal agencies and the public. We urge inclusion of the FDTA in the NDAA so taxpayers no longer have to wait for better financial data.
Anti-Corruption Data Collective (ACDC)
Center for Open Data Enterprise
Institute for Policy Studies - Inequality Program
Open The Government
Project on Government Oversight (POGO)
Results for America
The Digital Democracy Project
Transparency International U.S.
Truth in Accounting, Daniels College of Business, University of Denver
Across the government, policymakers are realizing what financial markets have known for decades: data are vital. Yet many essential government functions rely on outdated, paper-based processes—including financial regulation. Laws like the Digital Accountability and Transparency Act of 2014, the Foundations for Evidence-Based Policymaking Act of 2018, and the Grant Reporting Efficiency and Agreements Transparency Act of 2019 demonstrate recent moves by policymakers to address these existing, outdated processes. What all these bills have in common is that they ask that the information the government already collects be standardized. Data standardization means better methods of reporting to regulators and capital markets. It also enables the use of automation to improve analytics, reduce inefficiency, and enable better transparency.
The Financial Data Transparency Act (FDTA) (S. 4295) continues the movement toward standardized data by asking financial regulators to adopt industry-accepted consensus data standards, transforming the information they collect into an accessible, machine-readable format. In fact, the FDTA was drafted in response to recommendations provided by the U.S. Treasury Department in a 2017 report on reducing regulatory overlap and duplication for banks and credit unions. The report, issued in response to an Executive Order on principles for regulating the financial system (E.O. 13772), calls for improved data sharing and reductions in reporting burdens and duplication.
The Federal Deposit Insurance Corporation (FDIC) was the first federal financial regulator to implement a large-scale modernization project, that is, to move from documents and tables of numbers to machine-readable semantic data.
Research on the FDIC's project affirms statistically significant increases in efficiency and improvements in data quality. The project is more than a decade old, evidence that mature technology exists to meet the disclosure modernization requirements of the FDTA.
The findings speak for themselves: mathematical errors used to plague 33% of the data points in the FDIC Bank Call Reports. Now, they are nearly entirely eliminated before submission. Specifically:
Statistical correctness of the data moved from 66% to 100%.
Data are available to capital markets within hours rather than days (or weeks when corrections were necessary).
Analysts completed their workload in 15% less time and thus were able to increase the number of cases they could close.
What is important to note about the FDIC’s approach to data modernization is that they achieved all of these gains without changing the data they were collecting; rather, they mapped a way for the data they were already collecting to be machine-readable. The result is that they reduced the need to ask for corrections—companies don't have to be burdened with making and resubmitting corrections, and investors don't have to wait to get access to trustworthy data.
The FDIC is just one of many financial regulators around the world that are modernizing their reporting regimes by moving away from document-based reporting and instead adopting structured data formats and standardized data fields for the information they already collect.
The Municipal Securities Rulemaking Board (MSRB) is also working to modernize its data systems, with an explicit reference in its new quadrennial Strategic Plan to “support market-led initiatives to establish uniform data … standards in the municipal securities market that facilitate better disclosure and analysis of market information.” The FDTA helps MSRB establish and advance those standards.
The FDTA asks that additional federal regulators similarly modernize and make the information they already collect, or wish to collect in the future, available as machine-readable data using non-proprietary or royalty-free methods for encoding data. By passing this bill, Congress can ensure that this important modernization project will be a priority for regulators.
Importantly, the FDTA does not ask regulators to change any of the information they collect or publish, nor does it alter any statutory authority of the regulators to decide what information they collect. Rather, this legislation asks that appropriate, flexible data standards be developed based on the already-existing regulatory and reporting standards.
Data modernization is a common sense solution to many challenges for our financial regulators, but it is not something that can be done overnight. That is why the FDTA provides flexibility to agencies as they implement the provisions of the bill. The U.S. Treasury and the other agencies have two years from the date of enactment to develop and publish the required data standards through a joint rule. Covered agencies then have two years to implement the data standards into their respective regulatory compliance reporting, giving them a total of four years.
Further, to reduce the burden on smaller regulated entities, the FDTA allows agencies to scale data reporting requirements and minimize disruptive changes for those affected by regulations. What’s more, the one-time upfront costs are likely to be offset in the longer term by advances enabled by new technologies.
By law, regulated entities must produce data for regulators, but they can also benefit from using the public regulatory data. Adopting standardized data can reduce the cost of both activities by transforming an opaque, hard-to-understand system into a transparent, level playing field. In 2018, the Association of International Certified Professional Accountants found that roughly 70% of companies spent $5,500 or less to comply with a Securities and Exchange Commission (SEC) rule requiring public companies to report their financials in the XBRL (eXtensible Business Reporting Language) data standard format. This standardization makes market-critical data available in machine-readable form. With these data standards, stakeholders like market lenders, borrowers, and small investors can start using off-the-shelf technology solutions, meaning that small entities can compete with larger firms for data insights.
As the data reported into the regulators becomes standardized, small entities without the resources for massive data analytics shops can focus on how to use the data for maximum benefit rather than ingesting data sets on ad hoc basis, allowing them to participate more strategically in the market.
The Financial Data Transparency Act establishes a framework that can be used to improve regulatory reporting efficiency in coming years and decades, reducing compliance overhead, and the level of effort required for submitting financial reports. It also sets the stage for financial regulators to have access to higher quality data so regulators can spend their time focused on enforcement rather than tracking down inadvertent errors in reports. Streamlining regulatory reporting frees up valuable time and energy that can support private sector innovation and productivity growth. This is a win for capital markets, for reporting entities, and for data transparency.
We are proud to announce the recipients of the Data Coalition Initiative’s 2022 Datum Awards.
This year's recipients receive a Datum Award for their roles as innovators, leaders, and champions in the use of data for evidence-based decision-making, transparency, and accountability, to better our government and society.
Nominees are individuals who champion the use of data in evidence-based decision-making, transparency, and accountability. Each nominee demonstrates a commitment to a more efficient and effective government, using data to drive innovation and improve outcomes that better the lives of the American people, strengthen the economy, or make the country a better place.
With cybercrime continuing to harm vulnerable Americans, U.S. Senator Brian Schatz authored and ushered the Better Cybercrime Metrics Act of 2022 to passage. The bipartisan bill will give law enforcement a clearer picture of online crimes by improving data collection and requiring the FBI to integrate cybercrime incidents into its current reporting streams. Through Senator Schatz’s leadership, law enforcement and policymakers will have wider access to the data and tools they need to keep Americans safe.
Chief Data Officer, U.S. Equal Employment Opportunity Commission
Samuel C. “Chris” Haffer, Ph.D. became the U.S. Equal Employment Opportunity Commission’s first Chief Data Officer in 2017 and immediately set to work modernizing the agency’s data practices. Among the many projects Dr. Haffer has led in the past year, the EEOC has launched EEOC Explore, an accessible data query and mapping tool, run nationwide surveys of employers and workers during the pandemic, and created the District Demographic Dashboard to help identify and better serve vulnerable workers.
Associate Director, Division of Economic and Risk Analysis, U.S. Securities and Exchange Commission
Mike Willis is a longtime advocate for evidence-based decision-making, transparency, and accountability in the financial sector. As a leader in the U.S. Securities and Exchange Commission’s Division of Economic and Risk Analysis, Mr. Willis is leading efforts to require machine-readable financial disclosures, and oversaw the addition of downloadable, freely available data sets about securities holdings of large institutional investors to the SEC’s website.
Deputy Chief Veterans Experience Officer, U.S. Department of Veterans Affairs
Barbara Morton has overseen drastic changes in how U.S. veterans are served in her time with the U.S. Department of Veterans Affairs. She has built durable customer experience (CX) capabilities and capacities through her work. Thanks to her exemplary achievements, she is now responsible for sharing best practices in CX across federal agencies. Ms. Morton contributes to the Customer Experience Cookbook, which serves as a practitioner’s guide to building and sustaining customer experience capabilities in government.
Chief Evaluation Officer, U.S. Department of Education
As Chief Evaluation Officer of the U.S. Department of Education, Dr. Soldner applies extensive knowledge of mixed-methods research to education evaluation, conducting rigorous tests to help states and local governments improve student outcomes. Dr. Soldner and his team have contributed important research on data sharing and the impacts of federally funded learning programs, and supported the What Works Clearinghouse, a resource for educators, evaluators, and researchers everywhere.
Senior Statistician, White House Office of Management and Budget
Before becoming Senior Statistician at the White House Office of Management and Budget, Shelly Wilkie Martinez served as Executive Director of the U.S. Commission on Evidence-Based Policymaking, supported the Federal Advisory Committee on Data for Evidence Building, and much more. With a deep knowledge of the federal data ecosystem, Ms. Martinez has worked tirelessly in her role with OMB to fulfill the vision of the Evidence Commission and support compliance with the Evidence Act.
Thank you to our sponsors
The American public relies on good information to decide what products to buy at the grocery store, whether to purchase or rent a home, and which characteristics of a car meet individual needs. Investment decisions on Wall Street are no different – reliable and clear information is needed in the marketplace to make good financial decisions. Government is no exception: reliable, high-quality financial information allows decision makers to develop and execute sound policies that benefit the economy.
Today, the ability for companies, government, and the American people to access data about regulated firms across the country is limited by an arcane regulatory reporting infrastructure. The Financial Data Transparency Act (FDTA) co-sponsored by Senators Mark Warner (D-VA), Mike Crapo (R-ID), and Chuck Grassley (R-IA) takes on this problem by improving the reporting infrastructure for regulated firms to improve accountability. This bill has clear benefits for the American public, investors, government regulators, and private sector firms.
Why do we need to modernize our financial data?
This legislation responds to recommendations provided by the U.S. Treasury Department in a 2017 report regarding reduction of regulatory overlap and duplication for banks and credit unions. The report, issued in response to an Executive Order on principles for regulating the financial system (E.O. 13772), calls for improved data sharing and reductions in reporting burdens and duplication.
The Financial Data Transparency Act addresses longstanding data deficiencies in regulatory reporting. The bill would require seven of the financial regulatory member-agencies of the U.S. Financial Stability Oversight Council to adopt and apply uniform data standards (i.e., a common data format) for the information collected from regulated entities. As a consequence, the data standards will enable better information processing, software-enabled filing preparation, and data reconciliation. These features collectively are the basis for retail investors, regulators, and the market having better information for selecting investment opportunities and understanding risks.
The Financial Data Transparency Act establishes a framework that can be used to improve regulatory reporting efficiency in coming years, reducing compliance overhead and the level of effort required for submitting financial reports. It also sets the stage for financial regulators to have access to higher quality data so they can spend their time focused on enforcement rather than tracking down inadvertent errors in reports. Streamlining regulatory reporting frees up valuable time and energy that can also support private sector innovation and productivity growth.
The FDTA reiterates the requirement for disclosable public data assets to be made available as open Government Data assets, per the Foundations for Evidence-Based Policymaking Act. This assures the data assets published under the regulatory authorities of the FDTA’s covered agencies are presented in a manner consistent with existing government-wide data policy.
Open data standards allow for better information processing, software-enabled filing preparation, and data reconciliation. The FDTA includes a set of required characteristics that builds upon industry and technology best practices, accounts for lessons learned from existing federal regulatory standard setting, and incorporates relevant federal policy and international standards definitions.
The data will be made available under an open license, which will reduce barriers for industry, academia, and others to incorporate or reuse the data standards and information definitions in their systems and processes. This requirement will also facilitate competition among multiple vendors for creation, data collection, and analysis tools, which reduces long-term costs.
Does requiring open data standards mean there will be universal reporting requirements?
No. The FDTA does not impose new data collection or reporting requirements, and does not require agencies or regulated entities to collect or make publicly available additional information.
Will FDTA’s requirements mean that regulated entities need to adopt a specific software or other technology?
No. Open data standards can actually help facilitate technological innovation by reducing barriers for industry, academia and others to create new tools, such as artificial intelligence and machine learning applications. FDTA does not impose any technological mandates and regulated entities would continue to independently choose their software and technological solutions.
Will the FDTA requirements be difficult for regulated entities to adopt?
The FDTA provides agencies with the flexibility to scale data reporting requirements to reduce burdens on smaller regulated entities and minimize disruptive changes for those affected by regulations. The data standards required by the FDTA leverage existing, industry-accepted data formats and definitional standards. The standards connect with existing accounting standards, allowing regulated entities to leverage the expertise and processes established by the accounting, audit, legal, and regulatory compliance workforce without imposing undue new burdens or costs.
The Financial Transparency Act (H.R. 2989), cosponsored by Reps. Carolyn Maloney (D-NY) and Patrick McHenry (R-NC), is substantively very similar to the FDTA with common requirements, scopes, goals, and timelines. The bills differ in the process used to implement rules, with the FDTA establishing a joint rulemaking process for improved coordination across federal financial regulators. This coordinated process reduces the burden on regulated entities participating in regulatory processes for establishing standards and enables greater coordination – at the expense of the government – for engaging with stakeholders and responding to concerns that may be raised in future standard-setting activities.
The House-proposed FTA passed the U.S. House of Representatives with a strong bipartisan vote of 400-19 on October 25, 2021. It passed the House a second time in 2022 as part of the National Defense Authorization Act.
Representing members from the data analytics, technology, and management fields, the Data Coalition endorsed the current version of the Financial Data Transparency Act in the 117th Congress. The Data Coalition also previously endorsed the related Financial Transparency Act in the 114th, 115th, 116th, and 117th Congresses. These bipartisan pieces of legislation address the significant underlying data challenges that contribute to burdensome and ineffective financial regulation.
Other transparency organizations and financial firms quickly endorsed versions of the legislation:
Donnelley Financial Solutions (DFIN)
Government Information Watch
Object Management Group (OMG)
Last edited September 12, 2022
The White House Equitable Data Working Group (EDWG) released A Vision for Equitable Data in April 2022. The report outlines the recommendations of the Working Group established by Executive Order 13985 in January 2021. The Working Group was tasked with identifying inadequacies and areas of improvement within federal data related to equity, and outlining a strategy for increasing data available for measuring equity and representing the diversity of the American people and their experiences.
The report is an important step toward ensuring equity is at the forefront of government policies and programs. Three priority uses for equitable data were presented in the report:
Generating disaggregated statistical estimates to characterize the experiences of historically underserved groups,
Increasing access to disaggregated data for evidence-building, and
Conducting equity assessments of federal programs.
The Data Coalition supports these priorities, along with the report findings and recommendations – particularly the emphasis on the collection and disaggregation of demographic data as well as the suggestion that agencies work with federal statistical agencies to incorporate and protect demographic data. In July 2022, the Data Coalition Initiative met with staff from the House Oversight and Government Reform Committee (HOGR) at their request to discuss the EDWG report, gaps in the report, and potential areas for Congressional and Administrative support.
There are opportunities to further strengthen an environment that will support a more equitable data system. Data Coalition members used the time with the HOGR Committee to offer insight into certain aspects of improving data quality and designing an equitable and inclusive process, from designing data collection plans to ensuring data are useful for end users across communities. Data Coalition members’ comments fell into four general categories: fostering trust by inclusive engagement, developing and using data standards, increasing accessibility, and bolstering accountability.
Opportunities to Strengthen the EDWG Report
Beyond engaging all levels of government and the research community as highlighted in the report, it is crucial to have meaningful stakeholder engagement, especially with the communities affected by data collection. Meaningful engagement fosters trust and prevents data collection and use from causing unanticipated harm. This trust is essential to ensuring that the data collected from individuals and communities are high quality and relevant to the policies that aim to help those same communities.
Data standards are also necessary to ensure a more aligned, equitable federal data ecosystem. Developing consensus standards needs to be done in collaboration with communities of practice, data contributors, and potentially impacted communities. Though agencies may have the authority to identify preferred standards, they often lack the authority to require their adoption. Adopting consensus standards across agencies can provide more clarity for those collecting and using government data. Similarly, consensus data standards can facilitate data sharing among agencies, reducing burden on both government and taxpayers.
The report discusses the need to increase accessibility by making data more understandable and useful. To address this, the government can bolster human capital to build data capacity within agencies, and it can invest in the capacity of data contributors to use data themselves. Additionally, agencies and other stakeholders should demonstrate the value of data in addressing the needs of underserved communities.
Providing data access tools and developing usable online data portals are ways to tackle accessibility while also enhancing transparency and accountability. The EDWG approaches accountability and transparency from a taxpayer lens, but in terms of equitable data use and impact there should also be a requirement to tell the story of the data's impact: emphasizing how a community benefits from the use of its data, in turn, incentivizes continued data contribution.
Legislative & Administrative Recommendations
With this in mind, the Data Coalition offers the following recommendations for how to further the EDWG’s efforts:
Leverage existing provisions from the Foundations for Evidence-Based Policymaking Act of 2018 to improve collection, management, and use of data
Pass the National Secure Data Service Act (H.R. 3133)
Fund federal, state, and local government to adopt and modernize data systems
Develop legislation to adopt consensus-based data standards, including requiring collection of data that informs agency equity assessments using a uniform standard that can be adopted at all levels of government
Build additional funding flexibilities into the grantmaking process to enable agencies to direct grants toward building capacity without needing additional funds
The EDWG report includes critical steps to make data more equitable, and the HOGR Committee's interest in gathering stakeholder input to bolster the report's recommendations is encouraging. In addition to pursuing these Congressional and Administrative opportunities, continuing to engage data and equity experts as well as the communities providing and using the data, and leveraging existing authorities and expertise within the government, will all contribute to a more equitable data system.
President Biden signed the bipartisan Courthouse Ethics and Transparency Act into law (P.L. 117-125) in May 2022. The new law requires that all judicial financial disclosures be made available in a "full-text searchable, sortable, and downloadable format for access by the public." The law aims to make court financial data open and accessible to the public, enhancing transparency and accountability in the Judicial branch of the federal government.
Federal judges are required to recuse themselves from cases where there may be a financial conflict of interest, including if family members have a financial interest in a case. However, the current oversight process to ensure judges are taking the necessary steps to be impartial is lengthy and complicated.
Currently, federal judges are required to publicly report securities transactions – such as the purchase or sale of stocks, bonds, commodities futures, and other securities – but these reports can be destroyed after only six years and are difficult for the public to access. Even if an interested party requested and collected judicial disclosure reports, they may be of little use if the data are not searchable.
The Courthouse Ethics and Transparency Act, first introduced in the 117th Congress in October 2021 by Senator John Cornyn (R-TX), lays out two main provisions to address the current oversight system. First, federal judicial officials must file a report for securities transactions over $1,000 within 45 days, in line with requirements for top officials in the Executive and Legislative branches of government, including the President, Members of Congress, and President- and Senate-appointed officials. Second, the Administrative Office of the U.S. Courts must make these disclosures publicly available in a searchable internet database no more than 90 days after the report is filed.
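The two statutory windows stack: in the slowest permissible case, a transaction reported on day 45 would appear in the public database 90 days after that. A minimal sketch of the arithmetic (treating both windows as calendar days; this is an illustration, not a legal reading of the statute):

```python
from datetime import date, timedelta

# The Act's two windows (assumed to be calendar days for this sketch):
# a covered transaction must be reported within 45 days, and the
# disclosure must be posted publicly within 90 days of filing.
REPORT_WINDOW = timedelta(days=45)
PUBLISH_WINDOW = timedelta(days=90)

def disclosure_deadlines(transaction_date: date) -> tuple[date, date]:
    """Return (latest filing date, latest public-posting date)."""
    file_by = transaction_date + REPORT_WINDOW
    publish_by = file_by + PUBLISH_WINDOW  # worst case: filed on day 45
    return file_by, publish_by

# Example: a hypothetical June 1, 2022 transaction.
file_by, publish_by = disclosure_deadlines(date(2022, 6, 1))
```

Under those assumptions, a June 1 transaction must be filed by July 16 and posted publicly no later than October 14 of the same year.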
The establishment and maintenance of a searchable, sortable, and downloadable database is key to removing unnecessary limitations on the accessibility of statutorily mandated financial disclosures, yielding a collection of data that is useful and timely. By making these data more easily accessible and available for oversight, the Courthouse Ethics and Transparency Act can help strengthen public confidence in the integrity of the federal judicial system and improve overall trust in the government as a whole.
The Data Coalition Initiative advocates for policies that ensure government data are high-quality, accessible, and usable, and we applaud all those who have been working to advance the Courthouse Ethics and Transparency Act. This is another important step toward all branches of the federal government utilizing data in a way that ensures they operate in a transparent and effective manner.
After a nearly two-year pause on in-person convenings, the Data Coalition Initiative hosted the AI Public Forum x RegTech22 Data Summit, sponsored by Donnelley Financial Solutions (DFIN), as a hybrid event with participants both in Washington, D.C. and online. The event aimed to build a strong national data community and to advocate for responsible policies that make government data high-quality, accessible, and usable – a central goal of the Data Coalition.
The day began with a public forum on “Accelerating AI in the Public Sector,” where over 20 representatives from industry, academia, non-profits, state and local governments, and the general public shared perspectives and recommendations for how to best use AI in public sector regulation. The afternoon program, RegTech22 Data Summit, featured discussions focused on the intersection of regulatory technology (RegTech) and improving customer experience (CX).
The RegTech22 Data Summit was designed with two goals in mind: to celebrate existing solutions and to identify avenues where agencies can refine their efforts.
The Summit featured a wide range of experts from agencies and organizations, each representing different “customers” with varying missions and needs. Innovative solutions, ranging from data visualizations and collaborative databases to standardized data and entity identifiers, demonstrated that there are both solutions to celebrate and clear avenues where agencies can refine their efforts.
The Summit program kicked off with a panel featuring Tammy Roust, Chief Data Officer (CDO) of the Commodity Futures Trading Commission (CFTC), who illustrated the breadth of the commission’s customers by noting that we cannot buy a box of cereal without interacting with the work of the CFTC. Ultimately, she noted, “we serve [American taxpayers] by ensuring that the futures and derivatives markets are fair and transparent and have integrity.” Responding to the White House executive order to prioritize CX, the CFTC is building data visualization tools and open data portals under the guidance of the CDO office, making publicly available data more accessible while maintaining the privacy protections specific to the commission’s data feeds.
Considering different “customers,” co-panelist Adam Scott, Director of Design and Development at the Consumer Financial Protection Bureau (CFPB), worked with the bureau’s Chief Technologist, Erie Meyer, to identify what information might be most valuable for the general public visiting the CFPB website. To shorten response times, Adam and Erie ran design sprints: examining broad problems, working with internal experts to understand the problem space, hypothesizing solutions, designing prototypes, and testing with real users in a short timeframe. The new approach allows the CFPB to respond rapidly and, as Adam summarized, “mak[e] sure that we’re constantly evolving and updating our site to meet the most critical needs of our users.” Highlighting how data access is also important for effective CX, Adam discussed how his team maintains the CFPB public data inventory: after identifying a distinct pain point, that users had no means to discover all of the CFPB’s public data sets, CFPB built a simple webpage that lists them.
In the Summit’s second panel on improving climate data, CDO of the Federal Energy Regulatory Commission (FERC) Kirsten Dalboe emphasized the role of agency CDOs as the ones responsible for overseeing proper data governance and use throughout the data lifecycle. “Effective data governance and a managed data lifecycle designates trusted data sources. It maintains a comprehensive data inventory. It defines data standards, determines policies and strategies for promoting enterprise data management activities, promotes data stewardship to facilitate data sharing, collaboration, and data quality,” explained Dalboe. Proper data governance allows agencies and organizations to know what they have and how it can be used to carry out their missions.
Being able to correctly identify data needs, uses, and gaps is fundamental to building and sustaining the proper use of data, whether an agency or organization interfaces with the general public, regulators, academics, international organizations, investors, or utilities. Panelist Kristy Howell, Senior Economist at the International Monetary Fund (IMF), discussed the G20 Data Gaps Initiative, an IMF project that started after the global financial crisis with the goal of identifying data gaps and determining how the IMF can work with the G20 economies to improve data. According to Howell, the data gaps project is expanding to include reviews of climate impact and macroeconomic data. Additionally, the IMF developed the Climate Change Indicators Dashboard, a truly collaborative database created to address the growing need for more information on climate and to assess how climate impacts the macro economy and how the economy impacts the environment. As Howell and SAP Vice President and Head of U.S. Government Relations Kevin Richards stated multiple times – “you can’t manage what you can’t measure.”
In a theme seen throughout the Summit, Howell continued, “data access and data sharing are important pillars” to collaborative problem solving efforts. Similar to remarks made by Dalboe, a common understanding of data elements and vocabulary – data standards – can help drive innovations and will support data-informed decision making. “Standards facilitate automation and streamlining of processes, which results in lower costs and more effective analysis,” said Mike Willis, Chairperson of the Regulatory Oversight Council (ROC) and Associate Director in Division of Economic and Risk Analysis at the Securities and Exchange Commission (SEC).
Highlighting one area for improvement in data standards, Willis noted that federal agencies use more than 50 entity identifiers, each unique to an agency. According to Willis, standardized identifiers, such as the Legal Entity Identifier (LEI), derivatives identifiers, and the Financial Instrument Global Identifier (FIGI) used for financial instruments, could connect information across regulatory agencies and international borders, simplifying compliance processes, bolstering Know-Your-Customer (KYC) requirements, and even enhancing Zero Trust Architecture, which relies on digital identities. In the case of Zero Trust Architecture, which federal agencies are required to adopt under the 2021 executive order on cybersecurity, the LEI offers an internationally standardized, unambiguous identifier for legal entities issuing digital credentials, validating the digital credential and expanding traceability. Both regulators and regulated entities would benefit from standardized data and standardized entity identifiers through reduced costs and compliance burdens.
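As an illustration of how a consensus identifier enables automated validation: the LEI, defined by ISO 17442, is a 20-character alphanumeric code whose last two characters are check digits verified with the ISO 7064 MOD 97-10 algorithm (letters map to the values 10 through 35, and the full numeric string modulo 97 must equal 1). A minimal Python sketch, using a commonly published sample LEI:

```python
import string

# An LEI may contain only uppercase letters and digits (ISO 17442).
ALLOWED = set(string.ascii_uppercase + string.digits)

def is_valid_lei(lei: str) -> bool:
    """Validate an LEI's length, alphabet, and MOD 97-10 check digits."""
    if len(lei) != 20 or any(c not in ALLOWED for c in lei):
        return False
    # Map each character to its base-36 value (A=10 .. Z=35),
    # concatenate, and apply the ISO 7064 MOD 97-10 check.
    numeric = "".join(str(int(c, 36)) for c in lei)
    return int(numeric) % 97 == 1

# Sample LEI published in GLEIF documentation.
print(is_valid_lei("5493001KJTIIGC8Y1R12"))  # True
print(is_valid_lei("5493001KJTIIGC8Y1R11"))  # False: check digits broken
```

Because the check is purely mechanical, any regulator (or regulated entity) can reject a malformed identifier at the point of intake, before it propagates through compliance systems.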
Expounding on how government is using technology to improve CX, Hugh Halpern, Director of the Government Publishing Office (GPO), began his keynote address by reminding the audience that GPO is responsible for producing and providing print and publishing services to all three branches of government, manufacturing everything from the Federal Register, the Congressional Record, and congressional bills to the U.S. passport. Focusing on congressional documents and GPO’s design evolution over the past 200 years, Director Halpern explained how the technology morphed from handset type to digital text editing; while the design and accessibility of the documents adapted with available innovations, they were not always user friendly. The introduction of the USLM XML schema laid the digital foundation for exponential improvements to legislative documents. USLM feeds XML Publishing (XPub) and allows legislators and staff to more seamlessly track changes, compare document versions, and reuse a document for additional content development, “just like a normal text file!” While there are still areas for improvement, solutions like USLM have the potential to smooth out issues related to unstructured data and text files that confuse and gum up communication lines across the legislative branch. Just like the work of the GPO, the future use cases for such technologies are not limited to one branch of government.
The White House executive order on CX – Transforming Federal Customer Experience and Service Delivery to Rebuild Trust in Government – prioritizes how agencies engage with the American people, including lines of feedback. Speaking on the panel topic of “Improving Citizen Engagement and Transparency in Rulemaking Using RegTech Solutions,” Reeve Bull, Research Director at the Administrative Conference of the United States (ACUS); Virginia Huth, Deputy Assistant Commissioner, Federal Acquisition Service Office of Technology Transformation, General Services Administration (GSA); Kirsten Gullikson, Director, Systems Analysis and Quality Assurance at the Office of the Clerk, U.S. House of Representatives; and Susan Dudley, Former Administrator of the Office of Information and Regulatory Affairs, White House Office of Management and Budget, each emphasized the importance of engaging the general public in an effort to support transparency and build trust.
The panelists explained how technology has the potential to help with this, but in regard to the rulemaking process, it must be applied in the right context. Dudley and Bull looked at potential areas of improvement in the e-rulemaking process, including the application of agile methods that would incorporate constant feedback loops for improvement. While not yet widely used, this approach has the potential to increase public engagement and improve rollouts. To elaborate further, Huth offered a few examples of innovations she managed at GSA’s eRulemaking Program Management Office, referring to a proof of concept that combines machine-readable language, no-code software, and natural language processing. Bringing together the “trifecta,” Huth explained, allowed GSA to build a technological approach that “enables machine-readable tasks that give us the ability to provide context” – something that is particularly useful when helping regulators sift through public comments and review regulations, identifying out-of-date topics and contradictory regulations.
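As a toy illustration of the kind of comment triage such tooling enables, the sketch below flags near-duplicate public comments with a simple character-level similarity score. This is a stand-in for the natural language processing used in GSA's actual proof of concept, and the comment strings and threshold are invented for the example:

```python
from difflib import SequenceMatcher

def flag_near_duplicates(comments: list[str],
                         threshold: float = 0.9) -> list[tuple[int, int]]:
    """Return index pairs of comments whose similarity exceeds threshold.

    Uses difflib's character-level ratio; real comment-triage systems
    would use NLP techniques (embeddings, topic models) instead.
    """
    pairs = []
    for i in range(len(comments)):
        for j in range(i + 1, len(comments)):
            ratio = SequenceMatcher(
                None, comments[i].lower(), comments[j].lower()
            ).ratio()
            if ratio >= threshold:
                pairs.append((i, j))
    return pairs

# Invented example comments: the first two are a near-duplicate pair,
# as in a mass-mail campaign; the third is unrelated.
comments = [
    "I oppose this rule because it burdens small businesses.",
    "I oppose this rule because it burdens small business.",
    "The proposed standard will improve data quality.",
]
dupes = flag_near_duplicates(comments)
```

Grouping near-identical submissions first lets reviewers spend their attention on the substantively distinct comments.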
Changes in government processes, data governance, and the strategic use of technologies will take time. As CFTC CDO Tammy Roust remarked, the fact that government does not match private sector pace of change is not necessarily a bad thing. Fellow panelist Darren Wray, Co-founder of Guardum, a DFIN company, expanded on the point by noting that if something breaks in government transformation, the consequences can be catastrophic. Nonetheless, private-public collaborations and conversations like the ones hosted during the RegTech22 Data Summit provide a space for the exchange of ideas and perhaps insight into how available management approaches and technologies can aid the efforts made by government to more effectively interact with the public.
The data, information, programs, and regulatory technologies discussed during the RegTech22 Data Summit are but a few examples of how efforts to increase transparency, accountability, and public trust in government are in motion. There is no one approach that will guarantee that public trust in government will improve, but the examples discussed throughout the day shed light on efforts across government and the private sector to improve data availability, usability, readability, and accessibility.
The Data Coalition was founded 10 years ago with the mission to improve government transparency through the passage of the Digital Accountability and Transparency (DATA) Act, which became law in 2014. Since then, the Data Coalition Initiative and allies have advocated for data laws to change the way government tracks, manages, uses, and shares data. Reliable, quality, and accessible data feed the technologies discussed during the RegTech22 Data Summit. Learning how to work with the data produced, ingested, and circulated by the government remains an ongoing challenge and the Data Coalition Initiative will continue to advocate for responsible policies to make government data high-quality, accessible, and usable.