  • January 21, 2022

    The Data Coalition’s Environmental, Social, and Governance (ESG) Data Working Group released its final report this week, sharing recommendations to help make ESG data ready for effective use in a regulatory environment. The group calls for ESG data to be standardized, machine-readable, and auditable. If the U.S. Federal government were to implement the working group’s recommendations, it would be positioned to take the lead on developing standards and expectations before international standards are in place. 

    ESG data in an entity’s financial reports are used by stakeholders to assess risk and long-term value on metrics affecting governance, people, profits, and the planet, and they increasingly influence international investment decisions. Currently, ESG reporting is largely voluntary, and reporting requirements are determined by investment firms and the market, creating an environment of confusing and non-comparable ESG disclosure. According to the working group, this means that “understanding the information provided across firms, sectors, regions, or countries is not only challenging, but largely unreliable and uncertain.”  

    As calls for ESG reporting grow in the U.S. and internationally, the Data Coalition recognized an opportunity to support useful ESG reporting. The Data Coalition ESG Working Group convened in 2021 and included representatives from member companies DFIN, Workiva, Summit LLC, the Global LEI Foundation, SAP, XBRL US, and Data Foundation staff. With the goal of producing guidance to inform data-related legislation, financial regulations, and private sector practices, the group came together throughout the year to develop an understanding of the current landscape of ESG reporting and determine the most critical improvements needed. In June 2021, the group provided public feedback on a climate disclosure inquiry from the Securities and Exchange Commission, and in December 2021 it released its final report, which makes three recommendations: 

    • First, the U.S. should contribute to consensus-based international domain standards to the extent feasible. Existing independent standard-setting organizations that take into account national, regional, sector, and market perspectives provide a platform for producing standards that are usable and consistent with global activities. 
    • Second, all ESG data disclosures should be digital and machine-readable so that they facilitate interoperability. Machine-readable data allows more efficient and cost-effective growth across the industry. 
    • Lastly, ESG data should be auditable to support greater accountability and transparency. A glossary of terms and data sources, together with processes similar to the disclosure controls used in financial reporting, can increase the confidence of investors and the public that ESG data is trustworthy and of high quality.
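    To make these three properties concrete, here is a minimal sketch of what a standardized, machine-readable ESG data point might look like. The concept name, field names, and LEI value are illustrative assumptions, not drawn from any official taxonomy such as XBRL:

```python
import json

# Illustrative (hypothetical) standardized ESG data point: tagged with a
# concept name, a unit, a reporting period, and the reporting entity's
# Legal Entity Identifier (LEI). All values here are made up.
disclosure = {
    "entity_lei": "5493001KJTIIGC8Y1R12",      # hypothetical LEI
    "concept": "Scope1GreenhouseGasEmissions",  # standardized concept name
    "value": 125000,
    "unit": "tCO2e",
    "period": "2021",
    "source": "audited",                        # supports auditability
}

# "Machine-readable" means another program can parse the disclosure without
# human interpretation; a lossless serialization round-trip is a minimal
# check of that property.
serialized = json.dumps(disclosure)
assert json.loads(serialized) == disclosure
```

    Because every field is explicitly tagged, a regulator or investor could compare the same concept across thousands of filers automatically, which is not possible with narrative PDF disclosures.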

    Market pressure for ESG disclosure is increasing, and the U.S. should take advantage of this momentum. The Data Coalition supports the working group’s recommendation that, by cultivating a data ecosystem of standardized, machine-readable, and auditable data, U.S. regulators can facilitate a more trustworthy and influential ESG reporting ecosystem. 

    Download the full recommendations 

  • December 02, 2021

    Katharine G. Abraham and Ron Haskins, the former Co-Chairs of the U.S. Commission on Evidence-Based Policymaking, wrote to the U.S. Senate Committee on Commerce, Science, and Transportation and the U.S. House of Representatives Committee on Science, Space, and Technology to strongly encourage the respective committees to include the National Secure Data Service (NSDS) Act in the conferenced version of the United States Innovation and Competition Act (USICA).

    We are the former Co-Chairs of the U.S. Commission on Evidence-Based Policymaking, appointed respectively by then-President Barack Obama and then-House Speaker Paul Ryan. The Commission members included fifteen politically-appointed data and privacy experts. Our report, released in September 2017, contained a set of findings and recommendations endorsed by all of the Commission’s members for more effectively leveraging existing government data assets while also enhancing privacy protections. [1]

    In 2018, Congress acted to implement about half of the Commission’s recommendations by passing the Foundations for Evidence-Based Policymaking Act (P.L. 115-435). The experience since that time has made clear the benefits for government transparency and accountability of implementing the Evidence Act’s provisions. Thanks to newly appointed Chief Data Officers and the increased emphasis on program evaluation as a core function of government, among other changes, the government has made significant progress toward more effectively using the data it holds. 

    While the Evidence Act put many of our recommendations into law, other important recommendations remain for the Congress to address. One that we believe is a critical priority is the establishment of a National Secure Data Service (NSDS). Earlier this year, the House of Representatives approved the bipartisan NSDS Act as part of the National Science Foundation reauthorization. As the former leaders of the Evidence Commission, we endorse the NSDS Act as a means to address one of our most important recommendations, establishing a secure data linkage resource that can facilitate addressing broad policy and research questions while protecting privacy and confidentiality. We strongly encourage Congress to include the NSDS Act in the conferenced version of the United States Innovation and Competition Act (USICA).

    The NSDS Act was based on the Evidence Commission’s 2017 report, research from the National Academy of Sciences, and subsequent research carried out by a Commission member and senior staff member [2]. Many of the original Commissioners also provided direct feedback on and agreed with that subsequent proposal. In May 2021, NSF’s Assistant Director for Social, Behavioral, and Economic Sciences said in a public statement that NSF is already beginning efforts to establish the infrastructure to operate the data service [3]. More recently, the Federal Advisory Committee on Data for Evidence Building established by the Evidence Act released its interim recommendations calling for the creation of a data service and presented an implementation framework that aligns with the NSDS Act.

    The establishment of an NSDS would substantially contribute to strengthening the analytical capabilities of the Federal Statistical System, provide a resource and expanded capacity for conducting program evaluation, and, at a practical level, create an infrastructure for more rapid responses to congressional policy inquiries about outcomes and programmatic impacts. This can be achieved within the strong privacy framework laid out in the Confidential Information Protection and Statistical Efficiency Act (CIPSEA) reauthorized by the Congress in 2018, consistent with what is proposed in the NSDS Act. 

    In our view, the National Secure Data Service is a necessary part of modernizing our country’s data infrastructure. We encourage the Congress to include this bipartisan proposal in the conference agreement for the House NSF reauthorization bill and the Senate USICA. We are happy to speak with you or your staff on any questions about the Evidence Commission’s recommendations or the NSDS Act. 

    Download the Letter Here

    1. U.S. Commission on Evidence-Based Policymaking. (2017). The Promise of Evidence-Based Policymaking: Report of the Commission on Evidence-Based Policymaking. Washington, D.C.: GPO.
    2. Hart, N. and N. Potok. (2020). Modernizing U.S. Data Infrastructure: Implementing a National Secure Data Service to Improve Statistics and Evidence Building. Washington, D.C.: Data Foundation.
    3. Lupia, A. (2021). Letter to the Data Foundation RE National Secure Data Service. National Science Foundation.

  • November 16, 2021

    Earlier this year, the Chief Data Officers Council requested public feedback on how it can continue to improve the government’s management, use, protection, dissemination, and generation of data in decision-making and operations.

    In response, the Data Coalition hosted a virtual public forum to create an opportunity for the data community to offer feedback, recommendations, and advice to the federal CDO Council. As a result of that forum, as well as research informed by the Data Foundation's CDO Insights Survey, we offered the following 12 recommendations to the CDO Council.  Our full comments are available here.

    • Recommendation 1: CDOs should work with their agency CFO and OMB to increase CDO funding flexibilities and direct resources. Most CDOs do not have adequate resources to fulfill their statutory responsibilities and support agency missions. CDOs need sustained, predictable, and adequate resources to implement data priorities. Congress should authorize CDOs to use additional funding flexibilities and set-aside authorities, as well as provide increased direct appropriations for CDOs to succeed. This longer-term resourcing plan aligns with the congressional intent in establishing the CDO role through the Evidence Act, which created the position indefinitely rather than for a short-term period. 
    • Recommendation 2: CDOs should work with OMB to clarify responsibilities and expectations. While CDOs are operating with their peer community of practice and under the general framework of the Evidence Act and the Federal Data Strategy, additional guidance from OMB can help align emerging priorities from the administration with the activities implemented by CDOs. In addition, CDOs will benefit from clearer expectations on reporting requirements, including how to address required due dates and what should be reported to OMB, Congress, and the American people. Additional guidance could also include more tactical direction about what steps CDOs should take, how to prioritize those steps, and areas for interagency cooperation and collaboration. 
    • Recommendation 3: Congress should remove the statutory sunset of the CDO Council. Currently the CDO Council is scheduled by law to sunset in 2025. It has proven itself to be a valuable coordinating body and community of practice for CDOs, providing vital technical assistance and a forum to convene and share knowledge. Since most CDOs and their offices are relatively new to the role and its responsibilities, additional support from the CDO Council and peers in the form of technical assistance, resources for strategic planning, and other planning processes can support the entire CDO community, including CDOs operating with limited staff and capacity. With the expectation that CDO roles continue indefinitely, the coordination of the CDO Council should as well. The CDO Council or OMB should include this request in an appropriate forum, such as the package of FY 2023 President’s Budget legislative proposals.

    • Recommendation 4: The CDO Council should work to create an ecosystem of data-literate and data-fluent workers. The need for more staff capacity was the top request articulated by CDOs in the Data Foundation’s CDO Insights Survey. CDOs did not just request FTEs, but rather to add specific highly-skilled data scientists, data architects, and data engineers required to successfully carry out data governance and management activities. One cross-agency effort that was viewed positively by participants was the January 2021 joint hiring initiative coordinated by the Office of Personnel Management. Ten agencies joined together to put out the call to hire 50 senior data scientists. In addition to getting high-level expertise into CDO offices, it is important that there are base levels of data literacy throughout the workforce in order to support a culture of data. CDOs should create a shared framework for data skills needed to support their agencies as well as definitions for various roles throughout their agencies and the types of skills required.
    • Recommendation 5: CDOs should emphasize their role as designated leaders to promote training and data fluency among staff of departments. Commitment from agency leadership to establish a strong data culture in agencies is critical for a coordinated training and retention strategy. This should include identifying current gaps in skills, capitalizing on existing training programs and models, and developing training programs when necessary. Specific limitations of privacy frameworks in which agencies are operating should also be addressed as part of training.
    • Recommendation 6: The CDO Council should work with OMB to ensure that forthcoming implementation guidance to agencies on data inventories prioritizes machine readability and interoperability. Implementing and updating the metadata necessary for data inventories across federal agencies can be an intensive process, representing a significant workload. In order to deploy automation technologies that reduce workload, as well as improve the quality of data inventories and of aggregating services, the metadata standards associated with these inventories should be machine-readable and interoperable. Machine-readable, interoperable metadata supports easier discovery and use of data, especially as the number of data sets within data inventories continues to grow. In addition to ease of discovery, machine readability allows for the automation of several processes that can help reduce the burden on custodians of data inventories.
    • Recommendation 7: CDOs should focus on data sharing standards that facilitate interoperability, data linkage, and privacy. Standardization and the creation of data standards are emphasized in the recently released Advisory Committee on Data for Evidence Building Year 1 report. The report provides a review of the state of data for evidence building in the federal government, particularly on opportunities for secure data sharing. There are a number of examples of entities that securely aggregate, integrate, and share information, and we strongly urge the CDO Council to align its work with the efforts undertaken by the Advisory Committee. Additionally, data sharing approaches must prioritize the application of robust confidentiality and privacy safeguards. Various tiers of access and pilot projects testing privacy-enhancing data linkage and multiparty computation have offered limited preliminary evidence of the promise of such approaches; however, further investment in privacy-enhancing technologies is needed in government before operating at scale. 
    • Recommendation 8: Publicly accessible data must be prioritized. Ensuring that government data are easily accessible and usable aligns with efforts to promote transparency and accountability in government. Accessibility also facilitates collaboration with researchers, the private sector, and other levels of government, which can lead to more efficiency and innovation in public service.
    • Recommendation 9: CDOs can improve communication about how they demonstrate the value of using data. Where possible, in coordination with the Evaluation Officer and the Statistical Official appointed under the Evidence Act, CDOs should take deliberate steps to provide metrics, summaries, and, when possible, evaluations that highlight the impact and cost savings of their efforts. To gain support within their organizations, CDOs need to strategically show agency leadership their accomplishments, particularly those that improve other staff’s ability to perform their roles. In doing so, CDOs can build a more compelling case for the resources needed to create and grow their staff and gain leadership buy-in within their agency. If a CDO can show programmatic savings in time and/or dollars resulting from their activities, they establish a base for justifying the use of existing resources and requesting greater resources in the future. Even small wins are vital to building support. CDOs also benefit by helping to manage organizational change, encouraging data literacy, and increasing the influence of evidence-informed decision-making.
    • Recommendation 10: CDOs should conduct regular maturity assessments to accurately gauge existing data capacity and needs. Maturity assessments like the one required by the Evidence Act should be a continuous process rather than simply for compliance. Understanding the day-to-day operational needs for data and data skills will allow CDOs to effectively direct resources and training to areas within their agencies that may be in most need of support. By measuring levels of data literacy, use of data, and other aspects at the project level, CDOs can ensure that they are facilitating the growth of a strong data culture within their agency.
    • Recommendation 11: The CDO Council should create a permanent data ethics working group to ensure the Data Ethics Framework continuously meets emerging needs, to provide resources and guidance to agencies, and to partner with relevant professional associations for ongoing education and training on data ethics. There is a need for clear, unified guidance from the CDO Council regarding ethics and equity standards for data. Existing frameworks, such as the Federal Data Strategy ethics framework, provide guidance for developing a single standard going forward, but the CDO Council should collaborate with ethics-focused organizations outside of government to encourage the application of best practices and continuous improvement of those practices.
    • Recommendation 12: The CDO Council should work with CIOs to facilitate the adoption of appropriate modernized technology. Data collection, management, and analysis present unique challenges and needs for technology, such as automation to optimize data collection and tools that can streamline data collection, analysis, and storage. When adopting new technology to help support data functions, we encourage the CDO Council to partner with CIOs and other relevant stakeholders to leverage existing technologies, where possible, in order to avoid “reinventing the wheel.”
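    To illustrate Recommendation 6, here is a short sketch of how machine-readable inventory metadata enables the kind of automation described above. The field names loosely follow the DCAT-US style used by federal data inventories, but the record and the required-field set are illustrative assumptions, not the official schema:

```python
# Hypothetical required fields, loosely modeled on DCAT-US-style metadata.
REQUIRED = {"title", "description", "keyword", "modified",
            "publisher", "accessLevel", "identifier"}

# An illustrative (made-up) inventory entry for one dataset.
entry = {
    "title": "Agency Grant Awards, FY 2021",
    "description": "Award-level grant spending data.",
    "keyword": ["grants", "spending"],
    "modified": "2021-10-01",
    "publisher": {"name": "Example Agency"},
    "accessLevel": "public",
    "identifier": "example-agency-grants-fy2021",
}

def missing_fields(record):
    """Return required metadata fields absent from an inventory record."""
    return sorted(REQUIRED - record.keys())

# Because the metadata is structured, completeness checks that would
# otherwise require manual review can run automatically across an
# entire inventory.
assert missing_fields(entry) == []     # complete record passes
assert "title" in missing_fields({})   # empty record is flagged
```

    Run across thousands of inventory entries, a check like this is what "reducing burden on custodians" looks like in practice: quality problems surface automatically instead of through hand review.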

    The success of CDOs in the federal government hinges on their ability to perform expected and critical tasks. If they are successful, government data can be an asset, creating a robust data infrastructure that will serve a variety of purposes, including improving operational decision-making and evidence-based policymaking capabilities. While there are challenges, the progress of CDOs over the past year is commendable. We hope to continue a productive working relationship and dialogue with the Council going forward and are happy to respond to any questions you may have regarding these recommendations.

  • November 05, 2021

    Author: Amanda Hejna, Data Foundation Fellow and Senior Associate with Grant Thornton Public Sector

    The Advisory Committee on Data for Evidence Building (ACDEB) was formed over a year ago to provide recommendations to the White House Office of Management and Budget (OMB) on how agencies can better use data to build evidence and improve decision making across the federal government and beyond. Composed of data experts from all levels of government and the public sector, the Committee was charged with forming a foundational understanding of the current state of and future needs for the use of data for evidence building and in doing so fulfill the spirit and vision of the Evidence Act. 

    Throughout the first year, the Committee focused particularly on developing a vision and framework for a National Secure Data Service that would connect data users at all levels of government and the public and establish a unified evidence-building system across the federal government. At the culmination of Year 1, the Committee presented seven high-priority recommendations to the Director of OMB. These actionable and timely items will contribute directly to ongoing implementation of the Evidence Act and the establishment of a successful National Secure Data Service: 

    • Evidence Act Regulations: Provide additional guidance and regulations under the Evidence Act related to the operations and responsibilities of statistical agencies and implementation of the OPEN Government Data Act. 
    • Chief Statistician of the United States: Designate a full-time Chief Statistician of the United States within OMB.
    • Standard-Setting Procedures: Establish clear procedures for stakeholder engagement on future data standards for data sets government-wide. The importance of data standardization is a multi-faceted topic that includes considerations such as data quality, data definitions, legal frameworks, and reporting requirements, among others.
    • Appropriations Requests: Increase funding requests to support implementation of the Evidence Act and the Federal Data Strategy in the President’s Budget request to Congress in fiscal year 2023.
    • Value-Driven Pilot Program: Establish a pilot program including projects from federal agencies, states, and localities, to demonstrate the value of increased coordination and data sharing across government.
    • Privacy-Preserving Technologies Case Studies: Publish case studies where federal, state, and local governments used privacy-preserving technologies to encourage future, widespread use of these methodologies. This recommendation falls under the purview of the U.S. Chief Statistician in collaboration with the Interagency Council on Statistical Policy.
    • Communication: Develop a communication and education strategy to facilitate the success of a National Secure Data Service. This strategy should be developed by the U.S. Chief Statistician and should consider a wide range of stakeholders including the public, data providers, researchers, and policymakers at all levels of government.

    A number of subcommittees drilled down into specific focus areas and presented additional recommendations to the broader ACDEB. Focus areas included Legislation and Regulations; Governance, Transparency, and Accountability; Technical Infrastructure; Government Data for Evidence Building; and Other Services and Capacity-Building Opportunities. These preliminary recommendations will be integrated into the Committee’s Year 2 agenda as it looks to define the steps needed to fully operationalize the National Secure Data Service. In the next year, the Committee will continue to expand on its success to advance the use of data for evidence building and ultimately produce better results for the American people.

  • October 26, 2021

    Last week the White House Office of Management and Budget (OMB) released the Federal Data Strategy (FDS) 2021 Action Plan, an interagency effort meant to coordinate and leverage data as a strategic asset across the Federal government. Building upon the FDS 2020 and stakeholder engagement, the newly released strategy places emphasis on workforce development and data leadership within agencies.

    Part of the Executive Branch’s management agenda, the FDS is a 10-year plan to establish best practices for ethical data governance, management, and use. The FDS is an iterative process, with each Action Plan intended to incorporate lessons learned from agencies the prior year, public comments, and takeaways from conversations with data professionals from both government and non-government stakeholders, such as those at the forum hosted last year by the Data Coalition. 

    OMB identified major successes from Year 1 regarding the formation of agencies’ planning, governance, and data infrastructure foundation. For example, OMB highlighted the establishment of the interagency Federal Chief Data Officer (CDO) Council, the creation of a data upskilling pilot, and improvements to data inventories within agencies.

    Learning from Year 1’s successes and identified challenges, such as the need for more statutory requirements, published guidance on timelines, and additional interagency working groups, the 2021 FDS lays out 11 action categories of 40 practices for agencies to implement going forward. Year 2 seeks to offer agencies more flexibility in achieving the Action Plan milestones, in hopes of meeting agencies where they are in their foundational activities from the FDS 2020. 

    Five out of 11 actions require specific interagency councils to identify pilot projects or government-wide services, highlighting the necessity of collaboration among data leadership. Some Year 2 practices include making public non-classified AI use case inventories, improved linkage and governance of wildfire fuel data, and creation of a data-skills training playbook. The 2021 FDS also reiterates goals from 2020, such as continued assessment of data to answer agency questions as well as maturation of data governance and infrastructure.

    Although Data Coalition members appreciate the Year 2 strategy’s focus on workforce development and the role of data leadership within agencies, there are still many barriers to the next steps of implementing improved data practices across the Federal government. On November 9, the Data Coalition will host a public forum to discuss key takeaways from the Action Plan, gather feedback on the Federal CDO Council’s recent Request for Information, and collect additional input on how best to assist in a collaborative effort to realize the full benefits of evidence-informed policy in practice.

  • September 14, 2021

    Each year, federal agencies provide Congress with funding requests that explain the resources needed to run programs and achieve their missions. These publicly available requests, called congressional budget justifications, are not collected into a structured central repository which makes locating particular budget justifications challenging for congressional offices, federal agencies, White House staff, and the American taxpayer. 

    This bill seeks to provide open and transparent data about how agencies allocate resources, a pillar of accountable government. It will make it possible for Congress and the American public to better understand what their government is allocating resources to and to provide capabilities to analyze how budget proposals, appropriations, and budget execution have changed over time. Relative to the federal government’s $4 trillion budget, the proposed legislation is a low-cost activity, estimated by the Congressional Budget Office to cost less than $1 million per year to implement.


    The Congressional Budget Justification Transparency Act (P.L. 117-40) directs federal agencies to publish more information online about federal spending. Specifically, the law requires:

    • Information on any funds made available to or expended by a federal agency be posted publicly.
    • Agencies to post their annual congressional budget justifications in a structured data format and in a manner that enables users to download the reports in bulk. 
    • The White House Office of Management and Budget (OMB) to coordinate a publicly available website with a list of each justification by agency and fiscal year. 


    Congressional budget justifications (CJs) are documents submitted by Executive Branch agencies to support the annual President’s Budget Request, typically in February. The justifications are intended to be plain-language explanations for how agencies propose to spend funding that they request from congressional appropriators, core priorities and performance goals, and a summary of past performance. 


    Agency budget justifications contain a wealth of information about agency performance and priorities but are published as large, unwieldy documents. Currently, agencies are only required to produce a machine-readable summary table for the budget submission, meaning many data elements and core features of the justification are not captured. 

    The absence of consistent, machine-readable data means the American public, congressional offices, third-party intermediaries, and even OMB staff must manually review and transpose information in the budgets for relevant analysis. Moreover, the lack of a structured database limits the accessibility of detailed budget proposals to those who know how to find them, which in turn limits transparency for the American public and clear opportunities for accountability and oversight. 


    There is no publicly-available, comprehensive list of agencies that must publish CJs. However,  according to a 2019 survey conducted by Demand Progress of 456 agencies, over 20% did not publish any CJs publicly. Only 13 agencies of those surveyed (3%) published their CJs online in both FYs 2018 and 2019. While all 24 Chief Financial Officers Act agencies (i.e., large agencies) were among those who did publish their CJs online, independent agencies were found to be especially difficult to locate, according to the survey. Demand Progress noted in their survey methodology that they found more than 40 alternative document titles. This lack of standards creates confusion, inhibits transparency, and causes roadblocks to those who need access to budget information to support decisions about resource allocation or to fulfill transparency and accountability goals.  




    Staff across federal agencies, congressional offices, and even the White House budget office spend countless hours searching, collating, and repurposing content for budget formulation activities each year. Part of this exercise often requires agency staff to review old congressional justification materials to identify historical funding trends. By simply adjusting how information is published, staff supporting budget formulation and execution across agencies and branches of government will be able to more efficiently and accurately portray budgetary information to support decision-making on resource allocations. The same is true for reviewing and applying agency performance measures to promote effective performance management in the budget formulation and execution processes. 
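    The value of structured justifications for the trend analysis described above can be sketched in a few lines. The records, account name, and dollar amounts below are entirely hypothetical; the point is that once the data is structured, a historical funding trend is a query rather than a manual transcription exercise:

```python
# Hypothetical structured CJ records; amounts in $ millions, illustrative only.
records = [
    {"account": "Salaries and Expenses", "fiscal_year": 2019, "request": 410.0},
    {"account": "Salaries and Expenses", "fiscal_year": 2020, "request": 425.5},
    {"account": "Salaries and Expenses", "fiscal_year": 2021, "request": 433.0},
]

def trend(rows, account):
    """Requested amounts by fiscal year for one budget account."""
    return {r["fiscal_year"]: r["request"]
            for r in sorted(rows, key=lambda r: r["fiscal_year"])
            if r["account"] == account}

# With machine-readable justifications, a multi-year funding history is one
# function call instead of hours of searching old PDF documents.
history = trend(records, "Salaries and Expenses")
assert list(history) == [2019, 2020, 2021]
```

    The same pattern extends to comparing requests against appropriations or execution data, once those are published in a compatible structured form.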


    OMB coordinates the federal budget formulation and execution processes. After annual budgets are developed and proposed funding levels agreed to within the Executive Branch, agencies are required to submit congressional justification materials for review and clearance by OMB staff. This requirement, established in OMB Circular A-11, dictates that agency justification materials align with the formal President’s Budget Request published annually by OMB. 

    OMB also requires agencies to publish justifications at a vanity URL following transmittal to Congress, unless exempted for national security purposes. However, while OMB publishes top-line budgetary information in the President’s Budget Request volumes, OMB does not provide a consolidated database or repository for agency justifications. OMB already publishes many other budget documents on a central website, and adding the CJs to that site would be a useful resource for Congress, agency staff, journalists, watchdogs, and the general public.


    S. 272 passed the Senate in June 2020 and the House in August 2021. It is expected to be signed by the president in the coming days.


    Both the House and Senate versions had a bipartisan set of sponsors: U.S. Representatives Mike Quigley (D-IL) and James Comer (R-KY) in the House, and Sens. Thomas Carper (D-DE) and Rob Portman (R-OH) in the Senate.


    Campaign for Accountability

    Data Coalition 

    Demand Progress 


    Government Information Watch 

    National Taxpayers Union

    Open The Government 

    Protect Democracy 

    R Street Institute 

    Senior Executives Association 

    Society of Professional Journalists 

    Taxpayers for Common Sense

    Union of Concerned Scientists

  • September 07, 2021 9:00 AM | Anonymous member (Administrator)

    It’s no secret that the government collects a trove of data from the American people – estimated to cost $140 billion each year. But the value of that information is much higher, if it can be successfully and securely applied to make decisions about policies that improve lives and the economy. 

    Four years ago, the 15-member U.S. Commission on Evidence-Based Policymaking issued its final report to Congress and the President. Since then, while the world has changed drastically, the vision from the Evidence Commission is more relevant than ever: to enable the use of data in our society to solve real problems for the American people.  

    The Evidence Commission accomplished its mission with just 18 months to learn about the nature of the country’s data challenges, study the contours of potential solutions, and reach bipartisan agreement on salient, timely recommendations. It is already a major success story to be emulated by future government commissions, and its impact is still ongoing. 

    During a press conference on Sept. 7, 2017, releasing the final Evidence Commission recommendations, then-Speaker Paul Ryan and Senator Patty Murray stood side by side to applaud the realistic, practical solutions offered by the commission members. Speaker Ryan said: “It’s time to agree where we agree.” And in that spirit, days later, Speaker Ryan and Sen. Murray jointly filed the monumental Foundations for Evidence-Based Policymaking Act (Evidence Act). 

    Enacted 16 months after the commission’s report with overwhelmingly bipartisan support in Congress, the Evidence Act was the most significant government-wide reform to the national data infrastructure in a generation. The Evidence Act created chief data officers and evaluation officers in federal agencies, established processes for planning data priorities and research needs, required government data to be open by default, and enabled new data sharing capabilities within one of the world’s strongest privacy-protective frameworks. In short, the legal authority of the Evidence Act was a game changer for how our government responsibly manages and uses data. The work to implement that law is now ongoing across federal agencies.

    The Evidence Act also has tremendous implications for state and local governments, federal grantees, researchers, and even allies on the international stage. The law positions the United States as a clear leader in the dialogue about producing useful evidence for decision-making, while also shifting the discourse about the role of data infrastructure in supporting basic program administration. 

    What’s possible today that was not four years ago? A lot. Take for example the recent efforts to improve talent in the federal government by aligning roles under the chief data officers and evaluation community. Agencies like the US Department of Agriculture are launching new enterprise data capabilities to understand what data they have and use it. Coordination across new data leaders is producing new innovations for the government, like the use of natural language processing to accelerate the review of comments on federal rules. Real dialogue is now underway to break down the barriers and silos of data within agencies, and promote more public access. A new portal for researchers to have a one-stop-shop for applying to access restricted data is under development. New pilot projects of privacy-preserving technologies are underway as public-private partnerships. All of these activities will lead to greater capacity to use data and, therefore, better information to solve the government’s most wicked problems. 

    While real progress is being made, there are other areas ripe for attention from leaders at the White House where implementation of the Evidence Act has lagged. Here are two examples:

    • Presumption of Accessibility Regulation — A key recommendation from the Evidence Commission included in the new law was to assume that data are sharable unless prohibited by law or regulation. This presumption of accessibility requires the White House Office of Management and Budget (OMB) to first take a regulatory action, which disappointingly has not yet been published even in draft form for public feedback.
    • Guidance on New Open Data Requirements – The Evidence Act’s requirement that agencies make more data accessible and open is paired with new transparency requirements that agencies inventory their data and publish information about key contents of datasets. These nuanced activities require OMB to issue guidance that facilitates consistency across federal agencies and prioritizes which high-value data should be made available first.

    The Evidence Act was a starting point, but more work is underway to implement the Evidence Commission’s recommendations. Earlier this year, Rep. Don Beyer filed the National Secure Data Service Act, creating a pathway to implement many of the commission’s remaining recommendations for a new infrastructure capable of securely combining data. That bill quickly passed the U.S. House with strong bipartisan support and is now awaiting further action in the Senate. In parallel, the new Advisory Committee on Data for Evidence Building continues to study the challenges identified by the commission and is devising recommendations that will further address the Evidence Commission’s work. 

    While much progress has been made based on the commission’s advice, there is still a long path ahead in the United States to implement the law effectively and ensure the remaining recommendations come to fruition. Importantly, the Evidence Commission is itself an example of how to develop and use evidence in policymaking. Fortunately, because of the commission members’ diligent service to the country and the leadership of Speaker Ryan, Sen. Murray, Rep. Beyer, and others, the country is well on its way to realizing the promise of evidence-based policymaking.

  • August 24, 2021 9:00 AM | Anonymous member (Administrator)

    Author Austin Hepburn, Research and Policy Intern, Data Foundation

    On the first day of their Administration, the Biden-Harris team issued an Executive Order on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government (Executive Order 13985). The executive order was issued to promote and protect equitable policies and data in the Federal Government. These efforts support the inclusion of marginalized groups in Federal research and analysis, the improvement of equitable policies, and the provision of opportunities for each person to reach their full potential. 

    In order to ensure the implementation of the program, the White House Domestic Policy Council (DPC) is “directed to coordinate the efforts to embed equity principles, policies, and approaches across the Federal Government.” This includes efforts to remove systemic barriers, develop policies to advance equity, and encourage communication between the National Security Council and the National Economic Council. As noted in the EO, it is the responsibility of the Office of Management and Budget (OMB) to analyze and “assess whether agency policies create or exacerbate barriers to full and equal participation by all eligible individuals.” This responsibility is key to identifying and quantifying the challenges toward equity. 

    The Executive Order recognized the important role of disaggregated data, or data that has been broken down by detailed sub-categories such as race, ethnicity, gender, disability, income, veteran status, and other key demographic variables, by creating the Equitable Data Working Group. The Working Group has been tasked with “identifying inadequacies in existing Federal data collection infrastructure and laying out a strategy for improving equitable data practices in the Federal government.” This is accomplished by collecting new data or combining multiple data sources to fill the data gaps that make assessments of equity difficult, which in turn supports evidence-based policies within the Federal government and, through vertical policy diffusion, in state and local governments. “By exploring key policy questions dependent upon underutilized, inaccessible, or missing data, the Equitable Data Working Group explores ways to leverage government data in order to measure and promote equity.” 

    Despite the overwhelming benefit of exposing data gaps, the Group recognizes possible unintended consequences for privacy and the vulnerability of underserved populations. With this in mind, aggregating disaggregated data into summary statistics can reveal broad trends within these communities without disseminating personal data. For example, the National Crime Victimization Survey (NCVS) collects self-reported accounts of criminal victimization and produces reports that break down victimization data by race, ethnicity, gender, age, marital status, and income. Even when the underlying data is separated by race, researchers and analysts can publish summary statistics that offer insight into disparities without exposing personal identifiers. This protects the privacy of those who have been surveyed while still leveraging the data collected to help answer important policy questions about crime. 
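    The aggregation step described above can be sketched in a few lines of code. This is an illustrative example only: the records, field names, and grouping variable are hypothetical, not the actual NCVS schema or processing pipeline. The point is that individual-level rows go in, and only group labels, counts, and rates come out.

    ```python
    from collections import defaultdict

    # Hypothetical individual-level victimization records (illustrative
    # fields, not the real NCVS data structure).
    records = [
        {"id": 101, "race": "White", "victimized": True},
        {"id": 102, "race": "White", "victimized": False},
        {"id": 103, "race": "Black", "victimized": True},
        {"id": 104, "race": "Black", "victimized": True},
        {"id": 105, "race": "Asian", "victimized": False},
    ]

    def summarize_by_group(rows, group_key):
        """Aggregate individual records into group-level summary statistics,
        dropping personal identifiers in the process."""
        counts = defaultdict(lambda: {"n": 0, "victimized": 0})
        for row in rows:
            group = counts[row[group_key]]
            group["n"] += 1
            group["victimized"] += int(row["victimized"])
        # Only the group label, sample size, and rate leave this function;
        # the "id" field never appears in the output.
        return {
            group: {"n": c["n"], "rate": c["victimized"] / c["n"]}
            for group, c in counts.items()
        }

    summary = summarize_by_group(records, "race")
    ```

    The design choice is simply that the summarization function is the only path from raw records to published figures, so identifiers cannot leak into downstream reports.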

    The Data Coalition Initiative will be watching how the Working Group approaches these issues when its first report is delivered this fall to Ambassador Susan Rice, Assistant to the President for Domestic Policy. The report will identify and discuss the barriers and gaps in equitable data identified through case studies, along with recommendations on how to address them.

    The Working Group report will also include a plan to foster new partnerships among Federal agencies, academic and research partners, state, local, and tribal governments, community and advocacy groups, and other stakeholders, in order to leverage Federal data for new insights on the effects of structurally biased policies, and to advance capacity for multilayered, intersectional analysis of Federal datasets. The Data Coalition is looking forward to the chance to engage with the Working Group on its efforts, and will continue to provide updates as their important work progresses. 

  • July 28, 2021 9:00 AM | Anonymous member (Administrator)

    Author Austin Hepburn, Research and Policy Intern, Data Foundation

    The nation is preparing to send its children back to school this fall, but there will be many questions about the ongoing impacts of the pandemic on our children, both in the short and long term. While there are a great many strengths in our country’s educational infrastructure, the data infrastructure applied to improving learning and the workforce continues to face substantial gaps. To understand, adapt to, and mitigate the impact of the pandemic, we must ensure that there is a robust data infrastructure. One way to ensure timely, useful data about our learners and workers is to provide significant and sustained funding for the Statewide Longitudinal Data Systems (SLDS). 

    SLDS is a Federal grant program that provides access to historical data on public-school students and teachers starting from the 2006-2007 school year. The system was designed to improve data-driven decisions impacting student learning and education, focusing on the connections among PreK, K-12, postsecondary, and workforce education data. Eligible state grant recipients, such as school districts, schools, and teachers, can access the data system through their district’s Student Information Systems (SIS) via a free application. The data includes assessment scores, daily attendance, enrollment, courses, and grades. At its best, the program enables grantees to link individual-level data from Pre-K to the labor market.

    The SLDS plays a significant role in creating data-driven policies. While the information is collected and stored, the grant program also provides more accessible data in order to get a better understanding of a policy’s impact on student learning. Moreover, it encourages policy efficiency and equity by quantifying educational measurements over time. Data-driven systems such as SLDS provide transparency about which policies affect students and the significance of their impact.

    The SLDS has meaningful benefits, although there are also challenges in implementing a data-driven program. States have been able to put this data to work to better support students on pathways to the workforce. Currently, every state, the District of Columbia, and Puerto Rico have an SLDS that connects some data systems, but few can connect early education, K-12, postsecondary, and workforce data. This makes it challenging to study and evaluate programs intended to improve outcomes in college and the workforce. As states and federal programs strive to boost educational attainment and close skill gaps in the workforce, it is vital that our country can produce rigorous analyses based on high-quality data.

    New, sustained investment in SLDS data can provide the information needed to answer the critical questions of policymakers, educators, parents, and students. This will require a significant, multi-year investment of $1 billion. This funding should focus on modernizing SLDS data systems to build more interoperable and accessible data platforms with privacy-preserving technology, and on building capacity to use SLDS data through state research-practice partnerships that bring together real-time learning data and longitudinal data while diversifying the representation of practitioners. Finally, funding should be directed to ensuring robust governance and accountability structures are put in place so that these systems transparently address real priorities, needs, and community expectations. 

    Not only is this funding necessary to improve the data infrastructure to meet the needs of learners and workers, it must also be sustained, so that these systems have the resources to evolve to meet ever-changing research needs and privacy protections. 

    Sustained and continued financial investment in the SLDS program would help ensure data-driven success and proper use of the data. An increase in funding will provide the much-needed update to the data infrastructure necessary to advance evidence-based policymaking and modernize privacy protection. Providing this funding for SLDS is a smart investment that ensures we will have the evidence and data to deliver the best outcomes for our students.

  • July 07, 2021 9:00 AM | Anonymous member (Administrator)

    Author Austin Hepburn, Research and Policy Intern, Data Foundation 

    Crime data – which includes data on types of crime, demographics of victims and perpetrators, corrections, recidivism and reentry, and court information – is crucial evidence used to inform policy decisions in all jurisdictions. In the United States, national crime data is aggregated at the federal level by the Department of Justice’s Bureau of Justice Statistics (BJS) and the Federal Bureau of Investigation (FBI). Reliable and up-to-date criminal justice statistics are imperative for policymakers to make evidence-based decisions. However, as questions around policing and criminal justice become ever more pressing, it is worth exploring the challenges and limitations of crime data so that we may identify opportunities to improve both the data and the policy decisions it informs.

    The National Crime Victimization Survey (NCVS) is the nation’s leading source of statistical information on victims of crime. Victim, offender, and crime characteristics are reported, along with reasons for reporting or not reporting the crime. However, there are serious limitations to the NCVS. The survey relies on victims’ self-reports rather than recording crimes as they occur. Sampled households remain in the survey for 3.5 years and are interviewed seven times within that span. Only 71% of sampled households responded to the survey in 2019. Since the survey is only a sample, it does not capture variation in victimization patterns at the local or state level; victimization patterns at the city level would therefore require additional research. Additionally, NCVS reports lack data on the effectiveness of policing practices in addressing crime and victimization.

    An example of this type of additional research is the Data Foundation’s Policing in America Project, a multi-pronged, open data effort to systematically improve evidence about how the American people view the criminal justice system and police forces. The project focuses on the value of building data capabilities to enable a more robust understanding of the relationship between perceptions of law enforcement agencies and the conditions in select cities, including disparate perceptions by sub-populations. 

    In addition to the NCVS, which relies on traditional surveys of victims, a good deal of crime data is reported to the Department of Justice by local law enforcement. The FBI’s Uniform Crime Report (UCR) has been used to provide crime statistics since 1930. The BJS, the primary statistical agency of the Department of Justice, uses UCR data in their publications and datasets. BJS has long been trusted to publish up-to-date and accurate information, utilized by academia and professionals for criminal justice reports and open access data. The data from BJS and the UCR includes local, state, and national level data on corrections, courts, crime, the Federal justice system, forensic science, law enforcement, recidivism and reentry, tribal crime, and victims of crime. The data is reported by local law enforcement agencies to form a national database of criminal justice statistics. In practice, this has led to incomplete and non-standard reporting to the FBI. Local jurisdictions may have different definitions of crime that can make uniform crime reporting difficult. There may be lags in reporting for local agencies, as well as incomplete data.  

    One challenge comes from inconsistent reporting by local law enforcement agencies (LEAs), which can make arrest totals difficult to calculate. Reporting data is voluntary, so LEAs may not report the same data every year, yet the UCR only uses data from these voluntary reports. The procedure for calculating aggregated national, county, and state arrest rates does not take into account the population actually covered by the UCR. Because the population covered varies each year, this can significantly distort arrest rates and poses serious problems for analyzing national time-series (over time) trends. Perhaps the main limitation of UCR data, however, is the difference between actual and reported crime. 
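    The coverage problem described above can be made concrete with a small sketch. This is an illustrative example with hypothetical agency figures, not the actual UCR methodology: the rate is computed over only the population served by agencies that actually reported, so a year-to-year change in which agencies report does not masquerade as a change in arrests.

    ```python
    # Hypothetical agency-level reports for one year; figures are
    # illustrative, not real UCR data. An agency that did not report
    # contributes neither arrests nor population to the rate.
    reports = [
        {"agency": "A", "arrests": 1200, "population_covered": 300_000},
        {"agency": "B", "arrests": 450, "population_covered": 90_000},
    ]

    def arrest_rate_per_100k(reports):
        """Aggregate arrest rate per 100,000 residents, using only the
        population covered by the agencies that reported."""
        total_arrests = sum(r["arrests"] for r in reports)
        covered_population = sum(r["population_covered"] for r in reports)
        return total_arrests / covered_population * 100_000

    rate = arrest_rate_per_100k(reports)  # 1,650 arrests over 390,000 people
    ```

    Dividing by the full national population instead of the covered population would understate the rate whenever reporting coverage is incomplete, which is the distortion the paragraph above describes.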

    In addition to inconsistent reporting, the data that is reported is not standardized. States may have differing definitions of crime, as well as wholly different crimes on the books. For example, some states, such as Minnesota, have a third-degree murder charge, whereas other states would classify the same conduct as manslaughter. Similar challenges exist with hate crime statutes, which may differ vastly, include different demographic information, or may not be part of the criminal penal code at all. 

    Timeliness of data is also incredibly important for informing policy, but there are significant lags in crime data. As of June 2021, aggregated arrest data was last reported for 2016, half a decade earlier. Despite the availability of raw arrest data on the FBI’s Crime Data Explorer up to 2019, the Bureau of Justice Statistics has not reported the arrest figures. This means that any policy decisions based on crime data are missing the most timely insights. 

    And finally, data needs to be usable by the public, academic researchers, and policymakers. This means that it needs to be published in an accessible format. Crime data has some significant challenges in this respect. But there are tools to help. The BJS Arrest Data Analysis Tool allows researchers to find national estimates and/or agency-level counts of crime. This data is sourced from the FBI’s Uniform Crime Report. The tool is significant in that viewers are easily able to generate arrest figures at the national and local level without needing the data science background required for raw data processing. While some of these challenges are unique to the crime data, many of these challenges exist in the data infrastructure throughout the country. Many initiatives are being undertaken in order to help address these problems, and optimize data for evidence-based policymaking. 

    The Foundations for Evidence-Based Policymaking Act of 2018 became law on January 14, 2019. The law requires agencies to make their data accessible and to use statistical evidence to support policymaking. Annually, agencies must submit a learning agenda addressing policy questions to the Office of Management and Budget (OMB). This is an opportunity for the Department of Justice to identify what needs to be improved and to work with stakeholders to ensure that the necessary improvements can be made. The law also requires an Open Data Plan detailing how each agency will make its data open to the public. 

    Investment in crime and policing data has been modest, preventing meaningful updates to data collection and modernization. Additional funding for criminal justice data collection and reporting is recommended. Increasing local law enforcement training on reporting and correctly classifying crimes to the FBI can help improve accuracy and reliability. In addition, increased interagency collaboration between local law enforcement agencies and the FBI can provide more accurate aggregate data. With these common-sense improvements, crime data can be more effective in supporting evidence-based policymaking. 
