What State Leaders Need to Know about Measuring Digital Skills: Options and Opportunities

By Amanda Bergson-Shilcock, February 13, 2024

As implementation of the $2.75 billion federal Digital Equity Act gets underway, state broadband officials and other policymakers are hurrying to put plans in place to measure the impact of these new investments. A key area of focus is digital skills – how to measure the baseline of residents’ current skills, what data digital skills program providers will need to collect and report on, how to set targets for improvement, and more.

National Skills Coalition is providing this overview to help inform state policymakers and advocates who are wrestling with these important questions in their communities. As a nonprofit, bipartisan workforce and education policy organization, we have been at the forefront of documenting digital skills demand, needs, and policy opportunities:

  • Our New Landscape of Digital Literacy report drew on data from the OECD Survey of Adult Skills, finding that 31 percent of US workers ages 16-64 had limited or no foundational digital skills.
  • Our Applying a Racial Equity Lens to Digital Literacy fact sheet used the same dataset to take a deeper look at workers of color and the barriers they face in obtaining digital skill-building opportunities.
  • Our Closing the Digital Skill Divide report used data from 43 million “Help Wanted” ads to understand employers’ demand for digital skills in today’s job market and explore policy implications for education and workforce development.

Defining digital skills: What digital skills do people need to have?

The field has not yet coalesced around a single list of digital skills that all individuals should possess. Given the diversity of ways that technology can be used and its rapidly evolving nature, a full consensus may never emerge.

However, there are some rigorous and well-developed digital skills frameworks that can help state leaders decide on a key subset of skills to prioritize. One especially useful resource comes from the federally funded Digital Resilience in the American Workforce (DRAW) research project, for which NSC’s Amanda Bergson-Shilcock is an advisor. It is a short, easy-to-read overview called Assessing and Validating Digital Skills, which reviews several widely used digital skills frameworks and offers guidance on choosing among them.

Looking beyond foundational skills, the more specialized digital skills needed to obtain and maintain employment are a top concern for individuals and state officials alike. To understand the specific digital workplace skills needed in their communities, state broadband offices can tap into the expertise of their peers in state labor and education departments and workforce development boards.

In addition, NSC’s Boosting Digital Literacy in the Workplace provides examples of how some employers are tackling the issue of upskilling and reskilling their employees, and the public policies that could encourage more businesses to do so.

Options for measuring skills: What resources are already out there?

Policymakers and advocates can use a variety of approaches to better understand state residents’ current levels of digital skills and access. Such approaches include:

1. Analyzing existing survey data on self-reported skills.

  • The US Census Bureau and the National Telecommunications and Information Administration (NTIA) collaborate to produce the Internet Use Survey every two years. This survey is conducted as a supplement to the well-known Current Population Survey, so it is a robust, rigorous, and nationally representative dataset. The most recent data available is from 2021; the 2023 data should be released around June 2024. Visit NTIA’s data explorer and researcher page for more. You can also view the 2023 survey questions to see what kinds of data are included (though note that not all of these questions were included in the 2021 survey). A sketch of how an analyst might work with this kind of data appears after this list.

  • State broadband leaders can also inquire among their peer state agencies about what kinds of surveys or data they may already have gathered on this topic, and whether information can be shared (in anonymized form) with their office. Good possibilities are departments or agencies focused on labor and workforce, adult and higher education, libraries, economic development, and human services. For example, adult education agencies may have collected baseline data on adult learners’ digital skills as part of enrollment pre-testing for High School Equivalency or English for Speakers of Other Languages (ESOL).
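
To make the survey-analysis option concrete, here is a minimal sketch of how an analyst might estimate self-reported skill gaps from a public-use extract of the Internet Use Survey. The file name and column names (WEIGHT, STATE, NEEDS_SKILLS_HELP) are hypothetical placeholders; the real variables are documented in NTIA’s data dictionary.

    import pandas as pd

    # Minimal sketch: estimate self-reported digital skill gaps from a
    # public-use extract of the NTIA Internet Use Survey. The file name and
    # columns (WEIGHT, STATE, NEEDS_SKILLS_HELP) are hypothetical placeholders.
    df = pd.read_csv("ntia_internet_use_2021_extract.csv")

    # CPS supplements are weighted surveys: use the person weight rather than
    # raw respondent counts when producing population estimates.
    def weighted_share(group: pd.DataFrame, flag: str) -> float:
        return (group[flag] * group["WEIGHT"]).sum() / group["WEIGHT"].sum()

    # Weighted share of residents reporting a skills barrier, by state.
    by_state = (
        df.groupby("STATE")
          .apply(weighted_share, flag="NEEDS_SKILLS_HELP")
          .sort_values(ascending=False)
    )
    print(by_state.head(10))

The same pattern extends to any yes/no survey item, and to disaggregation by demographic columns instead of state.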

2. Analyzing existing state administrative data as a proxy for skills and access. In the course of providing services to state residents, state agencies already collect a wealth of administrative data that can help illustrate residents’ digital skills.

  • For example, a state might have data on whether Unemployment Insurance claimants are accessing information by phone or via the website. If so, the state could analyze user data to determine which zip codes or demographic groups are more likely to be using the phone, which could be an indicator of limited digital skills or access for those groups. (A sketch of this kind of analysis appears after this list.)

  • Similarly, states may collect information on the browsers or operating systems used by visitors or applicants to state websites, which could help identify which groups of residents are more likely to have smartphone-only internet access rather than a desktop or laptop computer.

  • State workforce development boards, American Job Centers, or higher education coordinating boards may collect data on individuals’ participation in online and virtual learning opportunities. Again, while individual privacy and confidentiality must be protected, it may be possible to review aggregate data that helps to illuminate skill levels. (Note that AJCs may be branded under a different, state-specific name.)
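
As one illustration of the administrative-data option, the sketch below flags zip codes where UI claimants lean heavily on the phone channel. The input file and its columns (zip, channel) are hypothetical stand-ins for an anonymized agency extract, and the thresholds are invented for illustration.

    import pandas as pd

    # Minimal sketch: flag zip codes where Unemployment Insurance claimants
    # disproportionately use the phone channel, a possible proxy for limited
    # digital skills or access. The file and columns (zip, channel) are
    # hypothetical placeholders for an anonymized agency extract.
    claims = pd.read_csv("ui_claims_by_channel.csv")

    summary = (
        claims.assign(is_phone=claims["channel"].eq("phone"))
              .groupby("zip")
              .agg(total=("channel", "size"), phone_share=("is_phone", "mean"))
    )

    # Protect privacy: suppress small cells before sharing aggregate results.
    MIN_CELL_SIZE = 30
    summary = summary[summary["total"] >= MIN_CELL_SIZE]

    # Flag zip codes well above the statewide phone share; the 10-point
    # threshold is arbitrary and would be tuned to local conditions.
    statewide = claims["channel"].eq("phone").mean()
    flagged = summary[summary["phone_share"] > statewide + 0.10]
    print(flagged.sort_values("phone_share", ascending=False))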

3. Collecting new data on self-reported skills via surveys. If states elect this option, it is critically important to make sure survey questions are well-designed, to avoid gathering data that will not be helpful. State leaders are encouraged to borrow from existing survey templates and/or work with experienced survey design professionals to ensure that their materials accomplish the goals they have in mind. One such template is available from the nonprofit National Digital Inclusion Alliance.

  • Some states have already fielded their own original surveys as part of State Digital Equity Planning processes. One of the most comprehensive examples (pre-dating the Digital Equity Act) is the Digital Readiness Survey carried out by the Hawaii Department of Labor & Industrial Relations in 2021. Hawaii’s effort is notable in that it included both telephone and online components, thus ensuring that the state captured data from residents who do not have or use internet access. (See page 73 of the above-linked report for the exact questions asked.)

  • A faster, cheaper technique is to add a handful of new questions to an already-scheduled public survey. NSC profiled one such example in this 2022 blog post, highlighting how the Colorado Office of the Future of Work and the Department of Public Health & Environment partnered to ask questions as part of a health survey. Results helped shed light on factors that influence state residents’ access to telehealth, and documented disparities in access between different demographic groups.

4. Collecting new data on skills via testing. This option is the most time-consuming and expensive, but it has the advantage of providing an external, objective, and standardized set of data. If the individuals participating in assessment are a representative sample of state residents, results can be generalized to the broader population. Even if they are not statistically representative, results can provide additional context to state leaders about digital skills among the given subset(s) of state residents who participated. (A short sketch after the example below shows one way to quantify the uncertainty in such estimates.)

  • For example, workforce leaders in one Western state partnered with several local employers to administer the Northstar Digital Literacy Assessment to their incumbent workers. While this was not by any means a random sample, it did help state leaders affirm that these workers were slightly more likely than the national average to need investment in their digital skills. (National comparison data was taken from NSC’s analysis of PIAAC data, published in The New Landscape of Digital Literacy.)
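
When assessment results come from a sample, reporting a confidence interval alongside the point estimate helps state leaders judge how precisely the results describe the wider group. Here is a minimal sketch using the normal approximation; both counts are invented for illustration.

    import math

    # Minimal sketch: 95% confidence interval for the share of assessed
    # workers meeting a digital literacy benchmark, using the normal
    # approximation. Both counts below are invented for illustration.
    n_assessed = 400    # hypothetical number of workers assessed
    n_passed = 252      # hypothetical number meeting the benchmark

    p = n_passed / n_assessed
    margin = 1.96 * math.sqrt(p * (1 - p) / n_assessed)
    print(f"Pass rate: {p:.1%} (95% CI: {p - margin:.1%} to {p + margin:.1%})")

Keep in mind that the interval only generalizes if the sample is representative; for a convenience sample like the one above, it describes the assessed group alone.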

5. Collecting new data on skills as part of Digital Equity Act program reporting. States will be receiving millions of dollars in federal formula funding (known as Digital Equity Capacity Grants) in Spring 2024 and will then be redistributing those funds down to the local level. This process provides a golden opportunity for states to collect new data on digital skills with a minimum of additional expense or effort.

As states are designing the grant reporting requirements that local program providers will need to abide by, NSC recommends establishing a simple, standardized set of measures that all digital skill-building programs will report on. Having a set of common measures is crucial to providing officials and members of the public with easy-to-compare results.

These measures should be connected with outcomes – that is, observable changes in ability or capacity – and not simply outputs or numbers of activities carried out. Outcomes allow stakeholders to gauge whether programs are actually helping people achieve intended goals. They can help state leaders identify potential bottlenecks where participants are getting stuck or not flourishing, as well as springboards – that is, programs that are doing an especially good job of helping people build digital skills.

Reflecting NSC’s role as a workforce and education advocacy organization, our recommended common measures focus on those areas (a sketch of how a state might aggregate them follows the list):

  • Number and type of digital skills program slots established or expanded, disaggregated by type of training provider (nonprofit organization, higher education institution, worker center, etc.); type of training (foundational/basic, applied/industry-specific, or advanced digital skills); and geographic location (urban, suburban, rural). States should also explore options for measuring program quality.
  • Number and percentage of individuals who have achieved a measurable digital skill gain, disaggregated by type of gain (foundational/basic, applied/industry-specific, or advanced digital skills) and covered population.
  • Number and percentage of individuals who have attained a quality non-degree digital skills credential, disaggregated by covered population. (Learn more about how states are defining quality in non-degree credentials more generally; these guardrails can easily be applied to digital skills credentials in particular.)
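
To illustrate what standardized reporting could enable, the sketch below rolls hypothetical provider-level records up into the second measure. The file and column names (training_type, covered_population, achieved_gain) are placeholders for whatever reporting schema a state adopts.

    import pandas as pd

    # Minimal sketch: aggregate standardized provider reports into the
    # measurable-skill-gain measure described above. The file and columns
    # (training_type, covered_population, achieved_gain) are hypothetical
    # placeholders for whatever reporting schema a state adopts.
    reports = pd.read_csv("provider_reports.csv")  # achieved_gain is 0 or 1

    gains = (
        reports.groupby(["training_type", "covered_population"])
               .agg(participants=("achieved_gain", "size"),
                    gains=("achieved_gain", "sum"))
    )
    gains["pct_with_gain"] = (100 * gains["gains"] / gains["participants"]).round(1)
    print(gains)

Because every provider reports against the same columns, results roll up cleanly across providers, geographies, and covered populations.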

NSC strongly recommends that states give program providers flexibility in how they can show they are achieving these common measures. For example, states can allow providers to use any of several different options to demonstrate that participants have made a measurable digital skills gain – including pre/post testing, credential attainment, employment promotion/advancement, and others.

This approach gives local providers vital flexibility in designing programs that respond to the real needs of people on the ground, without shoehorning all participants into a single type of assessment. Similarly, states should allow providers to report on any type of credential that meets quality guidelines, without “picking winners” by selecting just one credential that all providers must use. (Stay tuned for an upcoming post from National Skills Coalition that explores these quality considerations in more depth and illustrates how they can be applied in a digital skills context.)

NSC also recommends that states collect additional qualitative data from a subset of programs. This data can add richness and depth to the quantitative measures described above, and can point the way to future improvements. Collecting this data in a limited fashion – perhaps by contracting with an evaluator to conduct interviews with a small percentage of programs – can be a cost-effective way to gather valuable information from:

  1. Program participants about what inspired them to enroll in digital skills training and how they have defined success for themselves;
  2. Program providers about how they define success in digital skill-building and what they have learned from trying to apply the required measures listed above;
  3. Employers about how they gauge digital skills among jobseekers and workers, and their experiences hiring individuals who have completed Digital Equity Act-funded training programs.

Across all of the above recommendations, states are strongly encouraged to collect basic demographic data without adding unnecessary burdens. It is important to balance collecting enough information to track success in closing equity gaps for covered populations against protecting individuals’ privacy and avoiding complex requirements that unnecessarily burden program providers. Data collection and reporting requirements should never be a stumbling block to improving equity for covered populations.

To that end, programs should be strongly encouraged to use proxy measures (such as whether a person resides in a high-poverty zip code or receives SNAP benefits), rather than attempting to assess eligibility on a case-by-case basis. This issue is especially urgent given the difficult circumstances faced by many covered populations. People with very low incomes, those who are incarcerated or recently returned from incarceration, and people with limited English or literacy skills are disproportionately likely to lack government-issued identification. No data collection requirement should further burden already-marginalized individuals with additional hoops to jump through before services can be obtained. 
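
As a minimal sketch of what proxy-based intake could look like in practice, the snippet below treats residence in a high-poverty zip code or SNAP receipt as sufficient, so that staff never need to demand documentation. The zip codes and field names are invented for illustration.

    # Minimal sketch of a proxy-based eligibility check, so intake staff never
    # need to demand documentation. The zip codes and field names below are
    # invented placeholders for a state's own data sources.
    HIGH_POVERTY_ZIPS = {"96817", "96720"}  # illustrative values only

    def presumed_eligible(zip_code: str, receives_snap: bool) -> bool:
        """Residence in a high-poverty zip or SNAP receipt is sufficient."""
        return zip_code in HIGH_POVERTY_ZIPS or receives_snap

    print(presumed_eligible("96817", receives_snap=False))  # True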

Finally, states should follow best practices used in the public health and education fields and ensure that individuals born outside the United States are not required to demonstrate a specific immigration status in order to participate in digital equity programs or services. This flexibility in practice has already been adopted for the Affordable Connectivity Program by major Internet Service Providers, and by the Department of Education for its Workforce Innovation and Opportunity Act Title II adult education programs.