
5.3 What Do We Know about Information Interventions?


interagency coordination problem. However, they might choose not to do it, perhaps for political economy reasons. Private entrepreneurs could collect at least part of it—for example, through internet scraping. Further, good programs have every incentive to track down and advertise their graduates’ outcomes—particularly if their revenues depend on enrollment, as is the case for private institutions. Although low-quality programs have incentives to (falsely) depict themselves as good, this might be averted by requiring a third-party audit of self-reported information (coding boot camps in the United States already follow this practice). Although private solutions to information provision are therefore possible, governments have an unsurpassed advantage at collecting the information and ultimately need it for regulatory purposes.

It should be mentioned, however, that collecting all the relevant information and making it easily available might not necessarily affect students’ decisions. The evidence indicates that what information is provided, to whom, and how, matters greatly (box 5.3). Light-touch interventions, such as posting information on a website, sending an email, or nudging students with text messages, generally fail to alter students’ behavior. These interventions are impersonal and do not engage the student directly; she might not see the information or consider it useful or reliable. In contrast, high-touch interventions, which are more direct and intensive, affect students’ choices. Frequent sessions with a counselor are an

Box 5.3 What Do We Know about Information Interventions?

Most information interventions reported in the literature have aimed at providing prospective students with information related to college access. Scholars seem to disagree about whether informational interventions affect student behavior, yet part of the disagreement may stem from differing views of what constitutes an informational intervention. Interventions vary in what information they provide (such as the availability and characteristics of programs), when they provide it (how far in advance of high school graduation), and how (whether in a light- or high-touch fashion). Light-touch interventions mail information to students (Hoxby and Turner 2013; Gurantz et al. 2021; Bergman, Denning, and Manoli 2019; Hyman 2020), nudge them (Castleman, Deutschlander, and Lohner 2020; Oreopoulos and Petronijevic 2019), or post information on a website (as is the case of government information provision, studied by Hurwitz and Smith 2018 and Baker 2020). In contrast, high-touch interventions engage the student directly and intensively, for instance through repeated counseling sessions (Bettinger and Baker 2014; Oreopoulos and Ford 2019; Bettinger and Evans 2019; Mulhern 2020).

In general, the evidence shows that light-touch interventions do not affect behavior (Page and Scott-Clayton 2016), but high-touch interventions do. However, there are some nuances to consider.

Although merely posting information on a website is not effective, interactive websites that tailor information to the student and mimic the role of a counselor are effective (for example, Naviance in the United States, studied in Mulhern 2021). Further, mailing information to students is effective when the message is personalized and targeted to specific students who find it credible and relevant, and when it is also sent to “influencers” close to the student (as in the HAIL experiment at the University of Michigan, studied by Dynarski et al. 2020).

A few studies have examined the impact of providing major- or program-specific information to students. Researchers using experimental designs have found that information led students to switch to higher-return options in Chile (Hastings, Neilson, and Zimmerman 2015) and the United States (see Conlon 2019 for four-year colleges and Baker et al. 2018 for community colleges). In Chile, the provision of information also led to higher persistence in college (Hastings, Neilson, and Zimmerman 2015).

Some recent interventions in the Dominican Republic and Peru have provided middle and high school students with videos that teach them about education’s value and returns (see J-PAL 2017 for Peru). The interventions have lowered dropout rates among low-performing students and affected the field of study among high-performing students. Such interventions have been scaled up in Peru and the Dominican Republic and have recently been implemented in Chile.

Of critical importance is the quality of the information provided. As discussed in the main text, an ideal information system would keep track of all the higher education programs that are available in a country and their basic characteristics, such as duration and cost. It would also keep track of all the students in higher education—particularly those who graduate—and follow them into the labor market to allow for the calculation of program-level average returns and employment rates. Ideally, countries would also have disclosure platforms (for instance, websites such as www.mifuturo.cl and www.ponteencarrera.pe in Chile and Peru, respectively) where program-level information can be easily found. Although data collection and disclosure do not affect behavior by themselves, they provide the inputs necessary for the interventions that do affect it.

example, as are interactive websites (such as Naviance in the United States) that tailor information to the specific student, parents, and counselors. Students often overestimate the returns to their chosen programs, do not know about similar programs offering higher returns, and are generally misinformed about returns to various fields and programs. Nonetheless, they do alter their choices in response to well-designed interventions, as indicated by evidence from the United States and Chile.

There are cases in which even a well-designed intervention might fail to affect student choices. Students might still choose a program with relatively low returns just because it is local or offers something they value (for example, a convenient schedule, online teaching, on-site child care, or a quiet place to study).7 Alternatively, students may simply not have other choices, as is the case for those who live in small or medium-size municipalities (chapter 3), or cannot afford anything more expensive than their current program. As
