But how did outreach affect life in the real world?


Many viewers of All Our Children, the Bill Moyers documentary broadcast in April 1991, were shocked to learn of the grim prospects for “at-risk” young people. What’s more, many of them did something about it, a follow-up study reported this summer.

The evaluation — one of the most extensive attempts to assess the influence of a public TV outreach project — found that 26 percent of viewers who called an 800 number after the broadcast said they were motivated to volunteer in schools, donate funds or take other actions related to youth work.

Planners of South Carolina ETV’s Realizing America’s Hope project, which included the documentary, said the project had influenced new youth legislation in California, Georgia, Iowa, Missouri and New Mexico.

The evaluation — published in July by MDC Inc. of Chapel Hill, N.C. — went to the Charles Stewart Mott Foundation of Flint, Mich., which put up $1.5 million of the $2.7 million for Realizing America’s Hope.

“Both [Mott] staff and the trustees felt it was important to look to see if it had enough impact to warrant doing again,” said Judy Samelson, v.p. of communications at the foundation. “The evaluation generally suggests that we did in fact have tremendous impact with the project.”

Public TV outreach projects seldom get such a thorough follow-up, though funders like Mott increasingly are looking for evidence of impact.

Evaluation has been a major topic for outreach planners this year, says John Kasdan, the new director of the Public Television Outreach Alliance (PTOA). His colleagues are now considering what data they’ll need to assess the 1993 project, Take Action for Education.

Plenty of numbers come out of the typical outreach project, and WQED, Pittsburgh, has some impressive ones.

Counting the callers

For instance, in response to The Breast Test, a WQED-produced statewide special on breast cancer detection that aired on Oct. 26 [1992], volunteers answered more than 600 inquiries in Pittsburgh, 350 in Philadelphia, 200 in Erie and nearly 400 in the Scranton area. Ten thousand resource booklets went out.

In this summer’s project to ease voter registration, WQED sent callers more than 4,000 registration packets.

In the early ’80s, WQED got 6,000 calls in response to its Job/Help Network special and provided 20,000 pieces of information to callers.

For most outreach projects, Kasdan points out, evaluators can look for data at several different levels of measurement: How many people watched the programs? How many responded to offers? How many took actions based on what they had learned? And, finally, what impact did those actions have?

These latter questions would get closer to revealing “the real differences we make in people’s lives,” as Kasdan aims to do, but getting the data is far more difficult.

Most organizations don’t order up this kind of research because it’s expensive to do, methods are not well established, and it’s “a real measurement quagmire” to attempt to attribute changes in people’s behavior to any particular TV program or outreach effort, says Valerie Crane, president of Research Communications Ltd., a Dedham, Mass., firm that has done numerous outreach evaluations for CPB.

To date, the focus of most evaluations of PTOA’s annual projects has been at a more basic level than these questions. “The measure of success for the alliance is whether stations get involved and do things,” says Crane. By that test, public TV’s outreach efforts have been a “tremendous success story” and have transformed stations from “inanimate institutions” into active community participants.

Solid data from Kentucky

The strongest evidence of outreach effectiveness, Kasdan and Crane suggest, comes from Kentucky ETV, which has been promoting adult studies for high-school equivalency (G.E.D.) tests since 1975.

The state network says 19,446 Kentuckians enrolled in the course built around its 43-part G.E.D. on TV series in 1975-89, and 11,452 passed the exam. Based on national studies of the economic impact of G.E.D. degrees, Kentucky ETV projected that those graduates would earn an additional $104.5 million over five years.

When the state network counted G.E.D. students nationally, including military people worldwide and prisoners, it came up with 1.2 million G.E.D. graduates over the same period, with a five-year economic impact of $12.2 billion.
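
Neither figure appears as a per-graduate amount in the evaluation, but dividing the reported earnings totals by the reported numbers of graduates gives a rough sense of the projection’s scale:

  $104.5 million ÷ 11,452 Kentucky graduates ≈ $9,100 per graduate over five years
  $12.2 billion ÷ 1.2 million graduates nationally ≈ $10,200 per graduate over five years

In other words, both projections assume an earnings gain on the order of $1,800 to $2,000 a year per graduate.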

The data could be compiled because the project lent itself to tracking. Kentucky asked all G.E.D. test-takers whether they used the TV series to study for the test, and then surveyed them to see whether they passed and got better jobs. Nearly half said the degree helped them get or retain a job or get a promotion.

Similarly, Mott’s project on at-risk youth could sample its impact because South Carolina ETV maintained an 800 number for a year and kept lists of people who had called for information.

Success stories from the phones

By phoning back to a sample of people who had called, evaluators collected evidence that the Realizing America’s Hope project had worked. For example:

  • A county commissioner in Washington State said he saw project materials in 1990 and within months his community had several model programs for juvenile offenders, which kept almost 1,400 kids out of detention.
  • In Minnesota, a college student said he decided to start his career by working with at-risk youth.
  • In Mississippi, a professor used the materials in her courses.
  • Two Floridians who attended a university workshop built around the project materials went home and pulled together a one-stop service center for teens in their community.

Carol Lincoln, project director for MDC, was impressed by the hundreds of vignettes like these. “When you can call up people who you don’t know, and they have a concrete example of what they did, that tells me it was worth it,” she says.

Mott had hired MDC and Lincoln to carry out the 1990 study America’s Shame, America’s Hope; a subsequent videoconference moderated by Moyers; and then the Realizing America’s Hope project itself, including the documentary, two more videoconferences (one for state legislators and one for educators), print materials and extensive free and low-cost distribution of papers and videocassettes. Then Mott hired MDC to evaluate the project. (Lincoln acknowledges that it’s “a tad questionable because we’re looking back on our own work.”)

At Mott, Samelson says evaluation gives funders a “higher comfort level” about spending on outreach. “If you feel something concrete has happened, you feel much better about investing the next go-around.”

Indeed, Mott has funded outreach work as part of its $500,000 grant to Henry Hampton’s series on poverty planned for air a year from now, Samelson says.

She believes the combination of PTV programming and outreach can reach a critical mass of active citizens and help put an issue on the national agenda. If policymakers sense a “groundswell” of interest in an issue among their constituents, she says, they are less likely to dismiss it.

Lincoln and the MDC evaluation offer several recommendations for future outreach projects: teaming with strong local organizations (in this project, local offices of co-funder Metropolitan Life Insurance pitched in); allowing three to five years for a project, including at least two years for planning; arranging toll-free hotlines; tracking every participant; and waiving copyright restrictions so that local collaborators can reuse materials as they choose.
