A Brief History of How We Got Here

We finished the hard part. Now, it’s time for the hard part.

The Backstory

In 2016, the Pittsburgh Jewish Community Scorecard had three years’ worth of data. We collect information from organizations across the community: synagogues, schools, social service agencies, even summer camps. We track enrollment and membership numbers in order to make our communal progress visible.

We do it so that service providers can use the information to make more data-driven decisions for their institutions. We do it to change the culture of measurement in our community. To lead by example.

It was – and continues to be – an informative process. The Scorecard tracks trends in the community. It confirms, and sometimes dispels, the anecdotal evidence generally used to drive decision-making processes. Service providers often rely on it when mapping out their goals and market share.

But it wasn’t quite enough.

Seizing Opportunities

We knew all along that the Scorecard was going to be an iterative process. As it turns out, the next iteration involved a Community Study.

The Scorecard provides a numerator but not a denominator. How can we assess day school enrollment, for instance, if we don’t know whether the number of school-age children is growing or declining?

Further, the Scorecard is limited to information provided by the organizations themselves. We had no sense of the quality of services provided and no way to tell whether there were gaps in services. All we heard were the voices of service providers, not those of service recipients.

A Fortuitous Meeting

So, in 2016, we set out on a mission to fundraise. We convened a group of relevant foundations, including the Federation’s own Jewish Community Foundation (JCF), and pitched the idea of a Community Study. But it wasn’t until several months later at the Jewish Federations of North America’s General Assembly that the process truly gained momentum.

There, the chair of the JCF met several leaders from other communities who had just completed Community Studies. They told him how the studies informed their strategic plans, and how they now had a treasure trove of information they could rely upon to learn about their markets.

“We don’t know what we don’t know,” he said, encouraging the Community Scorecard to apply for a Signature Grant.

A Committee Uniquely Positioned to Succeed

The Community Scorecard Steering Committee is unique in its composition. Unlike every other Federation committee, it is composed of both lay leaders and Jewish organizational executives. Turnover among our committee members is slower than elsewhere. We benefit from both our diversity and our consistency.

For the purposes of this project, we invited several new members onto the committee, each of whom has expertise in a field related to statistics.

The committee advised that, in our grant proposal, we not simply ask for the money to conduct a Community Study. Instead, we asked for a larger grant in order to conduct follow-up studies after the Community Study concluded. Perhaps a mini-study every two years would help us track trends in the community. Perhaps a deeper dive into a specific impact area would allow for wiser strategic plans.

In early 2017, the Jewish Community Foundation awarded the Community Scorecard the funds to conduct a Community Study and four follow-up studies over the subsequent eight years.

The Community Scorecard included the follow-up studies in its Request for Proposals to the seven researchers we invited to apply for the project.

The Selection Process

Six researchers submitted proposals. After a rigorous vetting and interview process, the Community Scorecard committee selected the Steinhardt Social Research Institute at the Cohen Center for Modern Jewish Studies (CMJS) at Brandeis University as the principal researchers for the project.

CMJS has extensive experience conducting Community Studies, including in such major cities as Boston and Seattle. Further, they tackle a major obstacle of the traditional methodology (described below) with an innovative yet statistically sound strategy.

The traditional methodology for conducting these studies combines the Federation’s database with Random Digit Dialing (RDD). Using RDD, a call center dials random phone numbers in a given area code until it reaches enough respondents who are a) Jewish and b) willing to answer a ~30-minute survey.

As a reference point, in 2002, when the Pittsburgh Federation last conducted a Community Study, 278,890 phone calls were made to 93,840 phone numbers using RDD. The total number of completed surveys from Jewish homes? 341. That is an average of almost 818 dialings in order to get ONE usable Jewish household interview.

Since 2002, response rates to surveys (regardless of subject matter) have plummeted across the nation. The advent of caller ID has not benefited call centers in any way.

Further, because RDD only samples local area codes, this methodology does not account for 21st-century cell-phone users, who a) retain their phone numbers regardless of their current residence and b) are much less likely to own a landline. In other words, there is a significant likelihood that someone under the age of 40 who has recently moved to Pittsburgh does NOT have a 412 or 724 area code.

Not only is RDD expensive, it is also inefficient. While several other proposals contained strategies to account for this missing demographic in their studies, CMJS provided a comprehensive, tested approach that swayed the Committee.

Their approach uses a combination of four different methodologies in the study:

  1. Enhanced RDD – CMJS purchases and synthesizes hundreds of nationally conducted surveys that include questions about both geography and religious identity. The synthesis uses the data from these surveys (along with other sources) to estimate the size of the Jewish population in Pittsburgh. Because the surveys are already completed, purchasing the data is significantly more cost-effective than conducting the studies from scratch.
  2. Comprehensive List-Based Sample – Instead of simply using a Federation list, CMJS collected the lists of nearly 100 Jewish organizations throughout the community, including synagogues and other membership-based organizations (like AIPAC, J Street, Hadassah, and the Jewish Chronicle). Their research suggests that roughly 75% of the entire Jewish community could be found on at least one of those lists.
  3. Ethnic Name Sample – CMJS purchased a list from a market research firm that develops a database of religious and ethnic minority groups based on distinctive ethnic names and consumer behavior. When combined with the organizational lists, and after being deduplicated, this sample accounts for Jewish people who are not on any organizational list; a rough sketch of that combine-and-dedup step follows this list. (CMJS has conducted several studies showing that there is no statistical difference in Jewish attitudes, affiliations, or behaviors between Jews with distinctive Jewish surnames and Jews who do not have such names.)
  4. Multiple Survey Modes – CMJS approaches survey respondents by postal mail, phone, and email. Multiple attempts are made to contact each respondent.
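As a rough illustration of how the list-based and ethnic-name samples might be combined and deduplicated before sampling, consider the sketch below. This is not CMJS’s actual pipeline; the record fields, the source labels, and the matching rule are simplified assumptions for illustration only.

    # Illustrative sketch only -- not CMJS's actual code or matching rules.
    # Record fields (name, address, source) and the dedup key are hypothetical.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Record:
        name: str
        address: str
        source: str  # e.g. "synagogue", "federation", "ethnic_name_vendor"

    def match_key(record):
        """Crude dedup key: normalized name + address. Real studies use far
        fuzzier matching (nicknames, typos, household vs. individual records)."""
        return (record.name.strip().lower(), record.address.strip().lower())

    def build_sampling_frame(org_lists, ethnic_name_list):
        """Merge the organizational lists first, then add ethnic-name records
        that do not match anyone already on an organizational list."""
        seen = {}
        for source_list in org_lists + [ethnic_name_list]:
            for record in source_list:
                seen.setdefault(match_key(record), record)
        return list(seen.values())

In this toy version, anyone who appears on both an organizational list and the purchased name list is counted once, under their organizational record; the portion of the community that appears on no list at all (roughly 25%, per CMJS’s research) still has to be accounted for elsewhere in the design.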

The First Stages of the Study

Once we selected CMJS, we began to mobilize the community in order to assemble a survey that would provide the most useful data for organizations’ planning processes. We held meetings with key stakeholders – organizational leaders, both professional and lay – to learn what topics they would like this survey to cover.

These meetings played a significant role in shaping the survey. Because of the input we received from communal leaders, we included questions about minority groups like the LGBTQ population and native Israelis living in Pittsburgh. We included questions about mental health services. We placed a higher premium on learning about the financial barriers to Jewish life.

At this point, the endeavor became a true communal effort.

Immediate Learning Opportunities

The survey hit the field in April 2017. While in the field, we learned several important things about our community.

  • The lists we received from the organizations throughout the community were not in good shape.

CMJS tallied a far greater number of disconnected and wrong numbers than they were used to. The lesson: organizations should invest in cleaning their databases, which will drastically improve the deliverability of their messages.

  • The Ethnic Name Sample contained a far greater number of non-Jews than the Cohen Center was used to.

This was likely due to two reasons:

    1. Because we collected so many organizational lists, the people who showed up on the Ethnic Names Sample were the lower-confidence individuals. In other words, most people who had a high likelihood of being Jewish were already on one of the organizational lists.
    2. Pittsburgh has a disproportionate number of German/Polish last names, which can be confused with Jewish names. (Think Roethlisberger.)
  • The respondents we did reach who were indeed Jewish were more likely to participate than the Cohen Center was used to.

People in Pittsburgh were eager to participate in this survey. The challenge was reaching them.

The Responses

In August 2017, we reached our goal of 2,000 completed surveys. Those 2,000 surveys are divided into two categories:

  1. ~1,200 are in the primary sample, developed from a stratified random sample of the lists. These are the responses used to represent our community as a whole. They paint a picture of the composition of the community.
  2. ~800 are in the secondary sample, developed from an open call to participate. Because people in this sample are more likely to be engaged members of the community, this sample is only used to learn more about subpopulations. It is NOT used to draw a portrait of the community.

The Weighting Process

Over the subsequent months, CMJS weighted the data to ensure it accurately reflected the community. Sources of “known” information (like the Community Scorecard, Census data, and the 2013 Pew Study) were used to double-check the work. For instance, if the survey responses indicated that there were 2,000 students enrolled in local Jewish day schools, the Cohen Center knew to adjust the weights (because there are, in fact, 800 students enrolled).
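As a simplified illustration of that kind of adjustment (and not the Cohen Center’s actual weighting model), a post-stratification-style correction rescales the weights of respondents in a category so that the weighted total matches the known benchmark. The numbers below reuse the day-school example; the inputs and the function itself are hypothetical.

    # Toy post-stratification example -- not the actual CMJS weighting procedure.
    # If the weighted survey estimate implies 2,000 day-school students but the
    # known benchmark (e.g., the Scorecard) says 800, scale those weights down.
    def adjust_weights(weights, in_category, benchmark_total):
        """Rescale the weights of respondents in a category so the weighted
        count for that category equals the known benchmark_total."""
        estimated = sum(w for w, flag in zip(weights, in_category) if flag)
        factor = benchmark_total / estimated
        return [w * factor if flag else w
                for w, flag in zip(weights, in_category)]

    # Hypothetical inputs: three household weights implying 2,000 students.
    weights = [500.0, 750.0, 750.0]
    in_category = [True, True, True]
    adjusted = adjust_weights(weights, in_category, benchmark_total=800)
    # sum(adjusted) == 800.0 -- each weight was multiplied by 0.4

In a real study, corrections like this are applied across many interlocking characteristics (age, geography, household type) and iterated until all the benchmarks are matched at once, rather than one category at a time.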

A system of checks and balances was put in place to ensure that there were no aberrations.

Community Input

In October 2017, we developed an executive summary of the study’s preliminary findings. Though CMJS continued to weight and parse the data, we had enough information to begin sharing it with organizational stakeholders. We had two questions in mind: a) Does this comport with your understanding of the community? b) What would you like to see in the final report that will inform your decision-making processes?

We have continued to host those meetings and conversations, providing constant feedback to CMJS.

Now, in February 2018, we are proud to publish the final report. We will continue to analyze the data we have collected and simultaneously turn our attention to the follow-up studies.

Because now that we have finished the hard part, it’s time for the hard part.