Conducting a Culturally-Relevant Evaluation for your Grants
Kavita Mittapalli, Ph.D., MN Associates, VA
Let me begin with a note of caution. I am not an expert in developing social or cultural theories and/or frameworks. However, as an evaluation practitioner of over 15 years in the United States, having worked in the field with a very wide range of stakeholders within the K-20 sector across the country, I have learned a thing or two about how to ground an evaluation project so that it is inclusive, culturally relevant, and meaningful, and yields useful and actionable results. In this article, I share a few thoughts gleaned mostly from my readings, useful resources on the topic, and some shared experiences from the field.
Defining culture and cultural competence
Culture is a cumulative body of learned, shared, and lived behavior, values, customs, and beliefs common to a particular group or society. In essence, culture makes us who we are as humans. It can be described as the socially transmitted pattern of beliefs, values, and actions shared by groups of people.
With respect to evaluation, culture affects everything: how a person with limited English proficiency understands and accesses consent forms or understands survey or interview questions; the appropriateness of survey or interview questions to the study itself; and the format and context in which data and results are analyzed and presented.
How can evaluation be culturally responsive?
An evaluation is considered culturally responsive when it takes into account the culture of the program, its context, and its people from the beginning. In other words, the evaluation is based on an examination of local mechanisms and (intended) effects/impacts through lenses in which the culture of the participants is considered an important factor. A culturally responsive evaluation also rejects the notion that program assessments must be only objective and culture-free if they are to be unbiased.
An evaluator who conducts a culturally relevant evaluation honors the cultural context in which an evaluation takes place by bringing the shared lived experiences and common understanding of the people and the places to the evaluation tasks at hand. The evaluation is holistic in nature: never linear, and always context-bound.
Culturally-Relevant Evaluation Framework
One of the most commonly used culturally responsive evaluation (CRE) frameworks in our work so far is the one developed by evaluation scholars Hood, Hopson, and Kirkhart (2015), Frierson et al. (2010), and Hopson (2009).
According to these scholars, CRE is a holistic lens/framework for centering evaluation in culture, one that recognizes that culturally defined values and beliefs lie at the heart of any evaluation work (Figure).
CRE facilitates deeper conversations, engagement, learning, and real-time continuous development and feedback for data collection, analyses, and reporting that allows open dialogue and improvement. While a traditional data collection process requires the evaluator to position as an outsider to assure independence and objectivity, the CRE approach positions the evaluator as a member of the project planning and implementation team who is integrated into the process of understanding, gathering, and interpreting local context, data, framing issues, and surfacing and testing model developments.
As an evaluator, I have found that applying the CRE framework not only while conducting an evaluation but also earlier, when developing a proposal or responding to an RFP, has made both much stronger and more grounded. Another item that my team and I have found very useful and relevant to our practice is a short checklist (below), which we use with some adaptations.
Culturally-Relevant Evaluation Checklist
- I include and engage the project’s stakeholders in the evaluation planning process(es).
- I use a variety of sources to learn about the cultural heritage and context of the project and its people.
- I seek information to better understand the cultural context of a program and its people.
- At all stages of an evaluation, I examine and reflect on the potential impact of my own culture, biases, and stereotypes around race, ethnicity, gender, socioeconomic status, and other factors.
- I pay attention to the similarities and differences in life experiences between the evaluation team and members of the target population, and consider how those dynamics might affect the evaluation.
- I select or adapt data-collection instruments (e.g., surveys, interview protocols) to ensure appropriateness for the culture(s) of the people of whom the questions are being asked, and I collaborate with the project team/stakeholders to develop the tools and instruments.
- Data-collection activities that require interaction with community members, consumers, and stakeholders are led by the team members best suited to understand the specific cultural context, based on factors such as shared/lived experiences with the target population, knowledge of the target population, and awareness of biases.
- In designing data-collection and analysis plans for answering questions about how the program/project/initiative was implemented, I pay attention to…
  …the extent of shared experiences among recipients of the program’s services.
  …the diversity (including demographics and cultural background) of program staff.
  …hierarchical (power/privilege) dynamics between and among staff that have the potential to affect project success and evaluation outcomes.
  …the organization’s historical stance and/or practice related to issues of equity.
  …the community’s context, dynamics, and makeup.
- I assess whether local demographics, socioeconomic factors, cultural factors, and other attributes of the community played a role in defining program goals and objectives.
- In analyzing and interpreting outcome data, I disaggregate data along demographics to identify and assess the extent of differential impacts of the program.
As our society becomes increasingly diverse racially, ethnically, and linguistically, it is pivotal that program planners/designers, implementers, researchers, and evaluators fully understand, appreciate, and apply in their daily work the cultural contexts in which these programs exist and operate. To ignore or deny the reality of culture, or to be agnostic or unresponsive to the needs of a population within its ecosystem, is to do the people a disservice, to put the program in danger of being ineffective, and to leave the evaluation seriously flawed and meaningless.
Hood, S., Hopson, R. K., & Kirkhart, K. E. (2015). Culturally Responsive Evaluation: Theory, Practice, and Future Implications. In Handbook of Practical Program Evaluation (4th ed.).
Frierson, H. T., Hood, S., Hughes, G. B., & Thomas, V. G. (2010). A Guide to Conducting Culturally Responsive Evaluations. In J. Frechtling (ed.), The 2010 User-Friendly Handbook for Project Evaluation (pp. 75–96). Arlington, VA: National Science Foundation.
Hopson, R. K. (2009). Reclaiming Knowledge at the Margins: Culturally Responsive Evaluation in the Current Evaluation Moment. In K. Ryan & J. B. Cousins (eds.), The SAGE International Handbook of Educational Evaluation (pp. 429–446). Thousand Oaks, CA: Sage.
The checklist was developed by Public Policy Associates and is cross-listed at http://jordaninstituteforfamilies.org/wp-content/uploads/2018/06/Self-Assessment_6-pages.pdf
MNA is a small, woman-owned education research and evaluation firm in Northern Virginia. MNA is headed by Kavita Mittapalli, Ph.D., who brings over 18 years of experience in conducting research and evaluation work for various programs and initiatives across the country. Kavita worked at various consulting firms before founding MNA in 2004. She started her career in agricultural sciences before becoming an applied sociologist and a mixed methodologist with an interest in research design in the field of education. She brings her multidisciplinary skills and knowledge to all the work she does at MNA. She is supported by four team members who bring their very diverse backgrounds, academic training, and professional experiences to MNA. To date, MNA has evaluated 31 NSF grants in various tracks, in addition to evaluating medium to large grants funded by other agencies (e.g., USDE, DOL, NASA, DODEA, and DOT). Kavita can be reached at Kavita@mnassociatesinc.com. Connect with her on LinkedIn (linkedin.com/in/kavitamittapalli) and on Twitter @KavitaMNA.