Special Focus Issue

Featured Articles from Around the Globe

Reflections on the Assessment of Practice Competencies: Community Psychology, Liberation Psychology, and Ecopsychology Specialization, M.A./Ph.D. Depth Psychology Program, Pacifica Graduate Institute


Author(s): Nuria Ciofalo, Susan James, Mary Watkins

Abstract:

The framework of the Comprehensive Assessment Plan of the Community Psychology, Liberation Psychology, and Ecopsychology Specialization of the M.A./Ph.D. Depth Psychology Program is guided by the values and goals set forth in Pacifica Graduate Institute’s Strategic Plan. The Practice Competencies designed by the Community Psychology Practice Council and the Council for Education Programs of the Society for Community Research and Action (SCRA – Division 27, American Psychological Association) were integrated into the development of a rubric that includes particular competencies that are based on depth psychological and ecopsychological abilities, capacities, and sensitivities nurtured in our values and curriculum. This rubric was applied to praxis courses that are specifically oriented to the application of knowledge and competencies earned in theoretical courses. Findings and reflections obtained from the application of the adapted rubric and narrative assessments were combined with the analysis of portfolios and video-documentation as testimonies of attained practice competencies.


Article:

Download the PDF version for full article, including all tables, figures, and appendices

 

The Assessment Framework

The Community Psychology, Liberation Psychology, and Ecopsychology (CLE) Specialization of the M.A./Ph.D. Depth Psychology Program at the Pacifica Graduate Institute has concluded its 6th year, and is actively evolving a participatory assessment and implementation plan that reflects its integrative values and goals. Our curriculum strives to foster community psychology competencies, as well as depth psychological, ecopsychological, and racial and environmental justice sensitivities and competencies.

This ambitious framework requires ongoing faculty and student efforts at articulating our mutual pedagogical goals and developing ways to assess our degree programs for opportunities and challenges in achieving them. We are working both within the institutional assessment efforts of Pacifica Graduate Institute and within the “Guiding Principles and Competencies for Community Psychology Practice” set forth by the Society for Community Research and Action (SCRA – Division 27, American Psychological Association), and developed by the SCRA Community Psychology Practice Council and the SCRA Council for Education Programs (Elias et al., 2015, pp. 35-53; Sarkisian et al., 2013; SCRA, 2012). Furthermore, we have created additional assessment criteria developed primarily for the classroom from work that has recently evolved out of Students of Color and Racial Justice Allies groups, as well as core values specific to our specialization that were developed with faculty involvement as a means to a values-driven pedagogy (Taylor & Sarkisian, 2011).

The framework of Pacifica Graduate Institute’s (Pacifica) Comprehensive Assessment Plan (CAP) is guided by Pacifica Graduate Institute Core Values: Stewardship, Integrity, Service, Eros, and Consciousness (http://www.pacifica.edu/about-pacifica/core-values?highlight=WyJjb3JlIiwidmFsdWVzIiwiY29yZSB2YWx1ZXMiXQ==).

In addition, this plan focuses on the attainment of the following goals:

  1. strengthen and enhance academic rigor and vigor across all programs;
  2. develop institutional assessment for improved planning and intervention;
  3. continue assessment for organizational learning; and
  4. resource infrastructure to provide academic excellence.

In order to be an integral and central part of organizational learning, assessment must go beyond compliance and invoke values of reflection, learning, commitment, and community, all of which are deep-seated in Pacifica’s mission and values. As such, one of the most important objectives of the CAP is to foster shared responsibility for and commitment to ongoing assessment and learning (Fetterman et al., 2014; Fetterman & Wandersman, 2015).

Learning and assessment form a multi-faceted, multi-layered, cyclical, and systemic process with multiple, interdependent, and interacting feedback loops. The plan involves processes and structures working simultaneously on five related but distinct levels of learning and assessment: student, course, program, organization, and community—located both within each and across Academic Programs and embedded in the institution’s ecological context. Assessment is conceived as a process of participatory learning, knowledge creation, and knowledge management, a process that is embraced by the organization and that is transformed into a reflective, collaborative, action-oriented, and incremental approach to improved program planning and implementation. This includes reflexive learning that fosters continuous, transformative, and sustainable educational excellence.

The Assessment Strategy

This CAP strategy strives to create a solid and sustainable process of assessment ownership and buy-in. It is highly aligned with Pacifica’s ways of knowing and utilization of learning for personal, social, and educational change, thus making it culturally compatible and increasing the probability of promoting its sustainability. Further, it aims to facilitate the ecological/holistic growth of individuals, organizations, and communities that are capable of their own self-transformation. It emphasizes the need to value existing individual and organizational knowledge and to practice it in community, incrementally, and cautiously. It is based on deep reflection on expected outcomes and critical evaluation of their impacts in order to adapt knowledge for continuous, self-driven, transformational learning.

Components of the CAP

The CAP has four components to assess effective educational excellence:

  1. Student Success
  2. Academic Rigor and Effectiveness
  3. Centralized Data Collection and Management System
  4. Closing the Loop: Informed Decision-Making

Each component has interdependent and interactive factors or subcomponents that enable or hinder the application of knowledge for program improvement. The plan calls for the assessment of such factors in order to plan interventions. In this paper, we will focus on reflections on Program Learning Outcomes as measured by an adapted CLE-SCRA Practice Competencies rubric. We will share the additional competencies offered by our community and ecological fieldwork and research faculty advisors. Lastly, we will briefly describe initial efforts at assessing competencies based on e-portfolios and video-documentation.

The M.A./Ph.D. Depth Psychology Program, Community Psychology, Liberation Psychology, and Ecopsychology (CLE) Specialization at Pacifica Graduate Institute

This specialization is a bold initiative to forge interdisciplinary transformative approaches to the personal, community, cultural, and ecological challenges of our time. While grounding students in psychoanalytic, Jungian, archetypal, and phenomenological lineages of depth psychology, Euro-American depth psychological theories and practices are placed in dynamic dialogue with ecopsychology, decolonial studies, critical community psychology, and indigenous and liberation psychologies. To study community and ecopsychology in the light of liberation psychology is to commit to the exploration of the profound effects of injustice, violence, and the exploitation of others and nature on psychological, communal, and ecological well-being.  It is a commitment to create paths to peace and reconciliation, justice, and sustainability (https://www.pacifica.edu/degree-programs/ma-phd-community-psychology-liberation-psychology-ecopsychology ).

Depth Psychological Community/Ecological Fieldwork and Research

The M.A./Ph.D. Depth Psychology Program, CLE Specialization, is committed to a holistic understanding of psychological well-being, seeing individual, familial, community, environmental, and cultural well-being as inextricably interlinked. Through community and ecological fieldwork and research, students work in the area of their calling, while deepening their ethical discernment, reflecting on their own positionality, widening their repertoire of dialogue and arts-based approaches, and gathering the theoretical insight and practical skills to conduct participatory action research and community and organizational program evaluation. Praxis classes mentor students in innovative group approaches: council/circle, appreciative inquiry, theater of the oppressed, public conversation, open space technology, community dreamwork, liberation arts, restorative justice, somatic approaches to trauma healing, conflict transformation, and imaginal and ritual approaches to community health and healing.

To hold in mind the intricate workings of psyche in the context of the complex dynamics of culture and history is a difficult undertaking. In this specialization, we also draw on theories, insights, and practices from critical community psychology, liberation psychology, ecopsychology, and indigenous psychologies. We are keenly aware of the importance of dialogical and interdependent relationships, consensus building, and the arts to developing critical resistance. In the fieldwork portion of this specialization, each student, in conversation with a fieldwork mentor, discerns the area(s) of their passionate interest, and engages in two fieldwork immersions (one each summer) to deepen their understanding of work being pursued in the area of their interest and to contribute to the ongoing work of their setting. During the second fieldwork experience, students are encouraged to engage in a piece of research, hopefully of a participatory nature, applying ethics and principles of research, and contributing to addressing an issue or concern which the community and they think is important to study. This work enables students to contribute and to hone research and program evaluation skills, including asset mapping and appreciative inquiry. Students join in the work of ongoing community groups (see fieldwork manual).

The Assessment Approach of Fieldwork Practice Competencies

Methods

CLE’s assessment plan is implemented based on a multi-faceted, multi-layered, cyclical, and systemic process with multiple interdependent and interacting feedback loops. It involves assessment processes and structures working simultaneously on six related but distinct levels of learning: (1) student, (2) faculty, (3) course, (4) program, (5) organizational, and (6) community praxis (i.e., transformative practice), located both within each and across our specialization areas, including its ecological context (local and global community).

The eighteen SCRA Practice Competencies were integrated into the development of a rubric that includes particular competencies based on sixteen depth psychological and seven ecopsychological capacities and sensitivities nurtured in our values and curriculum. The rubric was applied to the assessment of students’ competencies in several praxis courses. Qualitative assessment examined how practice competencies were reflected in student essays, stories published in newsletters, e-portfolios, and testimonies gathered by means of narrative inquiry and video documentation.

Participants

The co-researchers in this study were three fieldwork advisors, two class instructors, and fifty students engaged in assessing practice competencies during the 2012-2016 academic years. Seventeen students in 2012, eleven students in 2013, and ten students in 2014, who were enrolled in the summer Fieldwork I course and in the “Phenomenology and Communication of Depth Psychological, Cultural, and Ecological Work” class, were assessed by one instructor and three advisors applying a 45-item rubric in 2012-2013 and a revised 42-item rubric in 2014. In 2015, seven students enrolled in these classes, five students enrolled in Fieldwork II and in a third-year “Participatory Research Practicum” class, their student peers, three fieldwork advisors, and two class instructors were engaged in this assessment, applying the revised 42-item rubric. Lastly, three core faculty conducted reflections on findings throughout these years.

Materials and Procedures

The list of the 18 SCRA Practice Competencies was slightly modified by separating item 12 into two items: 12, “Collaboration,” and 13, “Coalition Development.” The final adapted CLE-SCRA rubric has a total of 19 SCRA items and an additional 23 CLE items: 16 depth psychological and 7 ecopsychological competencies (see Appendix B). The rationale was that building collaborations is an important practice competency that helps to develop more complex competencies such as coalition development. We considered that our curriculum most likely provides skills in the former, particularly at the Master’s level, and that the latter may be obtained at the Ph.D. level, that is, in the third-year classes and during dissertation research. With this rationale in mind, we reflected on skills developed in the first year of fieldwork compared with second-year fieldwork and third-year coursework.

The CLE-SCRA Practice Competencies rubric scale included three levels of mastery:

(1) Exposure: In core community courses, all students learn about the value of this competency and how it can be applied in community psychology practice.

(2) Experience: In selected courses, including supervised fieldwork, students can choose to gain supervised practice in performing tasks and actions related to the competency.

(3) Expertise: Upper level students can choose competencies in which to develop further experience and attain a higher level of expertise. This might involve several field experiences over several terms or years. Postgraduate experiences and continuing education allow further development of expertise in specific competencies. (SCRA, 2012, p. 9 cited in Sarkisian et al., 2013, p. 2)

Not Applicable (N/A): We added this column to the rubric in 2014 to indicate that the competency was not evidenced because the goals of the fieldwork project did not address it.

The rubric was first created in 2012 with a total of 45 items and revised in 2014 with a total of 42 items (see Appendix A). It was applied in the four praxis courses described above (see curriculum https://www.pacifica.edu/degree-programs/ma-phd-community-psychology-liberation-psychology-ecopsychology/cle-curriculum-overview). Data obtained from various sources reporting on fieldwork competencies was used in this assessment and applied at different levels:

  1. fieldwork advisors’ assessment of the students’ practice competencies evidenced in fieldwork reports utilizing the adapted CLE-SCRA rubric;
  2. fieldwork advisors’ overall narrative feedback;
  3. students’ peer assessments of fieldwork presentations applying the rubric in two classes;
  4. faculty’s narrative analysis of yearly newsletters, students’ portfolios, and a video documenting testimonies of students’ learning outcomes, and
  5. self-evaluation of third year students’ reflection essays on their perceptions of attainment of practice competencies applying the rubric.

Quantitative data was analyzed using software to determine means. Qualitative data obtained by means of students’ self-reflections was analyzed using content analysis (manual NVivo analysis), determining ranks based on the frequencies of competencies cited under the rubric domains. In addition, narrative analysis of newsletter contents featuring student fieldwork was conducted by applying the rubric based on faculty reflection. Students were also invited to create e-portfolios and showcase their work as a tool for learning outcome assessment as well as for career searches. One student, who was part of our first-year cohort in 2012 when our specialization started, developed an e-portfolio that was used as a case study to reflect on the attainment of practice competencies. Lastly, a video documenting testimonies of students was included in the analysis. Data was triangulated to determine overall findings.
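As a rough illustration of the frequency-ranking step only, the sketch below tallies how often each competency domain is coded across a set of student reflections and orders the domains from most to least cited. The domain labels and coded lists are hypothetical placeholders, not the study's data, and the actual coding was conducted manually rather than with a script.

```python
# Minimal sketch: rank competency domains by how often they are coded
# across student self-reflection essays. The coded lists below are
# hypothetical placeholders, not the study's actual codes.
from collections import Counter

coded_essays = [
    ["Depth Psychological", "Foundational Principles", "Ecopsychological"],
    ["Depth Psychological", "Sociocultural and Cross-Cultural", "Foundational Principles"],
    ["Depth Psychological", "Community Research"],
]

# Count every occurrence of a domain code and rank from most to least cited.
counts = Counter(code for essay in coded_essays for code in essay)
for rank, (domain, freq) in enumerate(counts.most_common(), start=1):
    print(f"{rank}. {domain}: cited {freq} times")
```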

Quantitative Findings

Fieldwork Practice Competencies Assessment in 2012 and 2013

Seventeen students in 2012 and eleven students in 2013, who were enrolled in the summer Fieldwork I course and the Phenomenology and Presentation of Depth Psychological, Cultural, and Ecological Work course, were assessed on the 45 Competencies for Community Psychology Practice. In 2012 and 2013, three fieldwork advisors assessed the students’ fieldwork reports. In addition, in 2013 the fieldwork coordinator and instructor of this class assessed the students’ presentations applying the rubric. Figure 1 displays the range of the average scores for the 45 competencies, combining years 2012 and 2013.

Figure 1: 2012 and 2013 Fieldwork Practice Competencies (See PDF version or image below)

The number of times a practice competency was reported is described inside the bars. Overall, the competency score averages were much higher in 2013 than in 2012. In 2012, the competency scores averaged 2.2 and in 2013, 2.7. In 2012, only four of the 45 competencies had averages of 2.5 or greater; in 2013, only 10 of the 45 competencies had averages below 2.5. In 2013, no competency had an average score of less than 2, but in 2012, 9 of the 45 competencies yielded average scores lower than 2. The low-rated competencies (average scores between 1.7 and 1.9) were: 10 (Resource Development); 11 (Consultation & Organizational Development); 18 (Participatory Community Research); 30 (Critical awareness of ethnocentric tendencies through language, tone, and expression of entitlement, including subtleties); 33 (Critical awareness of the connection between exoticism and racism), and 34 (Understanding of differences among invasive, participatory, respectful, and trusting research and action processes, including but not limited to recording, filming, and photography, when developing a culturally sensitive research protocol in interaction with community). In addition, 16 (Public Policy Analysis, Development & Advocacy) and 19 (Program Evaluation) received even lower average scores, between 1.5 and 1.7. The rubric with item/competency descriptions and their averages for the two years is displayed in Appendix A.

What was striking about this data set was the large amount of missing data. The 2012 students averaged 28 evaluated competencies and 17 unevaluated competencies. The 2013 students averaged 18 evaluated competencies and 27 unevaluated competencies.

Faculty reflections in the 2012 and 2013 retreats

Faculty retreats were used to reflect on findings, particularly on the average-to-below-average scores (items 10, 11, 18, 30, 33, and 34). Resource development and consultation and organizational development (items 10 and 11) are taught in courses placed at the end of the second year and in several other courses spread over the third-year curriculum. Fieldwork presentations take place at the beginning of the second year; consequently, it is expected that these did not evidence application of these competencies. Further, students had not yet been exposed to a series of content areas that could have given them the opportunity to develop competencies in intercultural research (items 30, 33, and 34). In addition, items 16 (public policy) and 17 (dissemination to build public awareness) ranked only average. We decided to review our curriculum and planned for the integration of course content in these areas, as well as to expand international research opportunities.

Among the highest-ranking competencies were those that dealt with ecological principles, empowerment, sociocultural competency, reflexive practice, small group processes, collaboration, self-reflexivity, building community capacity, legitimizing other ways of knowing, and various depth psychological sensibilities. These findings appeared to align with the salience of the course contents and the kinds of learning objectives emphasized in our curriculum. Further, there was an improvement in the degree of exposure to several practice competencies between 2012 and 2013. We concluded that the fact that many of the added competencies were not ranked (left blank) could be an indication that they were not applicable to the goals of the fieldwork project. This also suggested the need to revise and reduce the items and to add a Not Applicable (N/A) rating. We invited fieldwork advisors to collaborate in the revision of the rubric.

Fieldwork Practice Competencies in 2014

In 2014, we added a “Not Applicable” (N/A) option due to the large amount of missing data in the 2012-2013 evaluation. We revised the rubric, eliminating items that had a high number of missing scores. Missing data may have occurred when evaluators did not rank a competency because it was not evidenced in the student's work. The evaluators (faculty and students) were instructed to mark N/A if the competency was not relevant to the topic of the fieldwork, instead of not ranking the competency at any level because it was not evidenced in the fieldwork reports or presentations. N/A data may indicate that: (1) the student's work did not evidence the application of the competency because the work did not require it (e.g., program development when the student's work does not address this as a goal of the project), or (2) the student did not use this competency because it had not yet been delivered as part of the curriculum at the student's level (i.e., first or second year). N/A scores were aggregated when averaging student-by-competency ratings.

We engaged fieldwork advisors in the development of a revised rubric with a total of 42 items. One instructor and the students’ peers rated the students’ fieldwork presentations in the second-year class. On average, each of the ten students received about 10 “Expertise” ratings, 16 “Experience” ratings, and 16 “Not Applicable” (N/A) ratings out of a total of 42 possible ratings. Appendix B describes the average scores for each item. There was again missing data: ten students were evaluated on 42 items, and only 62% of these assessments were actually recorded. Of the 42 competencies assessed, 17 (38%) yielded N/A 50% or more of the time (corresponding to a count of 5 or more in the column labeled N/A).
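To make this kind of tallying concrete, the sketch below shows one way item averages and N/A counts could be computed, excluding N/A marks from the means and flagging items that were N/A for half or more of the students, as described above. The data layout, item labels, and function name are illustrative assumptions, not the authors' actual procedure or data.

```python
# Minimal sketch (assumed data layout): each student's ratings are a dict
# mapping rubric item -> 1 (Exposure), 2 (Experience), 3 (Expertise), or "N/A".
from statistics import mean

def summarize_item(ratings, item):
    """Average the numeric ratings for one rubric item, excluding N/A,
    and count how often the item was marked N/A."""
    scores = [r[item] for r in ratings if isinstance(r.get(item), (int, float))]
    n_a = sum(1 for r in ratings if r.get(item) == "N/A")
    avg = round(mean(scores), 2) if scores else None
    return avg, n_a

# Hypothetical ratings for three items across a few students.
students = [
    {"item_19_program_evaluation": "N/A", "item_38_ecopsych": 3, "item_20_depth_listening": 2},
    {"item_19_program_evaluation": 1,     "item_38_ecopsych": 3, "item_20_depth_listening": 3},
    {"item_19_program_evaluation": "N/A", "item_38_ecopsych": 2, "item_20_depth_listening": 2},
]

for item in students[0]:
    avg, n_a = summarize_item(students, item)
    # Flag items that were N/A for half or more of the students,
    # mirroring the "5 or more out of 10" criterion described above.
    flag = " (N/A for >= 50% of students)" if n_a >= len(students) / 2 else ""
    print(f"{item}: mean = {avg}, N/A count = {n_a}{flag}")
```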

Faculty reflections in the 2014 retreat

Almost all competencies had an average score of 2 or above, with the exception of program evaluation under the community research competencies (see Appendix B: item 19) and one under the depth psychological sensitivities (item 26). We recognized that students are exposed to foundations in qualitative research methodologies during the spring quarter of the first year, just before they embark on their Fieldwork I in the summer. The Fieldwork Handbook instructs them to apply depth psychological sensitivities in their first fieldwork experience and to listen to the community voices without imposing their own research questions and intervention agendas. Students are encouraged to apply research methodologies in their second fieldwork in a participatory and dialogical manner, exploring with community members questions of mutual interest and import. Further, students are not exposed to evaluation methodologies before the spring quarter of their second year, just before their Fieldwork II. Since these data represent assessment of Fieldwork I practice competencies, it is not surprising that these areas ranked low. As a result, we decided to conduct assessment of Fieldwork II.

Competencies that received average scores of 3 (expertise) are within the ecopsychological competencies (items 38, 40, 41, and 42) and one item (35) under the depth psychological sensitivities. These findings suggest that the 2014 curriculum allowed students to practice ecopsychological competencies that may have been reinforced beyond the single ecopsychology course offered in the first year.

Fieldwork Practice Competencies in 2015

During the 2015-2016 academic year, we engaged seven students involved in their first-year fieldwork and five students involved in their second-year fieldwork, who were also engaged in fieldwork presentations in two classes (2nd and 3rd year, respectively). In addition, CLE faculty reflected on findings and proposed both assessment improvements and curriculum review to strengthen praxis competencies in students.

The rubric was applied at different levels: (1) three fieldwork advisors assessed the fieldwork reports; (2) the two instructors of the courses in which students presented their fieldwork assessed the student products, and (3) students acted as peer reviewers and assessed one another’s fieldwork presentations. Ratings for all the competencies were given for each student by fieldwork advisors, faculty, and student peers. Ratings were made on a scale ranging from 1 (lowest) to 3 (highest), where (1) is Exposure, (2) Experience, and (3) Expertise in the attainment of praxis learning outcomes. Table 1 describes the findings.

Table 1: 2015 Comparison of Scores between Fieldwork I (FI) and Fieldwork II (FII) (See PDF version or image below)

For each of the 5 Fieldwork II students, there was one rating by a faculty member and between 3 and 5 peer ratings. For each of the 7 Fieldwork I students, there were between 1 and 3 ratings by fieldwork advisors and one faculty member, and between 9 and 11 ratings by student peers. For each student, the advisor/faculty ratings and the student peer ratings under each competency were added and averaged. There were some differences between the ratings given by advisors, faculty, and peers; however, the differences were minimal. Both scores (those given by faculty/advisors and the peer scores) were averaged to determine the overall score for each competency domain and competency. Detailed scores for each of the competencies under Fieldwork I and II are reported in Appendix C.
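The short sketch below illustrates one plausible reading of this averaging procedure, in which the faculty/advisor ratings and the peer ratings for a competency are first averaged separately and the two group means are then combined into an overall score. The equal weighting of the two groups, the function name, and the numbers are assumptions for illustration, not the authors' exact computation.

```python
# Minimal sketch of the averaging described above, under the assumption that
# faculty/advisor ratings and peer ratings are averaged separately per
# competency and the two group means are then averaged into an overall score.
# The ratings below are invented for illustration only.
from statistics import mean

def competency_score(faculty_ratings, peer_ratings):
    """Average faculty/advisor and peer ratings separately, then combine."""
    return round((mean(faculty_ratings) + mean(peer_ratings)) / 2, 2)

# One hypothetical Fieldwork II student, one competency
# (1 = Exposure, 2 = Experience, 3 = Expertise).
faculty = [3]          # one faculty rating, as for Fieldwork II students
peers = [2, 3, 2, 3]   # three to five peer ratings

print(competency_score(faculty, peers))  # -> 2.75
```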

Findings suggest that practice competencies perceived by advisors and students did not improve considerably from those rated under Fieldwork I to those rated under Fieldwork II. However, there are slight average improvements in Fieldwork II in all of the competencies except the last two competencies under the ecopsychological competencies domain and one under the community research domain (see Appendix C, items 41, 42, and 16). The Community Research competency domain shows a low score under Fieldwork I for the competency program evaluation (1.9) but an overall increased average in Fieldwork II. The last two competencies (41 and 42) of the Ecopsychology domain had averages of 1.9 in Fieldwork I compared with 2.6 in Fieldwork II, and 2.1 in Fieldwork I compared with 1.95 in Fieldwork II, respectively, suggesting a decrease in the last competency (42) in Fieldwork II. However, as Table 1 above shows, the overall score of the Community Research domain improved in Fieldwork II (1.97 in Fieldwork I and 2.62 in Fieldwork II). The overall score of the Ecopsychology domain also showed a slight improvement (2.3 in Fieldwork I and 2.59 in Fieldwork II). Furthermore, both faculty and students perceived there was some curricular exposure to the practice of competencies listed under the two domains (research and ecopsychology), as evidenced in the students’ selected fieldwork reports and presentations.

It is important to note that at the time of assessment, Fieldwork I students had been exposed to only one ecopsychology class and one foundations-in-research class, although research and ecopsychology are integrated into other classes. Fieldwork II students take second-year classes specifically targeting the areas of ecopsychology and program evaluation in the spring quarter (and, in the third-year fall quarter, after their fieldwork, a Participatory Research Practicum). This may explain the lower scores under these two domains in the Fieldwork I findings.

Faculty reflections in the 2015-2016 retreat

Overall, it is expected that after Fieldwork II students may gain more mastery in these competencies. We will follow up by assessing practice competencies during the dissertation and writing phase to determine whether these are strengthened to level 3 (expertise). However, as Table 1 above suggests, the fact that six practice competency domains were assessed at the experience level (2), one at the exposure level (community research under Fieldwork I), and none at the expertise level (3) alerts us that we need to continue reviewing our curriculum and strengthening these competencies.

We decided to expand this assessment with potential qualitative evidence of practice competencies through a content analysis of yearly newsletters, e-portfolios, and video-documentation in which students showcase their fieldwork, recent publications, recent jobs and positions, recent grants, and scholarships.

Qualitative Findings

Student Reflective Essays

The analysis of 8 third-year student reflections applying the rubric during the Participatory Research Practicum class in the winter quarter of 2015 was used to train students in how to do content analysis of the gathered testimonies. The instructor analyzed the stories applying manual NVivo procedures, utilizing the rubric domains to code the narrative data. Codes were ranked based on frequencies. Table 2 describes the findings. The Depth Psychological competency domain was reported most frequently (31 times). The second most frequently cited SCRA practice competency domain was Foundational Principles, with a total count of 23. The data suggest that students perceived the curriculum as mostly developing and strengthening depth psychological competencies, primarily “their ability to listen to unconscious dimensions of individuals and communities” and “the capacity to listen into the margins of a situation, to those voices that have been marginalized, repressed, or silenced” (see rubric Appendix B, DP items 20 and 31 respectively).

Table 2: Frequency of Practice Competencies’ Domains in Self-reflection Essays (See PDF version or image below)

In the quantitative analysis, this competency domain received a score of 2.6 in Fieldwork I and 2.7 in Fieldwork II. Students also reported sociocultural and cross-cultural competencies as being cultivated in their studies and applied in fieldwork settings (cited 9 times; rank 3).

The fourth most frequently cited (7 times) competency domain was Ecopsychological Competencies, particularly “the capacity to listen to the built and natural environments” (EP item 36) and “care to pay attention to the history (natural and cultural) of a particular place under consideration” (EP item 37) (see Appendix C). However, this competency domain scored very low in the quantitative assessment: 1.1 and 1.5 for Fieldwork I and II respectively. The least frequently cited competencies earned in the specialization were reported under Community Research (CR, cited only 4 times), Community and Organizational Capacity Building (CAB, cited only 2 times), and Community and/or Program Development (CD/PD, cited only 2 times). It is important to note that Community Research also received a low score (1.8) in the quantitative data analysis. Although findings are tentative given the small number (8) of student participants, results suggest the need to strengthen the latter practice competencies, at least within this cohort. Other qualitative data sources were used to determine whether this perception is representative of all CLE students. For this purpose, student work reported in our yearly newsletters, one e-portfolio, and a video documenting testimonies of perceived learning and praxis outcomes were used to complement the qualitative assessment.

Analysis of the Newsletters

Since 2013, our specialization has published yearly newsletters for which we select student portfolios of work conducted during their fieldwork and/or with community groups or organizations during the course of their studies. The selected themes of the students’ articles reporting on lessons learned from their fieldwork or continued community work were coded applying the rubric. It is important to note that this assessment is not representative of all the students’ work but was guided by pre-determined faculty criteria to select stories that may inspire and serve as models originating within the specialization. Thus, the assessment already includes a bias toward detecting particular practice competencies. Nonetheless, we found this approach useful for reflecting on manifestations of praxis competencies and on what we considered important. Table 3 shows the frequency of competency domains by publication year. The practice competency domains most frequently reported in the students’ fieldwork stories published in the newsletters from 2013-2016 are:

  1. Depth Psychological Competencies (46),
  2. Foundational Principles (40), and
  3. Social Change (33).

Table 3: Frequencies of Practice Competencies’ Domain Reported in the Narrative Analysis of Newsletters (See PDF version or image below)

These practice competencies are also among those most frequently cited or reported under the previous quantitative and qualitative analysis suggesting alignment of findings among different data sources. In order to strengthen our participatory self-assessment, we would like to hear from our readers and colleagues the kinds of practice competencies they perceive after reading our newsletters (https://www.pacifica.edu/degree-programs/ma-phd-community-psychology-liberation-psychology-ecopsychology/cle-news-and-events?highlight=WyJuZXdzbGV0dGVyIl0=)

Example of a Student e-Portfolio

One of our fifth-year dissertation students, Shelly Stratton, developed an e-portfolio in preparation for a Practice Competencies presentation at the 2013 SCRA Biennial Conference. The e-portfolio aims to show how practice competencies contained in the rubric are evidenced by the student’s work. We also viewed this approach as beneficial not only for program review purposes but also for the students, as e-portfolios can be used in a career search, for example, to accompany a resume when applying for jobs, or for career advancement when placed on a personal website. The e-portfolio was presented at this conference, and feedback from participants and presenters was collected to improve our assessment approaches and tools (see https://pacifica.digication.com/shelly_stratton_licsw2/Welcome/published). We would like to invite our readers and colleagues to assess the evidence of competencies perceived in this student’s work and to share their feedback with us.

Video-documentation

In 2014 we created a video to share our curriculum and selected student work. The following link will guide you in determining the presence of practice competencies reported in the students’ testimonies. We look forward to receiving your appraisal in order to expand dialogue about improving and strengthening our students’ practice competencies. (https://www.youtube.com/watch?time_continue=576&v=6WRD46CPTS8)

 

Conclusions and Recommendations

This study is tentative and has a series of limitations, particularly given the small sample of students and faculty engaged in assessing the practice competencies and the consequent need to use non-parametric, descriptive statistics, as well as qualitative methods that may not have yielded valid and reliable findings. However, it has allowed us to reflect on existing data patterns from diverse sources and to triangulate findings. The CLE-SCRA Practice Competencies rubric was a very useful tool for noting gaps and opportunities in our curriculum and for probing indicators of the presence or absence of particular competencies. Elias et al. (2015) pointed out:

As no individual community psychologist is expected to possess or flawlessly execute all of the current 18 competencies (and those that will emerge in the future), practitioners will vary in their level of expertise competency by competency, based on education, training, personal values, and experience . . . All practitioners should have been exposed to the competencies, and their underlying values and principles, at some point of their career and ideally during their graduate-level professional training. However, the specific competencies they develop experience and expertise in are shaped by their training context, work settings, and lifelong learning that they pursue. The only exceptions to this line of thinking are the competencies considered Foundational Principles. (p. 48)

Overall, results from 2012-2016 showed a range between 2 and 2.8 (experience) in the attainment of foundational competencies, suggesting that this requirement was met. This assessment allowed us to review our curriculum, targeting the competencies that appeared less supported by our pedagogical interventions, and to reinforce those that appeared stronger. We hope to expand our participatory self-assessment by means of increased dialogue with the larger SCRA community and the global community at large, and we look forward to your inspiring feedback and recommendations.

References

Elias, M. J., Neigher, W. D., & Johnson-Hakim, S. (2015). Guiding principles and competencies for community psychology practice. In V. C. Scott & S. M. Wolfe (Eds.), Community psychology: Foundations for practice. Los Angeles, CA: Sage.

Fetterman, D. M., Kaftarian, S., & Wandersman, A. (2014). Empowerment evaluation: Knowledge and tools for self-assessment, evaluation capacity building, and accountability. Thousand Oaks, CA: Sage.

Fetterman, D. M., & Wandersman, A. (2005). Empowerment evaluation principles in practice. New York, NY: Guilford Publications.

Sarkisian, G. V., Saleem, M. A., Simpkin, J., Weidenbacher, A., Bartko, N., & Taylor, S. (2013). A learning journey II: Learned course maps as a basis to explore how students learned community psychology practice competencies in a community coalition building course. Global Journal of Community Psychology Practice, 4(4), 1-12. Retrieved May 5, 2016, from http://www.gjcpp.org/

Society for Community Research and Action (2012). Competencies for community psychology practice: Draft August 15, 2012. The Community Psychologist, 45(4), 7-14.

Taylor, S., & Sarkisian, G. V. (2011). Community psychology values-driven pedagogy: The foundation for empowering educational settings. Global Journal of Community Psychology Practice, 2(2), 19-30. Retrieved May 5, 2016, from http://www.gjcpp.org/

 

Download the PDF version for full article, including all tables, figures, and appendices.



Author(s)

Nuria Ciofalo, Susan James, Mary Watkins

Nuria Ciofalo, Ph.D., is co-chair of the Community Psychology, Liberation Psychology, and Ecopsychology specialization. Born in Mexico, she gained her B.A. and first M.A. in Germany, where she specialized in psychoanalytic theories, particularly Jung and Adler, and her second M.A. and Ph.D. in a community-focused psychology program at the University of Hawaii. From 1982 to 1987, she was a professor of psychology at the University of Xochicalco (Mexico), serving as chair of the Psychology Department for five years, and taught for two summers in an M.A. psychology summer program at the University of Las Americas. Since her doctorate in 1996, she has worked in a wide variety of research situations in the U.S. and Mexico, training others to do research and managing and evaluating large-scale research projects. She has been a Senior Evaluation Analyst at The California Endowment. Recent publications include: Cultural-religious empowerment, popular power, and contra power: A demand for Indigenous Rights (2014), Revista de Psicologia Social Comunitaria, 3(2), and Heroes and martyrs against alienation: Growing up as puer and puella in post-modern society (2013), Global Journal of Community Psychology Practice.

Susan James, Ph.D., is co-Chair of the Community Psychology, Liberation Psychology, and Ecopsychology specialization and a community psychologist and digital media producer. Dr. James' work focuses on understanding cultural ecologies and disseminating social science research findings using visual design solutions and film, a practice she pioneered over a decade ago. James established innovative action research agendas and directed large-scale projects while holding senior positions at New York University, University of Chicago, and the National Center on Addiction and Substance Abuse (CASA) at Columbia University. Her own research focuses on structural violence as a determinant of well-being, and is published in American Journal of Community Psychology and Violence Against Women. She created Research Imaging Productions, a research and design consultancy that conducts social research and produces digital communications products for the nonprofit sector. She earned a BA from Sarah Lawrence College and a Ph.D. from New York University. Dr. James has been a guest member of the psychology faculty at Sarah Lawrence College, and a faculty member in the department of Africology at University of Wisconsin-Milwaukee.

Mary Watkins, Ph.D., is chair of the Depth Psychology Program and serves as Coordinator of Community and Ecological Fieldwork and Research in the Community Psychology, Liberation Psychology, and Ecopsychology specialization. She is a clinical and developmental psychologist and was an early member of the archetypal/imaginal psychology movement. She has worked in a wide variety of clinical settings and with groups on issues of peace, diversity, social justice, reconciliation, immigration, and the envisioning of community and cultural transformation. She is the author of Waking Dreams and Invisible Guests: The Development of Imaginal Dialogues, co-author of Toward Psychologies of Liberation, Talking with Young Children about Adoption, and Up Against the Wall: Re-Imagining the U.S.-Mexico Border (http://www.pacificabookstore.com/against-wall-re-imagining-us-mexico-border), and co-editor of Psychology and the Promotion of Peace. www.mary-watkins.net



Keywords: Program Assessment; Community Psychology; Liberation Psychology; Ecopsychology; Pacifica Graduate Institute; SCRA Practice Competencies