The Summer Enrichment Experience (SEE) program began as a pilot project in 2010 in New Britain, Connecticut, to serve elementary school children whose standardized test scores fell below proficiency. Summer programming for these children was mandated by the Connecticut State Department of Education to combat the learning loss that occurs during the summer months when youth are not in school, known as the ‘summer slide’ (White, 1906). From the beginning, collaboration among the school district, community-based organizations (CBOs), and community members was essential for building on resources and developing quality programming. Initial data from this pilot program with 107 fourth- and fifth-grade children showed gains in areas such as reading, writing, and math. The SEE program has been offered each summer since then, expanding in 2014 and 2015 to serve approximately 600 children. Data from these subsequent summers have continued to show that the children maintain or gain in reading and spelling.
The SEE program is a collaborative partnership among diverse stakeholders, including the school district, CBOs, and philanthropic foundations, to enhance academic and personal growth for underperforming children. The children invited to participate are those who score in the lowest 20th percentile of their grade level on their year-end standardized tests. Currently, the three-week program consists of 120 learning hours, with an academic component in the morning and an experiential component in the afternoon. The primary focus of the academic component is to maintain the reading and spelling skills acquired during the school year; it also incorporates math and speaking/presentation skills. For the afternoon component, the children choose from a variety of experiential learning activities, including urban gardening, digital storytelling, theater, dance, yoga, and Taekwondo, among others. Various CBOs provide these afternoon programs.
In the summer of 2014, the SEE committee selected and administered school engagement surveys at pre- and post-test for the program. Observations of the testing, discussions with teachers and staff, and reviews of the 2014 survey data revealed that the administration of the surveys had been inconsistent and that a large amount of data was missing. In light of these data collection difficulties, we decided to work with the SEE committee prior to the start of the 2015 program to improve both the survey itself and the administration procedures. One goal of the survey revisions and pilot testing was to make the survey more developmentally appropriate for younger children. The school engagement survey revisions and pilot testing were completed as part of a community psychology graduate course in Spring 2015. We also decided to include interviews with the children at the end of the program to learn more about their experiences.
A university mini-grant funded the pre-test data collection, and the SCRA mini-grant funded the post-test data collection in the summer of 2015. The surveys were administered by seven undergraduate and graduate students at each data collection period. These college students worked in teams of two at each site to administer the school engagement surveys to 10-14 classrooms per school site. The seventh student and the principal investigator assisted at all three sites. Over 400 surveys were completed at both pre-test and post-test. However, children who had not completed both pre-test and post-test surveys, or who did not have parental permission, had to be excluded from data analysis, resulting in 99 usable surveys. The overall findings showed that the SEE children at all three sites were highly engaged but experienced a decrease in school engagement by the end of the 3-week program. Results from the 28 children who were interviewed at the end of the program revealed that they enjoyed the program a great deal and thought that their reading scores would increase in the fall because of their participation in SEE. One nine-year-old male said, ‘Because every time I go home, I read for an hour. I’ve been doing it for the summer program’. A 10-year-old female said, ‘Summer program really helped me a lot. Told me how to spell out words and breaking words apart’.
Future Implications and Plans
Although the survey results showed that the children were less engaged at the end of the SEE program, the results have been useful in guiding efforts to increase engagement within the program. The school engagement surveys are one aspect of a larger longitudinal research study that will also include school performance data, observations, and interviews. For example, the focus on using data to enhance program quality includes observations of the teachers during the morning sessions and external evaluators hired to observe program providers in the afternoon sessions. The school engagement survey will be administered again in 2016, with attempts to obtain parental permission prior to the start of the program.
In addition to the activities mentioned earlier, a wider variety of experiential learning activities provided by the CBOs will be available for SEE in 2016, including art, poetry, photography, and Lego construction. Student choice will be another important emphasis for the 2016 summer program. The CBOs will continue to make activities available at different times during the afternoon programming to give children options to choose from. For example, last year the YWCA program used a center-based approach with different centers set up in the room for children to choose media blocks, dramatic play, or arts and crafts. Another CBO offered project-based work in which children designed, researched, and implemented a project of their choice (e.g., one group of children chose to support a charity and planned fundraising activities such as a lemonade stand and a bake sale). Student choice and varied modes of learning will remain central to the 2016 SEE program.
This attention to student learning, engagement, and choice will be critical as the program develops and continues to serve children in New Britain. Continued evaluation of the program’s processes and outcomes will be important for determining its effectiveness in combating the ‘summer slide’ and enhancing personal growth. Key to the program’s success is the solid, collaborative structure built over the years. While individual committee members may come and go, this strong structure will sustain the SEE program. The SEE committee, teachers, and CBOs are dedicated to providing high-quality, interactive learning that will help these children move toward educational success and personal growth.
We would like to acknowledge and thank the members of the 2015 SEE Committee for their collaboration and support.
- Nancy Sarra, Nancy Puglisi, John Taylor, Kim Jackson, Donna Clark, Ryan Morgan, Melissa Abate, Jennifer Wright, Victoria Russell (Consolidated School District of New Britain)
- Tracey Madden-Hennessey (YWCA of New Britain)
- Maria Sanchez (American Savings Foundation)
- Kate Lincoln (United Way of Central and Northeastern Connecticut)
- Aviva Vincent and Jennifer Tousignant
White, W. (1906). Reviews before and after vacation. American Education, 185-188.
Nghi D. Thai and Sara M. Berry
Nghi D. Thai is an Assistant Professor in the Department of Psychological Science at Central Connecticut State University (CCSU). She earned her Ph.D. in Community and Cultural Psychology at the University of Hawaii at Manoa and completed a NIDA T32 postdoctoral fellowship at The Consultation Center, Division of Prevention and Community Research, Yale University. Nghi is the primary faculty member for the Community Psychology master’s program at CCSU. Using ecological and mixed-methods approaches, her research emphasizes community-based work, including collaborations with several community-based organizations, foundations, and the school district in New Britain, CT, to examine factors related to school engagement and success. Nghi is also involved in evaluation research, including a mixed-methods evaluation of an interdisciplinary team science consortium.
Sara M. Berry is an Adjunct Instructor at Manchester Community College (MCC). She earned her MA in General Psychology at Central Connecticut State University. Sara was one of the graduate students working on an initial collaborative project between CCSU and after-school programs in New Britain. Other research projects Sara worked on while at CCSU included the development of a peer-modeling program designed to help undergraduate psychology students prepare for success in psychological research. At MCC, Sara teaches General Psychology I and The First-Year Experience, a course designed to help first-semester college students adjust to the demands of college and actively engage in their education. Her primary research interest is fostering engagement and building resilience in college students while helping them develop the skills necessary for success in higher education.