Abstract: Evaluating professional development can assist with designing better programs in the future, yet survey instruments may not always capture the nuances of participants’ experiences. Therefore, in order to develop better survey instruments, the Out-of-School Time Resource Center conducted a series of five focus groups. Questions pertained to participants’ job-related needs, preferred types of professional development, characteristics of both “good” and “bad” workshops, reasons why new information is not utilized, and recommendations for policymakers/funders. Findings from the focus groups have been used to revise OSTRC pilot surveys, which will be standardized and published as an Evaluation Toolkit that can be used to design and evaluate OST conferences.
The success of out-of-school time (OST) programs depends on having skilled, knowledgeable, and effective staff working with youth (e.g. Lauver, 2004). However, professional development opportunities for youth workers are generally infrequent and/or inadequate (e.g. Halpern, 1999). The Out-of-School Time Resource Center (OSTRC) at the University of Pennsylvania is conducting a mixed-methods pilot study to design survey instruments that can assess the effectiveness of out-of-school time professional development in workshop and conference settings. These instruments include Post-Workshop Surveys, Follow-Up Surveys (completed one month later), Presenter Self-Assessments, and Overall Conference Surveys.
As of July 2005, the OSTRC has evaluated three major local and regional OST conferences and two networking meetings, during which the initial phases of the surveys were tested. The OSTRC also conducted a series of five focus groups with local OST program staff. The focus groups were designed to inform the development of the survey instruments by obtaining more detailed data regarding program needs, professional needs, and participants’ experiences with professional development in a variety of settings.
This paper discusses the findings from the OSTRC focus groups and outlines recommendations for providers and planners of professional development.
Data and Methods
The purpose of the focus groups was to determine how participants feel they benefit from professional development, specifically in terms of effecting positive change in participants’ skills, knowledge, and attitudes. Two of the focus groups were offered to out-of-school time (OST) administrative staff, and three were offered to OST direct-service staff. A total of fifty staff participated in the focus groups, each of which was three hours long.
During the focus groups, professional development was defined as “Workshops, conferences, technical assistance, resource centers, peer mentoring, electronic listservs, and other supports designed to promote improvement, enrichment, and achievement in OST staff, programs and students.” Participants were asked a series of questions regarding their professional needs, experiences, and preferences.
Significant trends among the oral responses to each question are summarized below:
Job-Related Needs
Participants were asked, “What do you need to do your job better?” Many participants made multiple comments in response to this topic. These comments varied widely in terms of their content. The following represents the most frequently stated needs; less frequent responses were collapsed into a single “other” category.
| Job-Related Needs | Frequency (N=190) |
| --- | --- |
| Increased communication within the organization / More staff meetings | 21 |
| Increased communication and/or involvement with parents | 18 |
| More physical resources | 18 |
| Participation in higher education | 16 |
| More opportunities to network with other OST staff | 9 |
Preferred Method of Meeting These Job-Related Needs
Participants were given a list of various types of professional development opportunities, compiled by the OSTRC based on previous meetings and discussions with professionals in the field. In response to the question, “What is your preferred method of meeting these job-related needs?” participants cited the following preferences:
| Preferred Type of Professional Development | Frequency (N=21) |
| --- | --- |
| Formal Networking Group | 5 |
Characteristics of “Good Workshops” (N=164)
Participants were given the opportunity to answer: “What are some characteristics of good workshops you have attended?” The following characteristics were cited most frequently among all participants: incorporated physical/hands-on activities (n=42), covered relevant content (n=21), modeled new activities (n=17), provided new activity ideas (n=13) or provided relevant materials (n=7).
Each of these characteristics was associated with an increased tendency to apply what was learned in a workshop, to share this new knowledge with others, and to benefit program youth.
Characteristics of “Bad Workshops” (N=59)
Participants were then asked, “What are some characteristics of bad workshops you have attended?” Most frequently, they cited the following: content was not relevant/too basic (n=19), did not incorporate interactive activities (n=9), was a waste of time (n=8), or was used as a time to vent frustration without working to solve a problem (n=5).
Also cited were various characteristics relating to the presenter. Although responses represented a wide variety of characteristics, the six most frequently stated answers are below:
| “Bad Workshop” Characteristics Associated with Presenter(s) | Frequency (N=110) |
| --- | --- |
| Did not portray expertise in the topic | 14 |
| Did not maintain positive environment | 14 |
| Did not gain the respect of the audience | 12 |
| Did not provide time to ask questions | 10 |
| Used poor presentation skills | 9 |
| Was not well prepared or organized | 8 |
Reasons Participants Do Not Apply New Information Learned in Workshops
The OSTRC asked participants, “What are some reasons you don’t apply what you learned in workshops?” Participants most often responded that they did not use what they learned because of:
| Reasons New Information Learned in Workshops is Not Applied | Frequency (N=68) |
| --- | --- |
| Lack of support from other staff / Not all staff attended training | 19 |
| Lack of time | 11 |
| Content not relevant and/or practical | 11 |
| Not held accountable to apply new information | 4 |
| Presenter did not provide follow-up assistance | 4 |
| Workshop material sits in their “To Do” box or on their office shelf | 4 |
Most Beneficial Component of a Workshop (N=84) (Responses total more than 50 because some participants gave multiple responses.)
Workshops can promote changes in knowledge, changes in skill, and/or changes in attitude toward or appreciation of a topic. Focus group participants were asked, “What makes a workshop beneficial? Was it the most beneficial because of the knowledge/content you learned, the skills you acquired, or the change in your attitude/appreciation towards the topic?” Participants most often cited changes in their attitude toward the importance of the topic (n=33), then changes in their level of skill (n=29), and lastly changes in their knowledge (n=22).
Participants’ Recommendations for Policymakers and/or Funders (N=7)
Lastly, participants were given the opportunity to answer, “What recommendations do you have for policymakers and/or funders?” Participants’ recommendations were as follows:
• increase communication between OST staff and policymakers/funders (n=2),
• hold more focus groups (n=1),
• have more networking opportunities (n=1),
• have advocates represent OST program needs to policymakers/funders (n=1),
• balance the need for continuous learning with an appropriate amount of professional development (n=1), or
• increase youth participation (n=1).
Those who provide and/or plan professional development for OST staff could benefit from integrating some of these findings into future opportunities. For example, professional development for OST staff should include “Formal Networking Groups” and “Onsite Trainings” in addition to the more common format, “Offsite Training.” It is also important to encourage planners of professional development to provide workshops that are interactive, cover relevant content, model new activities, and are presented by individuals who apply adult learning theory principles within the training. Further, attention needs to be given to how new information is used in the workplace after the workshop itself has ended. “Lack of support from other staff / Not all staff attended training” was the reason participants cited most often for not applying what was learned in a workshop once they returned to their work settings. Therefore, it is important to schedule professional development at times when many staff can conveniently attend. Lack of time within the workplace is another significant barrier to the application of new knowledge. Allowing participants time within a professional development session to plan how they will apply new information may make them more likely to use what they learn.
The findings from these focus groups have been used to revise the surveys in the OSTRC pilot study. Some trends have been added to the surveys as questions, while other information has been used to inform the analysis of the survey data. A few examples of additional survey questions include:
- Post Workshop Survey Questions:
- “Did more than one staff from your program attend this training?”
- “Was this session interactive, or did it include hands-on activities?”
- “Did this workshop show you how to use new knowledge/skills?”
- “Was this workshop a good use of your time?”
- “Did the presenter(s) provide some form of follow-up assistance (e.g. contact information for questions, a website to reference, etc.)?”
- “Did the presenter(s) portray expertise in this topic area?”
- “Did the presenter(s) maintain a respectful environment?”
- “Did the presenter(s) gain the respect of the audience?”
- Follow-Up Workshop Survey Questions:
- “Were you held accountable to use this new knowledge/skill?”
- “Did the workshop material sit in your ‘To Do’ box or on your office shelf?”
The data analysis from the entire pilot study will contribute to a final revision of the surveys, which will be tested one last time. The OSTRC will then standardize and publish these surveys as part of an Evaluation Toolkit that can be used to design and evaluate OST conferences that optimally benefit staff, programs, and students.
Future research in this area could enhance OST professional development on a national level by further exploring these themes. These questions may also prove beneficial if the focus groups are replicated with local OST staff and the resulting information is used to inform the development of subsequent professional development opportunities.