
@NASPASLPKC

#SALead


Welcome from the SLPKC Chairs


We’re excited to welcome you to our summer edition!

As always, please do not hesitate to contact us at slpchairs@gmail.com.

We are especially excited about this edition of the newsletter. Assessment and evaluation in student affairs is essential. Our students, institutions, and environments are dynamic, and in order to perform to the best of our abilities, we must continually assess and evaluate programs, workshops, student satisfaction, and services provided. We would be remiss if we did not practice what we advocate. Therefore, in this newsletter edition, the results of our SLPKC survey on how we can better serve our membership are unveiled. Thank you to all the members who took the time to complete the survey. For those of you on semesters, we wish you a wonderful summer; for those of you on the quarter system—you’re almost there!

Sherry Early Bowling Green State University

Michael Baumhardt University of Scranton


It was wonderful to see so many of you in Orlando! We hope you took advantage of the opportunities to engage with other SLPKC members and make meaningful networking connections. As always, we thank our sponsors, OrgSync and Jossey-Bass, for their continued support. We also thank everyone who served as a mentor this year. As we embark on our final year as Co-Chairs, we welcome you to inquire about becoming a leadership team co-coordinator. A link to the OrgSync application can be found at https://orgsync.com/45737/forms/67981.


Remember these helpful summer tips!

1. Don’t get mad at students when they ask what you’re doing for your summer break.
2. Take some time to breathe between the end of commencement and the start of orientation sessions.
3. Summer is the perfect time to wrap up loose ends with your data evaluation from spring or planning for the fall.
4. Above all, breathe: you made it through another year!


Meet Your Editors

Tom Hurtado is a Communications Advisor for the Associated Students of the University of Utah (ASUU). He recently completed his MSEd degree at the University of Utah in Educational Leadership and Policy. In addition to working with a variety of student leaders across campus, Tom has spent the last three years implementing a marketing agency model within the student government, increasing general student awareness and participation in ASUU initiatives, events, and services.

Adam Cebulski works with OrgSync and holds his MSEd degree in Higher Education Administration and Policy (concentrating on strategic planning and leadership development) from Northwestern University. His background has focused on strategic planning and assessment for student affairs divisions and departments at a variety of institutions.


Leadership Positions Open

Get even more involved with the SLPKC by applying for a leadership position!

The SLPKC Leadership Team strives to fulfill its mission by disseminating knowledge to its members. In order to fulfill this charge, we have a select team of qualified professionals who put this vision into action. If you are interested in applying for one of these positions, please complete the application at https://orgsync.com/45737/forms/67981. We have several positions open in a variety of areas on the leadership team. Serving on a board is a great way to gain a different type of experience and contribute back to the field. For more information, or if you have questions, please contact Sherry Early or Michael Baumhardt at slpchairs@gmail.com.



Student Leadership Development: Theory to Practice


Congratulations to our Theory to Practice award recipient.

Leadership is a growing field at the University of Central Florida, and the office of Student Leadership Development (SLD) is at the forefront. SLD encompasses three leadership programs: LEAD Scholars, Lead Out Loud, and the Inspire Women’s Mentoring Program. These programs intend to attract different populations of leaders at the university and enhance their leadership skills by helping them become change agents in our society. Descriptions of these programs are below.

The LEAD Scholars Program at UCF is a selective, two-year leadership development program for students committed to academic excellence and making a difference in the world around them. Learning about leadership in the classroom is only meaningful when the knowledge is put to use. LEAD Scholars guide and inspire their peers in a variety of ways on campus. Servant leadership is a preeminent value of the program. Our scholars make a difference in the UCF and Central Florida community with thousands of hours of community service each year. As a means of targeting talented first-year students and integrating them with the academic and co-curricular aspects of college life, the LEAD Scholars Program was established at the University of Central Florida in 1995.

The LEAD Scholars Program specifically assists in reaching the university’s goals of offering the best undergraduate education available in Florida, becoming a more inclusive and diverse institution, and becoming America’s leading partnership university. UCF is committed to accenting the individual and excellence, as well as educating the whole person and producing future leaders who exhibit leadership, academic excellence, and service. The LEAD Scholars Program exists within this context and is dedicated to providing high-quality educational experiences and leadership opportunities to prepare students for positions of leadership both inside and outside the UCF community.


LEAD Scholars is more than a leadership program—it is a diverse and inclusive community at UCF. UCF and LEAD Scholars stand for opportunity. Accepted students in turn grasp this opportunity and turn it into reality by getting involved, learning about leadership, and becoming change agents in our community. As the only academic and co-curricular leadership program in Florida, LEAD Scholars offers many exciting opportunities in academics, leadership, student organizations, research, and campus life to our students. LEAD Scholars is a springboard to leadership at UCF.

Lead Out Loud is an experiential program that focuses on the leadership development of multicultural students at the University of Central Florida. This program promises to enhance leadership skills, strengthen life skills, and encourage positional leadership in this population of students at UCF. These are essential learning outcomes for students who will not only be retained by the university but will be able to serve as effective leaders on campus and in the community. The purpose and spirit of the Lead Out Loud Program is to empower students to succeed, embrace diversity, strengthen life skills, and develop purpose, with the expectation that these tenets will propel multicultural students toward graduation. One of our highest goals is to create a legacy within this program that will empower students to become social change agents and continue the learning cycle by training their peers in leadership.

The mission of the Inspire Women’s Leadership Program is to provide leadership enhancement and empowerment opportunities for female students. Through networking, mentoring, personal development, and experiential training, UCF women students will be prepared to hold various leadership positions on campus and in the work arena. Women of the past may have felt excluded from leadership processes because traditional views of leadership primarily included men. Society is now noticing that women’s collaborative nature and spirit can give them a strong advantage while leading others. In light of this, the Student Leadership Development office has developed a focused program to meet the specific leadership needs of UCF’s female students.

If you have any questions about SLD, contact:

Dr. Stacey Malaret
Director, Student Leadership Development
University of Central Florida
407-823-2223
Stacey.malaret@ucf.edu


Your Voice Has Been Heard


Findings from the Membership Feedback Survey

In 2013, the SLPKC Leadership Team implemented a Membership Feedback Survey, hoping to better understand if the KC is reaching its membership effectively and to assess overall satisfaction.

The Student Leadership Programs Knowledge Community (SLPKC) seeks to foster a dynamic, resource-rich environment for higher education professionals who have a professional interest in collegiate leadership training, education, and development. As the largest NASPA Knowledge Community, the SLPKC leadership team found itself recently discussing the need to learn more about its members. In 2007, the most recent comprehensive examination of the knowledge community was completed, providing necessary, but now outdated, information. Thus, throughout the early months of 2013, the group designed and implemented a Membership Feedback Survey, hoping to better understand if the KC is reaching its membership effectively and to assess overall satisfaction. The survey was designed through collaborative question development, with many team members contributing to specific questions and to the survey’s overall design, implementation, and analysis. This article will focus on the findings from this feedback survey, summarizing important responses and analyzing potential gaps.


PARTICIPANT DEMOGRAPHICS

To further examine the SLPKC Membership Feedback Survey, it is first important to understand the demographics of the 130 respondents. Participants represented a variety of functional areas, including Leadership Development Programming (72%), Student Activities (59%), Service Learning and Volunteerism (27%), Residence Life (20%), and Multicultural Affairs (19%). Sixty percent of respondents identified as having obtained, or as currently pursuing, a master’s degree with a higher education and/or student affairs focus. The majority of individuals who took this survey indicated that they worked at either a 4-year public or private institution.

These results seem to indicate that the SLPKC caters to a wide variety of members from diverse professional backgrounds. They also indicate possible gaps in our outreach, particularly when it comes to including newer professionals, senior executives, and members from institutions other than traditional 4-year colleges/universities. Lastly, not every NASPA region was represented in these survey results, indicating that the SLPKC leadership team may need to coordinate additional outreach and programmatic/resource support to members from underrepresented regions.

Additionally, this survey asked what theories or models are being utilized, in order to gauge national trends in leadership development. The majority of respondents (53%) indicated that their institutions use the Social Change Model most regularly, followed by the Leadership Challenge (40%), Servant Leadership (33%), and Emotionally Intelligent Leadership (21%). A smaller number of respondents (17%) indicated that they utilized an “in-house” leadership model at their institution. The programmatic options offered most frequently include workshops (83%), retreats (65%), and conferences (57%). Finally, assessment practices varied, but predominantly consisted of surveys (85%). Other forms of assessment included journal reflections, pre- and post-tests, focus groups, and peer evaluations. Ultimately, the results showed consistency amongst respondents and will provide the leadership team with a better understanding of what members are offering nationwide.

KNOWLEDGE COMMUNITY FEEDBACK

In addition to learning more about our KC members, the leadership team wanted to know how effective it has been in delivering services to its members. Overall, respondents indicated that the SLPKC is successful at providing professional development opportunities, with over 80% of respondents stating that the KC was effective or extremely effective in this area. Similarly, 77% of respondents viewed the KC as effective or extremely effective at educating members on the


One method of professional development that has been particularly impactful is the SLPKC-sponsored Webinar Series. While only 36% of respondents stated they had attended an SLPKC-sponsored webinar, nearly 85% indicated the experience was satisfying or extremely satisfying. Among those who have not attended a webinar, almost half indicated that the webinars do not fit within their work schedule. Only 4% stated that they were disinterested in the topics offered.


The SLPKC Newsletter also serves as an effective mode of professional development. 80% of respondents stated they had read an issue of the SLPKC Newsletter, with a staggering 98% rating the newsletter as satisfying or extremely satisfying. Again, only 4% of respondents have not read an issue of the newsletter because of disinterest in the topics presented. These results seem to indicate that the webinar program and newsletter have proven to be fruitful methods of professional development and information dissemination. When it comes to hearing from experts in the field, respondents preferred the following methods:

One of the areas for improvement demonstrated by this survey was the connection between the members and the SLPKC Leadership Team. Over 73% of respondents reported feeling not connected or extremely not connected with SLPKC leadership. Additionally, 71% of respondents reported a similar sentiment with respect to regional representatives. Overwhelmingly, respondents reported a desire to connect with leadership members and regional representatives, along with becoming more involved in the knowledge community in the following ways:

Overall, the results indicate that, while members report a significant lack of connection with SLPKC leadership and regional representatives, there are well-established methods for creating a meaningful


WE’VE HEARD YOU!

As with any assessment, the usefulness of a survey can only be measured by the impact it has on program improvements. The SLPKC leadership team has had a chance to examine the findings of this assessment and spent some time discussing its implications. Overall, the low response rate and feedback from participants who indicated feeling disconnected from KC leadership indicate that more outreach must be done to ensure that all members feel a strong connection to the community. One of the key ways to engage our members seems to be working through the regional representatives to allow members to assist with planning professional development opportunities. In particular, regional drive-in events and SLPKC meet-ups at annual conferences may allow members to meet KC leadership face to face.

Another area the leadership team is looking to explore is increasing the use of social media to interact with colleagues from across the country. By improving conversations taking place on Facebook, Twitter, or even Google Hangouts, members can share best practices and connect with others on their own schedule.

Finally, the SLPKC leadership team intends to be more intentional with its approach to professional development, particularly when it comes to promoting the various opportunities that exist for members of the community. While people who read the newsletter or attend a KC webinar are extremely satisfied, there are still a number of people who are not aware these modes of professional development exist. Rather than trying to focus on constantly generating new knowledge, the team needs to utilize and promote already existing resources. In addition, knowing our core audience and the leadership development options offered at their campuses will allow us to focus on tools that will help members improve their own programs.
In conclusion, the SLPKC leadership team wants all its members to know that “we’ve heard you.” A survey like this is a great start, but more important will be the change it inspires.

Article authors:

Matthew Clifford
Director of Residence Life
Wake Forest University
SLPKC Resources and Recognition Committee Chair

Kimberly Kushner
Interim Coordinator for Student Development and Leadership
The University of Colorado, Boulder
SLPKC Literature Review/NCLP Coordinator

Kimberly Piatt
Coordinator of Leadership Development
The College at Brockport
SLPKC Region 2 Knowledge Community Representative


Congratulations to Our Members



We have a new SLP baby! Beautiful and healthy Emma Morgan Flynn made her debut April 24th at 9:55 a.m. Emma is 7 lbs. 15.3 oz. and 21.5 inches long. From what her Mommy Amber says, it was love at first sight! Congrats to the Flynn family!

Congratulations to Mike Baumhardt, SLPKC Co-Chair, for obtaining his MBA! Smartest Co-Chair ever!

Congratulations to Adam Cebulski, Newsletter Co-Chair, for being appointed to the newly created position of Director, Research and Strategic Initiatives at OrgSync, Inc.

Congrats to Joe Ginese and Robyn Kaplan on their wedding on May 26, 2013!



Becoming a Competency-Based Organization

In 2008, the University of Arizona Leadership Programs staff made the decision to become a competency-based organization, a decision that has had a monumental impact on program design, student development, and assessment practices. The first task in the process was to create a list of leadership-related competencies that we wanted our students to develop. Competencies can be defined as knowledge, values, abilities, and behaviors associated with effectiveness in completing a task or engaging in a role, and they can be developed through training and experiences. In creating this list, we drew from competencies embedded in a variety of leadership models and standards, as well as those expected of over 500 academic programs across multiple career fields. Once the list was complete, we designed measurements for each competency. We then examined our programs and processes to find creative ways to implement the competency approach in a holistic way. The sections below outline some of the methods of our competency-based approach.

Competency Setting

Using our list of leadership competencies, we have mapped out the specific competencies associated with participation in each experience we offer (event, program, initiative, role, and course). Through this process, we have been able to identify any critical competencies that we do not focus on enough through our offerings and make curricular adjustments to integrate these competencies more intentionally. We are also able to offer tailored experiences based on the competency needs of a specific audience (especially since the competencies are mapped to different academic program requirements). For experiences open to the general student population, our staff has narrowed our competency list to the 35 most critical competencies based on research on student retention, contemporary leadership, career readiness, and academic program expectations. Focusing on these 35 competencies has helped us hone our strategic plan into specific, measurable objectives. The competency setting process has provided a way for us to develop experiences that fit student needs in a way that is far more intentional and measurable than before. In addition, the competencies


associated with each leadership experience are posted on the website, in syllabi, in job descriptions, and in marketing materials to increase transparency and help.
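The gap-identification step described above (mapping each offering to the competencies it develops, then spotting competencies no current experience covers) could be sketched as follows; the experience and competency names here are illustrative assumptions, not the University of Arizona's actual list:

```python
# Hypothetical mapping of experiences to the competencies they develop.
# All names are placeholders for whatever a program's own list contains.
experience_map = {
    "Leadership Retreat": {"Collaboration", "Self-Awareness"},
    "Peer Mentor Role": {"Collaboration", "Communication"},
    "Ethics Workshop": {"Ethics"},
}

# The full competency list the office wants its offerings to cover.
all_competencies = {
    "Collaboration", "Communication", "Ethics",
    "Self-Awareness", "Systems Thinking",
}

def uncovered(experience_map, all_competencies):
    """Return competencies that no current experience develops."""
    covered = set().union(*experience_map.values())
    return all_competencies - covered

gaps = uncovered(experience_map, all_competencies)
# With the sample data above, "Systems Thinking" is the curricular gap.
```

A set-based mapping like this also answers the reverse question (which experiences to recommend for a given competency need) with a simple membership test.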

Evaluation

Another way we use the competencies is to evaluate student development. We have created a template of measurements for each competency and use the same measurements across all experiences. This allows us to benchmark between experiences as well as collect aggregate data on a particular competency. The measurements are quantitative, self-reported, one-item prompts asked of students post-experience to gauge their perceived development of a particular competency. In addition, students are asked two open-ended questions regarding the experience, which are coded for additional competency development not captured in the quantitative responses. Evaluations are administered on paper or online (web link and QR code), depending on the experience. Once data is collected, it is entered into a spreadsheet that automatically computes averages and percentages on a variety of components for over 500 experiences annually; data can be disaggregated by experience and competency, within and across experiences. Collecting data in this manner allows us to do a better job of quantifying student development. We use this data to tell our story for grants, share it with donors, integrate it into marketing materials, provide it to academic advisors to help advise their students into co-curricular experiences that enhance their major, and highlight it in parent/alumni newsletters. We also use the data to evaluate our programs and assess for gaps, areas of improvement, and areas of strength. By knowing what students need, we can be mindful about developing intentional leadership experiences for them and then telling our story to others in a compelling manner. The University of Arizona uses the Student Leadership Competencies, which were first published in the Journal of Leadership Studies in 2013 and will be released in the coming months as a guidebook through Jossey-Bass.

Dr. Corey Seemiller
Director of Leadership Programs
University of Arizona

Corey received her Bachelor’s degree from ASU, her Master’s degree from NAU, and her PhD from the UofA. She prides herself on being a true Arizonan! She is the Blue Chip Global Theme Coordinator and teaches several leadership classes, including Global Leadership, Leadership for Social Change, Organizational Leadership, Leadership Strategies, and Foundations of Leadership. She also oversees Leadership Courses for Credit, the Minor in Leadership Studies and Practice, the Equiss Social Justice Experience, and the IBM Co-Op Program, and engages in other administrative functions to support all leadership programs.
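A minimal sketch of the aggregation described in the Evaluation section above (one-item post-experience ratings averaged by experience-competency pair and across experiences). The record fields, rating scale, and names are illustrative assumptions, not the actual instrument:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical evaluation records: (experience, competency, self-reported rating).
# The 1-5 scale and all names below are assumptions for illustration only.
records = [
    ("Fall Retreat", "Collaboration", 4),
    ("Fall Retreat", "Collaboration", 5),
    ("Fall Retreat", "Ethics", 3),
    ("Spring Institute", "Collaboration", 5),
    ("Spring Institute", "Ethics", 4),
]

def aggregate(records):
    """Average ratings disaggregated by (experience, competency) and by competency alone."""
    by_pair, by_competency = defaultdict(list), defaultdict(list)
    for experience, competency, rating in records:
        by_pair[(experience, competency)].append(rating)
        by_competency[competency].append(rating)
    pair_averages = {key: mean(vals) for key, vals in by_pair.items()}
    competency_averages = {key: mean(vals) for key, vals in by_competency.items()}
    return pair_averages, competency_averages

pair_averages, competency_averages = aggregate(records)
```

Because the same one-item template is reused everywhere, the per-pair averages support benchmarking between experiences while the per-competency averages give the aggregate picture across all of them.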


Leadership Program Assessment and Evaluation: Coming to a Theater Near You


At many institutions, curricular and co-curricular leadership programs are the new game in town. These leadership programs range from student affairs to academic affairs, and today nearly 2,000 such programs persist, ranging from advanced professional, doctoral, and graduate degrees to majors, minors, and certificates (Brungardt, Greenleaf, Brungardt, & Arensdorf, 2006; ILA, 2013). Yet, as a chiefly academic field of study, Leadership Studies is still in its infancy (Hackman, Olive, Guzman, & Brunson, 1999). The first degree-granting leadership programs were initiated in the early 1990s, while the majority came to be in just the last decade.

Formative leadership program assessment is as young as the discipline itself. To illustrate this, Goertzen (2009) called readers of the Journal of Leadership Education to attention with the following abstract:

The primary purpose of this paper is to call leadership educators to intentionally engage in a conversation regarding how we ensure participants of our programs are learning what we hope they learn. To achieve this principal aim, this paper will (a) briefly examine pressures compelling our discipline to engage in intentional conversations regarding assessment and (b) provide an overview of ways and means of assessment activities. (p. 1)

Unprecedented growth in this field has created a challenge that, while old news to seasoned academics and their respective deans and chairs in our oldest disciplines, is new to many leadership educators, program directors, and department chairs. This is the challenge of assessment and evaluation. While some standards and guidelines exist (e.g., the Council for the Advancement of Standards in Higher Education (CAS) Student Leadership Programs guidelines and the International Leadership Association (ILA) Guiding Questions: Guidelines for Leadership Education Programs), these resources are in no way accrediting documents, nor do they pretend to be. Yet, the absence of accrediting bodies within the leadership discipline does not preclude leadership programs from accreditation reviews, whether regional or institutional, among our colleges, departments, or programs. Thus, it is imperative that each program has an assessment plan and that leadership educators know what resources are available for this important, and often requisite, work.

Unfortunately, Goertzen’s call often goes unnoticed—until needed. Academic leadership programs were often sprouted from co-curricular programs or developed at an institution as a result of one administrator’s particular interests. In many of


these cases, plans for formal program evaluation and accountability to learning outcomes were not part of the early blueprints. It is also fair to say that many of us are unclear as to what we are accountable for. How do we know what accreditors, deans, chairs, or VPs want or need to know? I have found it helpful to explore what exemplary programs in our field are doing. A resource that drew my attention was a document called “This is how we do it… at Kansas State University School of Leadership Studies… Assessment!” Dr. Irma O’Dell, Senior Associate Director and Associate Professor in the Kansas State University (K-State) School of Leadership Studies, shared this assessment overview during a roundtable session at the 2012 Association of Leadership Educators annual conference. I have invited Dr. O’Dell and Dr. Kerry Priest, Assistant Professor of Leadership Studies, to share a brief overview of the current assessment practices used in the School of Leadership Studies at K-State:

This Is How We Do It… Assessment in the School of Leadership Studies at K-State

The evaluation of student learning is commonly referred to as assessment. Angelo and Cross (1993) define assessment as an ongoing, multidimensional process of appraising the learning that occurs in the classroom before and after assignments are graded, with the feedback used to improve teaching and, subsequently, student learning. At K-State, assessment of student learning begins and ends with the Office of Assessment, whose mission is to support continuous improvement processes through facilitation of meaningful assessment of student learning and effective methods for feedback

and action in response to assessment results (2013a). Two key questions drive our assessment practices: (a) What are our students learning? and (b) How well are our students learning what we are teaching?

In the K-State School of Leadership Studies, assessment of student learning begins with our mission statement, “Developing knowledgeable, ethical, caring and inclusive leaders for a diverse and changing world” (2013b). Guided by this mission, leadership studies faculty have been engaged in a continually evolving process of generating and refining learning outcomes that align with the four core courses of our Leadership Studies Minor (2013c). In Spring 2012, these outcomes were revised to include both student learning outcomes and student development outcomes, as outlined in Table 1 (following page).

The assessment process is initiated by faculty. Every core-course syllabus includes a list of all learning outcomes and student development outcomes, with the outcomes that specifically pertain to that course boldfaced. Our faculty use the learning outcomes to focus their teaching. They form course teams to develop and discuss course content, texts and readings, assignments, grading rubrics, and assessment strategies that align with the learning and development outcomes (O’Dell, 2009). The work done in team meetings encourages collaboration and consistency in pursuit of these outcomes. Outcome assessment assignments look different for each course; however, they have generally been in the form of a final project, presentation, or paper


and evaluated with a structured rubric. Upon completion of the semester, data in the form of rubric scores and/or qualitative feedback are recorded on a spreadsheet template and submitted for analysis. Additionally, faculty members review and provide feedback on the data by responding to reflection questions like:

• How will the data inform your teaching as you prepare for the upcoming semester? (O’Dell, 2009)

• After reviewing the data, do you have any concerns with the findings? If so, what adjustments, if any, do you plan to make to address the concerns?

• Does the team plan to make any changes to the assessment assignment? If so, what changes?

• Does the team plan to include additional assignments (graded and nongraded) for assessment?

Once all data and feedback are gathered, an annual report of student learning is written and submitted to the Office of Assessment … and the process begins again! As a School, we desire to continue to challenge ourselves to conduct assessment and use the data in a meaningful way. This involves prioritizing involvement in intentional professional development opportunities around student learning, student development, and assessment and evaluation strategies.

To further illustrate the impending assessment or evaluative initiative coming to a “theater near you,” allow me


to share with you two brief examples, one from each of the two major arenas in which leadership programs are often delivered. The first occurred while I worked from 2008 to 2012 as an adjunct professor of leadership studies and academic program coordinator with a program inside an academic-student affairs partnership at the University of South Florida (USF). Here, a once co-curricular program housed within the USF Center for Leadership and Civic Engagement (CLCE) (2013a) collaborated with the College of Undergraduate Studies (2013b) to offer an academic minor.

At USF, we had a habit of “accountability for sustainability,” the premise of which was that if we were accountable to the “higher-ups” regarding what and how we were doing, we would not be axed. We did this of our own accord and did eventually supply a great deal of data to a Southern Association of Colleges and Schools (SACS) accreditation report for our leadership studies minor. We based our data collection and assessment process chiefly on the K-State model described above by Dr. O’Dell and Dr. Priest. (Todd Wells, the Associate Director of the CLCE, was a K-State alumnus. He suggested we use it!) Our second key resource was the ILA Guiding Questions (ILA, 2009), created to “assist anyone who wishes to develop, reorganize, or evaluate a leadership education program.” This document is exceedingly helpful and includes myriad open-ended questions that guided our program staff and adjunct professors to consider, and sometimes redefine, the context and environmental factors of our program; the conceptual framework within which we operated, including our program mission and vision; the content of our courses and curricular impacts; our teaching strategies and learning objectives; our assessment practices; and, finally, our methods for measuring learning

outcomes. The second example is quite contemporaneous and occurred during my first semester as a tenure-track assistant professor of leadership and organizational studies at the University of Southern Maine’s (USM) Lewiston-Auburn College. Right out of the gates, I was met with the challenge of a formal institutional program review (the “every ten years” variety) and, more recently, a second institutional request for an “assessment of student learning plan: academic programs” resulting from a New England Association of Schools and Colleges (NEASC) request “… mandating an update on our progress on the assessment of student learning across all academic programs.” Here are some examples with regard to assessment of student learning requested by NEASC:

• Has your department identified any student learning outcomes?

• How and when will the learning outcomes be assessed?

• How did you use the assessment results to improve student learning?

So then, how might one measure and respond? Luckily, I had kept my “accountability for sustainability” habit from USF and somehow gained the consensus of my new program faculty to let me facilitate an assessment of each of our core courses within our BS in Leadership and Organizational Studies (2013). On a macro level, this entailed reviewing the learning outcomes of the program as a whole as well as each course individually. Then, we looked for consistency across courses, evaluating and making shared decisions on everything from course and catalog descriptions to required textbooks and


and their rationale to key activities, assignments, and assessment. While keeping academic freedom keenly in mind, we make important considerations, had provocative discussions, and rejuvenated our understanding of our program. Dr. Daniel Jenkins Assistant Professor of Leadership and organizational Studies University of Southern Maine, Lewiston-Auburn College 207-753-6592 djenkins@usm.maine.edu


In both of the examples I shared, we utilized the K-State and ILA Guiding Questions models to provide “method to our madness.” Luckily, these resources exist. My advice to you: be prepared! Know the available resources and consider ways your program can stay accountable. I shared two example assessment initiatives and showcased an exemplar resource. Bottom line: the process is taxing and challenging. Nonetheless, it is also engaging and inviting, creating a productive forum for collaboration and programmatic dialogue. I even heard a senior faculty member comment that, at first, she dreaded the idea of the process but, once we got going, was energized by the exchange of knowledge and the learning that took place among the program faculty. One final tip: you might as well make it fun (have a potluck, bring candy, etc.), because a mandated program assessment or evaluation is coming to a theater near you!


Dr. Irma O’Dell
Senior Associate Director for Administration and Associate Professor of Leadership Studies
Kansas State University
785-532-6085
irmao@ksu.edu

Dr. Kerry Priest
Assistant Professor of Leadership Studies
Kansas State University
785-532-6085
kerryp@k-state.edu


References

Angelo, T. A., & Cross, K. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco: Jossey-Bass.

Brungardt, C. L., Greenleaf, J. P., Brungardt, C. J., & Arensdorf, J. (2006). Majoring in leadership: A review of undergraduate leadership degree programs. Journal of Leadership Education, 5(1), 4-25.

Council for the Advancement of Standards in Higher Education. (2012). Student leadership programs. In CAS professional standards for higher education (8th ed.). Washington, DC: Author. Retrieved from http://www.cas.edu/getpdf.cfm?PDF=E86F4088-052E-0966-ADCB25F2A9FE7A70

Goertzen, B. (2009). Assessment in academic based leadership education programs. Journal of Leadership Education, 8(1), 148-162. Retrieved from http://www.leadershipeducators.org/Resources/Documents/jole/2009_summer/JOLE_8_1_Goertzen.pdf

Hackman, M. Z., Olive, T. E., Guzman, N., & Brunson, D. (1999). Ethical considerations in the development of the interdisciplinary leadership studies program. Journal of Leadership Studies, 6, 36-48.

International Leadership Association (ILA). (2009). Guiding questions: Guidelines for leadership education programs. Retrieved from http://www.ila-net.org/communities/LC/GuidingQuestionsFinal.pdf

International Leadership Association (ILA). (2013). Directory of leadership programs. Retrieved from http://www.ila-net.org/Resources/LPD/index.htm

Kansas State University. (2013a). Office of assessment: Welcome to the office of assessment. Retrieved from http://www.k-state.edu/assessment/

Kansas State University. (2013b). School of leadership studies: Our mission. Retrieved from http://www.k-state.edu/leadership/ourmission.html

Kansas State University. (2013c). School of leadership studies: Assessment. Retrieved from http://www.k-state.edu/leadership/about/assessment.html

O’Dell, I. (2009). Assessment and accountability in higher education. Educational Considerations, 37(1), 4-7.

University of South Florida. (2013a). Center for Leadership and Civic Engagement: Leadership studies minor. Tampa, FL: Author. Retrieved from http://www.leadandserve.usf.edu/leadcourses.php

University of South Florida. (2013b). College of Undergraduate Studies: Leadership studies minor. Tampa, FL: Author. Retrieved from http://www.ugs.usf.edu/academic/lsminor.htm

University of Southern Maine. (2013). Lewiston-Auburn College: BS in leadership and organizational studies. Retrieved from http://usm.maine.edu/leadership/bs-leadership-andorganizational-studies-0



Looking Back at Mentorship

Our goal is to bring the advantages of this program to even more graduate students and professionals at the 2014 annual conference. If you would be interested in being a mentor or mentee, please contact Dave Borgealt (dborgeal@depaul.edu) or Gabby Mora (mgm92@drexel.edu).

Program Evaluation Results


As we reflect upon our own leadership development journeys, most of us would likely agree that mentoring relationships can be critical to our development as leaders and professionals. Mentoring relationships can provide safe spaces to give and receive feedback that increases self-awareness. They are also a great context within which we can ask questions that help us make meaning of past decisions and actions and consider options for future ones.

At the 2013 NASPA annual conference, the Student Leadership Programs Knowledge Community Mentorship Program provided the opportunity for professionals working in the field to mentor graduate students interested in student leadership programs. This March, 41 individuals took advantage of this opportunity: 21 graduate students and 20 professionals. A post-experience survey was sent to each participant, with a 32% response rate and an almost equal split between mentors and mentees. One hundred percent of the respondents said that they would recommend this program to a colleague or peer, and most were able to meet two times or more with their mentor/mentee.

When asked to describe what was most valuable about the mentorship relationship, one mentee said, “she (my mentor) was a great help to break down NASPA because it can be extremely overwhelming. She also encouraged me to be more involved in the Knowledge Community. I loved attending the awards banquet with her!” Another mentee shared that she found her mentor’s “perspective on what initiatives work well for various student populations and how to connect leadership programming to students’ career or professional development” most valuable. Participating mentors also found the program to be beneficial. One mentor said that, “it was nice to get to meet a new professional in student affairs and hear her professional journey and offer

some advice about how she can get more involved in leadership and NASPA.”


One Part Idea and Two Parts Assessment for Program Development


This article is based on the Student Leadership Program Knowledge Community webinar on May 1, 2013: The idea is only the beginning of program development fundamentals. This article will focus on the importance of assessment at two key phases within the planning process, as well as the caveat that the assessment plan has to run concurrently with the entire program development process.

The creation and implementation of programs for college students is a key piece of how we translate theory into practice. The student affairs professional, for our purposes the leadership development professional, provides a unique contribution for the entire campus community (Styles, 1985). In some cases, student affairs professionals are the point person developing and implementing new campus programs directly for the benefit of the students; in this case, we function as the active programmer. In many situations, we are the mentor programmer, functioning as the adviser to a group of students that will take an idea and create a program for other students on campus (Whitney, Vandermoon, & Mynaugh, 2013). This two-pronged approach to program development is assumed in the field and in the literature, but has not been defined with the labels of active and/or mentor programmer until now.


The essence of program development as one of our “chief functions” (Styles, 1985) was discussed in the Student Personnel Point of View (1937) and directly in the 1949 document: “The principal responsibility of all personnel workers lies in the area of progressive program development” (1949, p. 33). It is no coincidence that these same foundational documents advocate the inclusion of assessment and evaluation in the delivery of programs. Through this article, we will explore the importance of assessment in the program development process.

Brief History of Program Development

The sentiment that student affairs practitioners “spend a significant portion of their working day planning, implementing, and evaluating” programs (Styles, 1985, p. 181) is corroborated by many authors over time. In Learning Reconsidered (2004), we find programs/programming referred to 70 times within its 43 pages. Learning Reconsidered 2 (2006) more than triples that count (267 mentions in 100 pages), highlighting the prominence of this topic in our work (Barr & Keating, 1985; Styles, 1985; Cooper & Saunders, 2000; Saunders & Cooper, 2001). The CAS (2009) standards provide tremendous resources for implementing programs (e.g., Student Leadership Programs) and help highlight important considerations for campus programming. CAS does not present a particular program development model, but this does not minimize or dismiss the use of CAS; the standards are operational and are used as a formative evaluation tool to improve, expand, or in some cases begin divisional/departmental programs (Bryan

& Mullendore, 1991; CAS, 2009). Program development models first appeared in the literature in 1974 with the Cube model. Over the last 39 years, 21 different models have been introduced throughout the student affairs literature. Some of these have been presented in the two handbooks used within the field of student affairs; however, there is currently no single repository containing all of these models. Assessment and evaluation are directly addressed in half of these historical models as one or two steps in the process; the other half address assessment indirectly. The best use of assessment is a two-part process: the first part comes at the beginning, as we define the program to determine need and viability; the second comes at the end, when we analyze the data collected throughout the duration of the program. This really means we have to do our work at the beginning to be successful (Cooper & Saunders, 2000). Failure to plan a viable process and incorporate viable data early on will inhibit the usefulness of your data.

A few program development models relate directly to leadership programs. The Student Leadership Program model (Roberts & Ullom, 1989) is a six-step model that includes the planning team, institutional appropriateness, resources, program design, and, finally, assessment and evaluation. Notice that a two-step assessment approach (i.e., early stages and final stages) is used. The two editions of the Handbook for Student Leadership Programs present a variation of the same leadership program model. Both seem to be modeled after the original cube models of program development, although the references do not seem to connect


back. These models discuss three dimensions important to categorizing or creating a campus student leadership program. This author has mapped all 21 program development models into a contemporary model that presents the program development process in five phases (see Figure 1). The model is presented in a modified Gantt chart to incorporate timing and illustrate the integration of each phase with the others.

Program Definition is the germination phase, including an environmental scan; identification of the programmatic purpose, goals, objectives, and theoretical framework; initial assessment; attention to divisional/institutional mission; budget and fiscal considerations; and creation of the planning committee. The Program Planning phase includes selecting committee chairs, establishing the timeline and a backdating system, identifying key performance indicators with target dates, creating an inclusive risk management plan, and initiating the marketing plan. Program Execution is the implementation phase and simultaneously ties together all pieces of the model into a cohesive approach. This is where the backdating, key performance indicators, and target dates become a reality; the execution phase moves the program from an idea toward the actual event (or series of events). Program Monitoring is also a function of control and formative evaluation of the overall program. Monitoring requires the committee to be nimble enough to anticipate potential problems and apply corrective action if needed. This phase is connected to the planning phase because, without adequate planning, the execution phase could be problematic. The “event” (i.e., day-of) happens during the monitoring phase. The actual event is the culmination of everything it takes to create and

implement a program successfully. The Program Closeout phase includes the collection, analysis, and presentation of the evaluation and learning assessment data; closing out the financials; thank-you notes; and an after-action report synthesizing the whole process into a document for administrative decisions and/or the next iteration of the program.

Caution: Best Practices Ahead

Best practices in the field of student affairs can be a gold mine of information and a way to share successes among institutions. The danger of best practices arises when we see them as a shortcut to program development and implementation; I submit that, on some occasions, best practices become the theory of the last conference. Some of the initial assessment pieces could be overlooked in the use of best practices. Since someone else already looked at them, and we are all working with college students, that makes sense, right? No. Institutional fit, timing, student demographics, campus culture, place, budget, and committee orientation can all influence the success of the program. Similar concerns have been addressed with the suggestion that assessment, and some parameters around best practices, is probably an area that could use suggested practices for how to adopt great ideas (Shutt, Lynch, & Dean, 2012). Best practices do bring consistency and are an efficient use of ideas and networking, but there should be more assessment (at the definition and closeout phases) for the program.


Conclusion

Program development is one of the key ways in which the student affairs professional contributes to the overall institutional mission (Barr & Keating, 1985; Styles, 1985; Cooper & Saunders, 2001; Saunders & Cooper, 2001). This is true both for the active programmer applying our expertise directly to the target populations (e.g., individuals, student groups) and for the mentor programmer advising a student group; both are programming and program development. The conduit to proper programming is proper planning and implementation, and that takes more than a good idea or best practices. The idea is only one part of the whole process; as any programmer can attest, the space between idea and event is filled with numerous duties, meetings, plans, and deadlines. Assessment has to be included at the beginning, to assess feasibility and need, and at the end, to assess the learning outcomes. Hence, there are two parts assessment. “There is no best program development model, but we hold that those that do plan programs well normally use a program development model” (Schuh & Triponey, 1993, p. 430). Attention to assessment and accountability when delivering leadership development/education (all programming, actually) will improve our planning and implementation of programs. If there were no need for a system, our own CAS standards would seem superfluous. With the number of programs happening on campuses across the country, there has to be some model used by those in the role of program developer. A quick survey of master's preparatory programs through the NASPA and ACPA websites indicates there are very

few, if any, courses dedicated to the skill and competency of program development. It seems we are leaving a vital role, the chief function of student affairs and the “principal responsibility” (SPPV, 1949, p. 33) of our profession, to chance, to the assumption that one can do it by virtue of working in the field, and to on-the-job training. To uphold the contributions of student affairs to the curriculum, we must pay attention to the learning and development happening in programming.

Rich Whitney, PhD
Assistant Professor, College Student Development
Counseling & Special Education, College of Education
DePaul University
773-325-4065
Twitter: @iamrichwhitney



Leadership Highlight

Reflections on How Women’s Student Leadership Experiences Prepare Leaders in STEM Career Fields

I recently had the unique opportunity to interview two Wentworth Institute of Technology (WIT) women student leaders and asked them to share their experiences and perspectives on leadership at the institution. Both students, Susi Vasquez Trujillo and Bri Bonfiglio, shared remarkable reflections and takeaways from their experiences at a majority-male institution and their aspirations to be leaders in majority-male STEM career pathways.

MH: What inspired you to get involved as a student leader?

Bri and Susi noted the value of becoming connected to campus in their first year at the institution.

BB: The initial orientation experience inspired me; somebody made a difference in my life and I wanted to do the same for other students.

SV: Several role models I had as a first-year student inspired me. I looked up to my SRA [Senior Resident Assistant]. It is also part of my personality to become involved and take on leadership roles.

MH: What was your initial experience getting involved as a student leader like?

BB: It was easy and I was welcomed to get involved on campus. I noticed a lot of influential women on the professional staff, including the director of orientation, and other great women leaders. It was a comforting environment to me as a student and I knew I wanted to help do that for someone else.

SV: I noticed how women are involved in student organizations and provide important leadership to the core group of student leaders. In ASCE (American Society of Civil Engineers), women leaders are influential in keeping the team together and play many important roles.

MH: What challenges do you face as a woman student leader at a majority-male institution?

Susi and Bri both reflected on instances where gender stereotypes have created challenges in gaining confidence to be leaders in majority-male organizations.

SV: I’ve had experiences where leadership roles were stereotyped [based on gender]. In ASCE, the captain of the canoe building team is stereotyped to be a male student’s position because the project involves hands-on manual labor.


BB: Sometimes student leader positions are gender stereotyped. For example, being the president of a club or leadership organization may be challenging because the club is male dominated.

Susi also reflected on how the Resident Assistant position is beneficial in gaining confidence and respect as a student leader.

SV: Because I’m an RA, students see me as a resource and leader on campus. People give respect to me in that role, and that helps me gain confidence in other positions on campus.

MH: What support structures does Wentworth offer women student leaders?

Both Bri and Susi shared positive experiences and support from women’s programming at WIT, including the Women’s Overnight Program, a Women’s Leadership Dinner, and the Annual Women’s Brunch.

BB: The Women’s Overnight Program was fantastic; I loved it. It was great to see so many women involved and gave me a good feel for Wentworth. I’m still friends with my host student. Also, the Women’s Leadership Dinner was pivotal in helping me make the decision to change my major. I made connections with upperclass women who helped give me advice, and I’m still connected two years later.

SV: I found it important to make connections with other women student leaders. I recently participated in the Student Leadership Institute and was impressed with the program and the opportunity to meet other women student leaders. It’s great to make connections, continue to build them, and learn from one another.

MH: What is the role and value of mentoring for women student leaders?

SV: [Mentoring] is important to keep women involved in organizations and positions.

Susi explained the positive mentoring relationships in ASCE; upperclass women in the organization encouraged her and helped her build confidence to be a leader. They gave her support and advice in the organization, with leadership development, in her academic endeavors, and in adjusting to college. Bri agreed and discussed similar mentoring relationships in her experiences.

BB: I’m seen as a peer mentor in the facilities management program and on WEB (Wentworth Events Board). I was encouraged by other leaders and thought, this could be you someday!

Bri also discussed the importance of being a mentor for aspiring women student leaders, not just by being a mentor herself, but by connecting students to other mentors and resources.


MH: How have your leadership experiences been influential in preparing you as a leader in a STEM career field?

BB: Being an RA is a lot like being a facilities manager; you don’t just take care of the building but also need to take care of the students who use the facility. My leadership experiences have been important in teaching me to push myself. I’m certainly confident in applying myself and managing others, regardless of differences in gender.

Bri and Susi both shared that their experiences as student leaders at a majority-male institution have given them great confidence in themselves and in pursuing their career aspirations. Susi explained how confidence in herself and as a leader is critical in preparing her for a career as a project management civil engineer.

SV: My leadership experiences have helped me build confidence, discover my strengths, and grow as a better person and mentor.

While there are many great takeaways from their reflections, I found it particularly interesting how Susi and Bri noted the value of building confidence as leaders and as women in STEM career fields. Both also reflected on the importance of formal opportunities, programs, and support, as well as support through peer mentoring and a welcoming campus climate. For the 2013-2014 academic year, both Susi and Bri will serve as Senior Resident Assistants and will continue to be influential role models and mentors at WIT. I look forward to continuing to learn and grow as a professional from our talented students.


Susana (Susi) Vasquez Trujillo is a Sophomore Civil Engineering student at Wentworth Institute of Technology. Susi currently serves as a Resident Assistant for the 610 Huntington Avenue Residence Hall and is the Vice President of the American Society of Civil Engineers (ASCE). Brianna (Bri) Bonfiglio is a Junior Facility Planning and Management student at Wentworth Institute of Technology. Bri currently serves as a Resident Assistant for the 610 Huntington Avenue Residence Hall, is the Treasurer of the Student Association of Facilities Management and the Public Relations Coordinator for the Wentworth Activities Board (WEB). Matt Heiser is the Resident Director for the 610 Huntington Avenue Residence Hall at Wentworth Institute of Technology. Matt is the Membership and Awards Coordinator for the NASPA Region I Student Leadership Programs Knowledge Community.


April Spotlight

As John G. Agno said, “Leadership development is self-development.” The Offices of Student Involvement and Leadership and Community Engagement are committed to the holistic development of Rollins students. As such, the offices collaboratively facilitate a leadership development initiative in which engaged students are matched with allies. Over the past several years, these two offices have engaged over 100 students in various leadership conferences and institutes. Each semester, student travelers return to campus with a renewed vigor and dedication to put their new skills to use in creating change. Many students experience success, and as a result we have seen new organizations, events, and initiatives emerge from these motivated students.

The Leadership Ally Program is designed to enable students to fulfill their action plans and enact change projects. Each student participating in a leadership conference or institute is paired with a faculty or staff member who serves as their ally on their leadership journey. Each ally/student pair meets at least twice a semester to support and challenge one another along their individual paths. The Leadership Ally Program is dedicated to the concept that leadership can be practiced by everyone, and that all students can learn to become more effective leaders.

Meredith Hein Associate Director Office of Community Engagement


NASPA SLPKC - JUNE NEWSLETTER 2013  

