Assessment as Learning

NOTE: This was originally posted here.

About 16 months ago, our institution started an institutional assessment steering committee (IASC) as a vehicle to improve assessment on campus. Composed of representatives from administrative and academic units across campus, the committee has been engaging in a number of activities to increase confidence and capacity in assessment and to begin building a culture of assessment.

While we are providing training activities, including a successful regional assessment conference in May 2017 and fun assessment games during an all-campus meeting, as well as building assessment processes and systems and even investing in technology to support this work, a culture of assessment is not emerging. Assessment still seems to be ad hoc and isolated, and there appears to be some resistance to integrating it into practice. I believe this is due to three things. First, faculty and staff perceive assessment as a burden enacted by others, either the college administration or NEASC, our regional accrediting body. Assessment is something we MUST do, not something we WANT to do. Another reason for resistance is a lack of understanding of the concept. Many believe that assessment is all about numbers and statistics, not about stories regarding how effective activities are. A final reason is an aversion to the term assessment itself, which likely relates to the feeling that assessment is being imposed.

To address these issues, the IASC has stopped using the term “assessment” and begun using “learning” in its place. We’ve realized that assessment really is just learning – learning if goals and outcomes are achieved and if the strategies being used are working.

Assessment isn’t rocket science. It is an inquiry process, beginning with a question. Questions could cover topics such as: Are students learning how mitosis works in my class? What are students learning by being a resident assistant? Are our retention strategies working to increase first-year to second-year retention? Once the question is asked, data can be gathered and analyzed to answer it. Once the question is answered, further action steps can be developed and implemented to address any shortcomings.

Thinking about assessment as learning helps demystify the concept, lessens aversion to the word, and decreases resistance, paving the way for assessment to be an activity freely engaged in on a daily basis as it is embedded into everyday practice.

Reframing the concept can make all of the difference.


Managing Change to Close the Loop

While there are variations, the basic assessment cycle comprises four steps: identify outcomes, develop strategies to achieve the outcomes, gather and interpret data related to outcome achievement, and close the loop. A great deal of emphasis is placed on the first three steps, but less on the final and perhaps most important step: closing the loop. Assessment is meaningless without putting the findings into action.

Closing the loop is relatively easy when significant resources aren’t needed to make the change or the person making the change has control over the implementation of recommendations. It’s not difficult for me to make a change in one of the programs I direct based on an assessment that I’ve done. Program improvements typically don’t take resources and I have power over the change process. However, implementing change is much more challenging when the recommended improvement is an adjustment to organizational processes. For assessment to have impact at an organizational level, change management needs to be part of closing the loop.

Below are five ways to manage the change that comes from assessment.

  1. Involve key stakeholders from the beginning of the assessment process. Change is much easier when individuals critical to implementing the change have a voice in the assessment process.
  2. Anticipate resistance. Reasons people resist change include perceived diminishment of power, disruption to the status quo, and supposed lack of resources, among others. Anticipating that resistance enables counterarguments to be prepared.
  3. Identify positive impact of the changes. Change will be easier to implement if positive outcomes can be identified. It is especially advantageous to articulate benefits to students, resource savings, and other improvements related to organizational mission.
  4. Identify strategies that will be employed to make change. Change must be an intentional and thoughtful process. Identifying strategies once the recommendation has been made will ease the process and demonstrate attentive commitment to improvement.
  5. Consider phasing in the change over time. Improving process requires time and energy (in addition to other resources). Change may be made easier if it is implemented over time in stages.

Assessment is more than collecting and analyzing data. The most important step is implementing the recommendations. Considering closing the loop as change management may improve the probability that improvement will occur and be sustained over time.


Assessment and Improvement Science

(NOTE: This blog post was also posted at http://www.cas.edu/blog_home.asp?display=73)

Recently, I attended the Assessment Institute in Indianapolis. During a discussion with a friend, she introduced me to the concept of improvement science. I had no idea there was such a field of study, so I immediately searched Google to learn more. After reading a few online articles, I was intrigued enough to purchase a copy of New Directions for Evaluation (2017), number 153, for my flight home, as the issue was dedicated to improvement science.

To those implementing assessment, the concept is familiar. According to Moen, Nolan, and Provost (as cited in Christie, Lemire, & Inkelas, 2017, p. 11), “improvement science is an approach to increasing knowledge that leads to an improvement of a product, process, or system.” That certainly sounds like the continuous improvement purpose of assessment. What is a little different is that improvement science explicitly focuses on systems thinking. Because of this systems perspective, change management is a central principle of the approach, as it is critical for improvement (Christie, Lemire, & Inkelas, 2017).

The journal issue goes into detail regarding the components of improvement science, an operational model, and case studies to illustrate what the authors are describing. But one discussion point jumped out at me. A cycle of testing and learning is foundational to improvement science (Lemire, Christie, & Inkelas, 2017). This cycle includes four steps: Plan, Do, Study, Act (PDSA).

[Figure: The Plan, Do, Study, Act (PDSA) cycle]

In this cycle, improvement occurs in small steps to see how effective implementation is and to better understand concomitant issues. The PDSA cycle is critical to change management, as there are many issues that impact implementation of an improvement, including user buy-in, resources, and sometimes even politics.

The PDSA cycle can easily be applied to assessment in higher education. Once the assessment is complete and recommendations for change are made, those changes should be implemented in small steps, starting first with planning out the execution. After the planning takes place, a small-scale version of the improvement is implemented. Third, that small-scale improvement is assessed. With this information, the improvement is scaled up.

Here is a basic example of how this might look in assessment practice. The orientation office at a small college completed a CAS self-study and learned that its sexual assault prevention program did not achieve the intended learning outcomes. A recommendation in the self-study report is to contract with a professional organization that has developed a program called “Consent4U.” However, the program costs almost half of the entire orientation budget. While the benefits could be great, the orientation director wants to test the program before making the significant financial investment. The director ran a pilot of the program with one of the residence hall learning communities. To understand the change in learning, the director developed a pre- and post-test regarding sexual assault. In the first step of the cycle, “plan,” the director collaborated with the residence hall director to schedule a time and partnered with a faculty member in sociology to create the pre- and post-test. The program was implemented as part of step 2, “do.” After the program was done, the director administered the post-test and “studied” the results. Based on the data, and given the amount of learning students gained from the program, the director decided to implement Consent4U and requested additional funding from the provost. Scaling the program up in this way is the final step, “act.”
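To make the “study” step a bit more concrete, here is a minimal sketch (not from the original post) of how the pre- and post-test comparison might be run. The scores, and the choice of a paired t-test, are illustrative assumptions rather than details from the pilot described above.

```python
# Minimal sketch of the "study" step for a pre/post pilot.
# Assumes each pilot participant has a matched pre- and post-test score;
# the numbers below are made up for illustration.
from scipy import stats

pre_scores = [62, 55, 70, 58, 66, 61, 73, 59]    # hypothetical pre-test scores
post_scores = [78, 64, 85, 71, 80, 69, 88, 75]   # hypothetical post-test scores

# Average gain from pre- to post-test
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)

# Paired t-test: did scores change more than chance alone would suggest?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"Mean gain: {mean_gain:.1f} points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```

A paired comparison is a natural fit here because each student contributes both a pre- and a post-score, so the question is about change within the same people rather than differences between groups.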

Some of the tenets of improvement science mirror those of assessment. However, the Plan, Do, Study, Act model may provide a way to manage the change that comes with making an improvement.

References
Christie, C., Lemire, S., & Inkelas, M. (2017). Understanding the similarities and distinctions between improvement science and evaluation. New Directions for Evaluation, 153, 11-22.

Lemire, S., Christie, C., & Inkelas, M. (2017). The methods and tools of improvement science. New Directions for Evaluation, 153, 23-34.

 


Essential Assessment Resources (Updated September 2017)


When I talk with colleagues new to assessment or work with institutions, I am often asked what resources are available. I have aggregated the list below to answer some of those questions. Each resource serves a different purpose, so I encourage you to review each item on the list to see how it may be helpful to you.

Web Resources

ACPA – Commission for Assessment and Evaluation

Association for Higher Education Effectiveness

Association for the Assessment of Learning in Higher Education

Council for the Advancement of Standards in Higher Education

Internet Resources for Higher Education Outcomes Assessment

NASPA – Assessment, Evaluation, and Research Knowledge Community

National Institute for Learning Outcomes Assessment

Student Affairs Assessment Leaders

 

Journals

Journal of Research and Practice in Assessment

Journal of Student Affairs Inquiry

Texts

Allen, K. R., Elkins, B., Henning, G. W., Bayless, L. A., & Gordon, T. W. (2013). Accreditation and the role of the student affairs professional. Washington, DC: ACPA-College Student Educators International. Available from http://www.myacpa.org/accreditation-and-role-student-affairs-educator

American College Personnel Association (ACPA). (2007). ASK standards: Assessment skills and knowledge content standards for student affairs practitioners and scholars. Washington, D. C.: Author.

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco: Jossey-Bass.

Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (Eds.). (1996). Assessment in practice: Putting principles to work on college campuses. San Francisco: Jossey-Bass.

Bingham, R. P., Bureau, D., & Garrison Duncan, A. (Eds.). (2015). Leading assessment for student success. Sterling, VA: Stylus.

Blimling, G. S. (2013). Challenges of assessment in student affairs. New Directions for Student Services, 2013(142), 5-14.

Bloom, B. S. (1968). Learning for mastery. UCLA: Center for the study of evaluation of instructional programs, (1)2. Los Angeles, CA.

Bourke, B. (2017). Advancing toward social justice via student affairs inquiry. Journal of Student Affairs Inquiry, 2(1).

Bresciani, M. J., Zelna, C. L., and Anderson, J. A. (2004). Assessing student learning and development: A handbook for practitioners. Washington, D.C.: National Association of Student Personnel Administrators.

Bresciani, M. J., Moore Gardner, M., & Hickmott, J. (2010). Demonstrating student success: A practical guide to outcomes-based assessment of learning and development in student affairs. Sterling, VA: Stylus.

Bresciani, M. J. (2011, August). Making assessment meaningful: What new student affairs professionals and those new to assessment need to know (NILOA Assessment Brief: Student Affairs). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Council for the Advancement of Standards in Higher Education. (2015, August). CAS professional standards for higher education (9th ed.). Washington, D.C.: Author.

Council for the Advancement of Standards in Higher Education (2006). Frameworks for assessing learning and development outcomes. Washington, D.C.: Author.

Culp, M. M. & Dungy, G. J. (Eds.). (2012). Building a culture of evidence in student affairs: A guide for leaders and practitioners. Washington, D.C.: National Association of Student Personnel Administrators.

Ewell, P. T. (2009, November). Assessment, accountability, and improvement: Revisiting the tension (NILOA Occasional Paper No.1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Heiser, C., Prince, K., & Levy, J. (2017). Examining critical theory as a framework to advance equity through student affairs assessment. Journal of Student Affairs Inquiry, 2(1).

Henning, G. & Roberts, D. (2016). Student affairs assessment: Theory to practice. Sterling, VA: Stylus.

Keeling, R. (Ed.). (2004). Learning reconsidered. A campus-wide focus on the student experience. Washington, D.C.: National Association of Student Personnel Administrators & American College Personnel Association.

Keeling, R. P (Ed). (2006). Learning reconsidered 2: A practical guide to implementing a campus-wide focus on the student experience. Washington, D.C.: American College Personnel Association, Association of College and Housing Officers—International, Association of College Unions—International, National Academic Advising Association, National Association of Campus Activities, National Association of Student Personnel Administrators, & National Intramural—Recreational Sports Association.

Keeling, R. P., Wall, A. F., Underhile, R., and Dungy, G. J. (2008). Assessment reconsidered. Washington, D. C.: International Center for Student Success and Institutional Accountability.

Kuh, G. D., Ikenberry, S. O., Jankowski, N. A., Cain, T. R., Ewell, P. T., Hutchings, P., & Kinzie, J. (2015). Using evidence of student learning to improve higher education. San Francisco, CA: Jossey-Bass.

Maki, P. L. (2010). Assessing for learning: Building a sustainable commitment across the institution (2nd edition). Sterling, VA: Stylus.

Maki, P. L. (2017). Real-time student assessment: Meeting the imperative for improved time to degree, closing the opportunity gap, and assuring student competencies for the 21st century needs. Sterling, VA: Stylus.

Miller, M. A. (2012, January). From denial to acceptance: The stages of assessment (NILOA Occasional Paper No.13). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Montenegro, E., & Jankowski, N. (2017, January). Equity and assessment: Moving towards culturally responsive assessment (Occasional Paper No. 29). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, improving. San Francisco: Jossey-Bass.

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research. San Francisco: Jossey-Bass.

Roberts, D., & Bailey, K. (Eds.). (2016). New Directions for Student Leadership: No. 151.  Assessing student leadership. San Francisco, CA: Jossey-Bass.

Schuh, J. (Ed.). (2013, Summer). Selected contemporary issues in assessment. New Directions for Student Services, 2013(142).

Schuh, J. H., Upcraft, M. L., & Associates. (2001). Assessment practice in student affairs: An applications manual. San Francisco: Jossey-Bass.

Schuh, J. H., & Associates. (2008). Assessment methods for student affairs. San Francisco: Jossey-Bass.

Schuh, J. H., & Gansemer-Topf, A. M. (2010, December). The role of student affairs in student learning assessment (NILOA Occasional Paper No.7). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Schuh, J., Biddix, J. P., Dean, L. A., & Kinzie, J. (2016). Assessment in student affairs (2nd ed.). San Francisco: Jossey-Bass.

Suskie, L. A. (2009). Assessing student learning: A common sense guide (2nd Ed). San Francisco: Jossey-Bass.

Suskie, L. A. (2014). Five dimensions of quality: A common sense guide to accreditation and accountability.  San Francisco: Jossey-Bass.

Timm, D. M., Davis Barham, J., McKinney, J., & Knerr, A. R. (2013). Assessment in practice: A companion guide to the ASK standards. Washington, DC: ACPA-College Student Educators International. Available from http://www2.myacpa.org/publications

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco: Jossey-Bass.

Urdan, T. (2001). Statistics in plain English. Mahwah, NJ: Lawrence Erlbaum Associates.

Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco: Jossey-Bass.

Yousey-Elsener, K., Bentrim, E., & Henning, G. (Eds.). (2015). Coordinating student affairs divisional assessment: A practical guide. Sterling, VA: Stylus.

Zerquera, D., Pender, J., & Berumen, J. (2017). Participatory action research as social justice framework for assessment in student affairs. The Journal of College and University Student Housing, 43(3), 14-27.


Assessment as a Tool for Organizational Change

NOTE: This blog post is inspired by Margaret Leary. During the ACPA16 Convention she and I discussed ideas she had regarding how assessment can foster organizational change. That conversation led me to learn more about organizational change and how assessment might be related to it. This blog post is a result of that research and contemplation.

Higher education is in a constant state of flux. This change is natural and to be expected. In the past, learning to adapt was sufficient to weather change. But, given the current state of endless unpredictability, that is no longer true. To survive today’s and tomorrow’s ever-evolving higher education landscape, colleges and universities cannot simply adapt reactively to manage change or even proactively to facilitate it – they need to be designed for change (Kezar, 2014).

The structure of colleges and universities makes designing for change challenging. Institutions of higher education are intricate, multi-layered systems. The larger they are, the more complex. A concerted effort is required to reshape a college or university.

Organizational learning is one model for change management. This concept posits that for organizations to be designed for change, they must continually learn what is working and what is not. Thus, by providing staff and faculty with data, information, and inquiry methods, organizations equip them to solve problems and achieve organizational effectiveness (Kuk, Banning, & Amey, 2010).

This is where assessment comes in. The two main purposes of assessment are accountability and improvement, with an overall goal of enacting change to increase effectiveness and efficiency. The last step in the assessment cycle, closing the loop, is critical. If data is not utilized for improvement, then assessment is not really being done (Henning & Roberts, 2016).

Building a culture of assessment fosters the shared belief, values, and behaviors that support using data for decision making. When such a culture exists, infrastructure, systems, and practices are in place that enable assessment to be easily embedded into daily practice. Staff then have the tools to use data to understand how effective and efficient their programs and services are, producing an organization that is constantly learning what is working and what is not. That is an organization designed for change.

Assessment can be a catalyst for organizational change.

References
Henning, G., & Roberts, D. (2016). Student affairs assessment: Theory to practice. Sterling, VA: Stylus.

Kezar, A. (2014). How colleges change: Understanding, leading, and enacting change. New York, NY: Routledge.

Kuk, L., Banning, J., & Amey, M. (2010). Positioning student affairs for sustainable change: Achieving organizational effectiveness through multiple perspectives. Sterling, VA: Stylus.

 


Four Stages of Assessment Competence

Originally published November 3, 2015 on The Student Affairs Collective

Assessment isn’t an activity. It’s a state of mind.

The statement above has been my mantra for the past five years. Too often, assessment is seen as an afterthought rather than an integral part of the program or service planning and implementation process.

Interestingly, this statement didn’t occur to me when I was in the midst of an assessment project, teaching class, facilitating a workshop, or consulting with a division of student affairs. It occurred to me in a grocery store. Yes, in the deli aisle waiting on maple and brown sugar ham, to be exact. I won’t bore you with the details here, but you can read more regarding this “evaluative epiphany” here: http://bit.ly/1MSqT2x.

In this post, I don’t want to talk about the statement itself, but what it represents. Recently, I echoed this refrain during a webinar sponsored by ACPA’s Commission for Assessment and Evaluation. The topic was “Assessing Cultures of Assessment.” (You can access the webinar here by entering your name and email.) At the outset, I suggested that this statement, “Assessment isn’t an activity. It’s a state of mind,” was a definitive sign that a culture of assessment exists in an organization. When assessment is a state of mind, it is infused into every aspect of individual or organizational practice including planning, implementation, and – of course – evaluation. At this stage, assessment becomes an unconscious, embedded element of everyday work.

So, how does one arrive at this destination where assessment is a state of mind?

I think the four stages of competence, originally described as the four stages of learning, and outlined by Linda Adams (2011), can be a helpful guide. These stages are

  • Unconscious incompetence
  • Conscious incompetence
  • Conscious competence
  • Unconscious competence

In the unconscious incompetence stage, people are not engaged in assessment and don’t know why it’s important. They are unable to articulate its value or purpose. This stage described most of our field 10-15 years ago. The prevailing attitude was that assessment was simply a fad and we just needed to “wait it out.”

Once awakened to both the necessity and benefit of engaging in assessment, people realize they need to do it, but are not sure how. This is conscious incompetence. The first step is to identify the actual skills and knowledge needed to perform assessment. The 2nd edition of the ACPA/NASPA Professional Competencies (released in August 2015) provides a framework for skill and knowledge development regarding assessment, evaluation, and research. A great way to develop these skills is to seek out books, workshops, and other resources. Fortunately, more and more resources are available. There are four new student affairs assessment books coming out this academic year. How exciting! One centers on leading assessment for student success and another focuses on coordinating divisional assessment. Two more covering more general assessment practice will come out in early March. Workshops and conferences are another way to build assessment competence. Each summer, ACPA holds its Student Affairs Assessment Institute and NASPA sponsors its Assessment and Persistence Conference. Both associations have special interest groups regarding assessment for networking and professional development (ACPA’s Commission for Assessment and Evaluation and NASPA’s Assessment, Evaluation, and Research Knowledge Community). In addition, many master’s level preparation programs have assessment courses to help students move from this stage of competence to the next.

As people gain skill and knowledge, they become consciously competent. They value the assessment process and are continually attentive when performing it. Assessment takes effort and concentration at this stage but continues to become more comfortable and more frequent.

At the apex of this competence hierarchy is unconscious competence. At this stage, assessment has been performed so often that it becomes habit or second nature and is integrated into daily practice and processes. It is important to note that learning should still take place. Even in assessment, lifelong learning is important as new scholarship is constantly being created.

As I mentioned at the beginning of this post, the statement Assessment isn’t an activity. It’s a state of mind came to me in a grocery store. In the deli aisle, I recognized that I was shopping in a way that mirrored the assessment process. My goal was to be both effective and efficient, and those goals shaped my actions from planning to execution. Assessment had become so unconscious that it was integrated into other parts of my life. While you may not be as much of a geek as I am and want assessment to infiltrate your personal life, I do hope you develop your assessment skill and knowledge to the point that you are unconsciously competent.

At what stage are you? What can you do to get to the next level?

References
Adams, L. (2011). Learning a new skill is easier said than done. [Blog]. Retrieved from http://www.gordontraining.com/free-workplace-articles/learning-a-new-skill-is-easier-said-than-done/


Moving From Serendipity to Intentionality in Student Learning

NOTE: This was originally posted to ACPA Developments Volume 13, Summer 2015 Issue (June 2015)

Please visit ACPA Video On Demand where I discuss this concept in a video interview.

On a Friday night in late April in 1987, during the spring of my sophomore year, I was attending a movie with friends in Brody Hall at Michigan State University. Back then, the Residence Hall Association screened movies in select lecture halls across campus. We didn’t have Netflix back then. We didn’t even have cable.

After the movie an event happened that changed my life forever.

On my way out of the lecture hall I ran into a friend of mine, Stacy Huffman, from my hometown of Saginaw, MI, who had also attended the movie. As friends who hadn’t seen each other in a while (it was a 21 acre campus of 30,000+ students), we caught each other up on our lives. As the semester was nearing an end, we discussed our summer plans in Saginaw. Stacy said that she was going to be staying on campus working for the Academic Orientation Program. She continued that they were looking for one more male orientation leader and encouraged me to apply. The interviews were the next day, and there was an opening at 8 a.m.

Wiping sleep from my eyes, I got up early, ate my Wheaties, and headed to the interview. A few days later I was notified that I had been selected as an orientation leader. Serendipity opened the door to my first student affairs job and my career – although I didn’t comprehend it at the time.

Fast forward to June of 1993. Frustration and anxiety were setting in because I had recently graduated with my master’s degree from MSU and, unlike my classmates, was still job searching. And searching. And searching. There had been a few phone interviews and even a couple of campus interviews, but nothing panned out. Self-doubt became all-consuming as I wondered why no one wanted to hire me despite what I thought were excellent grades and extensive experience.

When hope was waning, serendipity struck again.

I received a call from the Department of Residence Life at the University of New Hampshire for an on-campus interview. A hall director job at UNH had been my “perfect” job from the start of my search. Unfortunately for me, a few weeks earlier, shortly after the phone interview, I had been told that they had hired other candidates for their open positions. However, a residence hall director had since decided to leave UNH in June, opening up a position. I jumped at the chance for a campus interview. Two months later I was packing up a U-Haul to make the trip to Durham, New Hampshire for my “perfect” job. This position created the foundation of who I am as an educator today. Plus, UNH was where I met my current partner. Call it destiny. Call it kismet. Or maybe it was just chance. But this result was certainly not intentional.

The orientation leader position began my career in student affairs, but working for the Department of Residence Life at the University of New Hampshire changed how I approached my job. While I learned many things during my six years in residence life, one word has stuck with me and was the concept that compelled me into assessment work. That word is “intentionality.” During numerous staff and supervision meetings, hall directors discussed how we were being intentional in our outreach to students and in our programming. Intentionality became a mantra for my work then and the driving force for the assessment work I have been undertaking for the past 15 years.

Synonyms for serendipity include chance and accident, while synonyms for intentional include designed, deliberate, and planned. While not antonyms of one another, the concepts of serendipity and intentionality are opposed to each other. I think serendipity versus intentionality is a tension we continue to struggle with today in regard to learning in higher education. All too often we assume that learning is happening outside of the classroom, and we aren’t doing as much as we can to intentionally foster it. We can no longer rely on serendipity to ensure student learning and success.

External demands for accountability are increasing the need for intentionality. The completion agenda dominates the national discourse of higher education. Students, parents, and legislators are questioning the return on investment of a college education and want to know what students are learning after paying exorbitant amounts of money. College administrators are questioning the value of student affairs in an era of service provision where students are customers and clients. During a program session with college presidents at ACPA15, when asked what the priority should be for student affairs, all panelists stated that college student educators need to be able to demonstrate how they and their work positively contribute to student learning and retention. The completion agenda at the federal and state levels is a major thrust behind the current accountability movement in higher education. This emphasis on retention is not simply because of the individual benefits for students who graduate but because of the financial impact of tuition revenue and state appropriations for colleges and universities.

More important than external calls for accountability are the internal calls for accountability that originate from inside each one of us as college student educators. We chose this profession because of the desire and the need to positively impact the lives of college students. Thus, we strive to do the best job we can to assist students. As a field, we need to make intentionality an interwoven thread in the fabric of everyday practice to ensure student success, both academically and personally.

Intentionality isn’t rocket science. It can be explained in a four-step process outlined by Linda Suskie (2009). The first step is to begin with what you want students to achieve (i.e., outcomes, which can be learning, operational, or program outcomes). Once outcomes are identified, in step 2 existing literature and other evidence are used to identify strategies to foster those outcomes. Step 3 is to collect and analyze data to determine whether the outcomes are achieved and how outcome achievement can be improved. The final step is the most important: closing the loop by making improvements. Intentionality is a process, not a destination.


Figure 1. Assessment cycle by Suskie, L. (2009). Assessing student learning: A common sense guide (2nd Ed.). San Francisco: Jossey-Bass.

How Can ACPA Help You Be More Intentional?
ACPA’s focus is student learning. As stated in our mission “ACPA supports and fosters college student learning through the generation and dissemination of knowledge, which informs policies, practices and programs for student affairs professionals and the higher education community” (ACPA, 2015). There are many ways ACPA can help you foster student learning, development, and success.

Individuals can leverage ACPA to help foster and support student learning by accessing the scholarship that is generated through various outlets. ACPA’s signature publication is the Journal of College Student Development. There are articles in each issue pertinent to faculty and practitioners alike. Previous issues include scholarship regarding experiences of Asian American and Latino/a students at an HBCU, the academic performance of Black emerging adults, a method to increase the grade point averages of fraternity members, and others.

About Campus is a scholarly publication directed towards practitioners. This bi-monthly magazine provides insights to improve practice in higher education. Past issues include articles addressing positive psychology, long-term success in work and life, as well as high impact practices.

In addition to these publications and Developments, which you are reading now, ACPA also sponsors books and monographs. This past year ACPA published Job One 2.0: Understanding the Next Generation of Student Affairs Professionals, which focuses on the first jobs of college student educators, as well as Working With Students in Community Colleges: Contemporary Strategies for Bridging Theory, Research, and Practice, which provides approaches to help community college students be successful. Additional publications can be found here. ACPA books and monographs, coupled with our other publications, provide faculty and practitioners with a library of research and scholarship to inform further research and practice.

Another major way that ACPA supports student learning, development, and success is helping college student educators across the spectrum of higher education bridge theory to practice. Some of this work is done through our acclaimed professional development events. Most of our activities are driven by curricula rooted in research. Upcoming events include

You can find additional professional development events here.

Theory to practice is also addressed in other venues including ACPA On Demand and Student Affairs Live, sponsored by ACPA. ACPA On Demand is a collection of pre-recorded videos covering a variety of topics relevant to college student educators. Student Affairs Live is a weekly talk show viewed via Google Hangouts covering critical emergent issues in higher education. Recorded versions of the shows are available on the Student Affairs Live website, and podcasts are available in iTunes.

As college student educators, we need to be much more intentional in how we cultivate student learning and development. While learning may happen by serendipity, we can’t rely on that. Our students – our future – are too important to rely on chance. ACPA is your go-to source for research, scholarship, and proven practices for fostering student success. Tap into the resources now!

References
ACPA (2015). Mission, vision, and values. Retrieved on April 8, 2015 from http://www.myacpa.org/values

Suskie, L. (2009). Assessing student learning: A common sense guide (2nd Ed). San Francisco: Jossey-Bass.
