Organized Anarchies: 13 Steps to Building a Learning Organization

This post was originally published in the New England Journal of Higher Education

In many ways, higher education has not changed in the nearly 1,000 years since the first university was founded in Bologna, Italy, in 1088. Many courses still have professors or “masters” lecturing in front of students, with exams requiring the reproduction of facts learned in lectures. But in other ways, higher education changes daily. A brief perusal of headlines from the Chronicle of Higher Education or Inside Higher Ed demonstrates the changing landscape of colleges and universities. Given this perpetual state of chaos, colleges and universities not only need to withstand and manage change, they need to leverage that environmental change, and sometimes even foster change, to better meet the needs of diverse stakeholders. To achieve their goals, colleges and universities need to become “learning organizations.”

Coined by Richard Pascale in the 1980s, the term learning organization was popularized by Peter Senge in his 1990 book The Fifth Discipline to describe organizations that can quickly adapt to shifts in the market and succeed. Senge wrote that learning organizations are “organizations where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning to see the whole together.”

Thinking about organizations as learning organizations is a “systems thinking” approach, which requires consideration regarding how individuals interact with one another and with the structures of the organization. The components of this system must work in concert with each other, similar to the mechanisms of a clock or the organs of a human body, to achieve agreed-upon goals and outcomes.

(NEJHE, when it was known as Connection: New England’s Journal of Higher Education, explored “Learning Organizations” in this piece by James JF Forest, a Tufts University professor, then at the U.S. Military Academy at West Point.)

In addition to effectively adapting to change, learning organizations realize a number of other benefits, identified by Capilano University Director of Continuing Studies and Executive Education Karmen Blackwood: increased innovation; a nimble response to external pressures; an increased pace of change; greater efficiency in the use of resources; improved effectiveness; increased staff satisfaction; a supportive team environment; leadership development at all levels of the organization; and a culture of inquiry, learning and knowledge-sharing.

Given the benefits of learning organizations, the next question is how to create a learning organization. Senge’s Five Disciplines provide the first insights into this question. The Five Disciplines are approaches for achieving a learning organization and can be viewed as steps to transforming organizations. Each is explained below.

The first discipline, personal mastery, is a commitment by employees to individual, self-directed learning so they can do their jobs well.

Mental models, or assumptions regarding the world and how the organization should work, must be made explicit and tested.

A shared vision created by individuals in the organization represents a collective purpose and goals and provides direction for a learning organization.

While individual learning is critical, team learning that is aligned with the shared vision and goals is vital to organizational success.

Finally, and perhaps most importantly, the organization needs to be understood as a system where each component part is interconnected and interdependent upon others.

A department of education at a college can serve as an example. Each faculty member in the department has a specialty such as special education, secondary education or higher education. For personal mastery, each faculty member would engage in professional development to increase their skill and knowledge in their unique discipline. Since each faculty member approaches teaching and advising through the lens of their discipline, with its own views on the most important topics to teach or pedagogies to implement, it is important for everyone to explicitly state their assumptions to increase communication. Once these mental models and assumptions are made clear, it will be easier for the group to construct a shared vision for educating and advising students from their own personal visions. Together, as a team, the faculty can engage in professional development that will help them achieve their shared goals. Developing the academic department into a learning organization this way requires the understanding that each component of the department is part of a system. No one person can educate students alone; by leveraging the strengths of each member, the faculty together can provide a holistic experience that prepares students to be educational change leaders.

In addition to the five disciplines, there are additional characteristics that learning organizations demonstrate.

Change and disruption are embraced and celebrated, not resisted, because they are seen as opportunities for improvement. Conflict between individuals is viewed as normal. It is inevitable that people will see problems differently and also have varied solutions to them. Rather than being avoided or quashed, conflict is acknowledged and managed effectively, as growth can arise from it. There is an organizational commitment to continuous improvement and learning.

Individual and collective learning is ingrained in the fabric of the organization. It is valued, expected, and made time for. Feedback mechanisms are integrated into programs and services. Assessment is not seen as an “add-on” activity, but rather a part of everyday practice. An example would be a new course-registration process. Rather than waiting for a semester or even a year to ask students, faculty and staff about the new process, data is gathered throughout it. Data regarding the length of time it takes students to register, the number of students who register on time, and even the number of calls to the IT helpdesk could be reviewed prior to surveying users, as sketched below.
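
To picture this kind of embedded feedback, here is a minimal sketch in Python of reviewing in-process registration data. The file name, column names, and deadline are hypothetical assumptions for illustration, not part of any real registration system.

```python
# Illustrative sketch: reviewing in-process course-registration data.
# "registrations.csv", its columns, and the deadline are hypothetical.
import csv
from datetime import datetime

DEADLINE = datetime(2024, 4, 15, 23, 59)  # assumed registration deadline


def load_records(path="registrations.csv"):
    # assumed columns: student_id, start_time, submit_time (ISO timestamps)
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def review(records):
    minutes = []
    on_time = 0
    for r in records:
        start = datetime.fromisoformat(r["start_time"])
        submit = datetime.fromisoformat(r["submit_time"])
        minutes.append((submit - start).total_seconds() / 60)  # minutes to register
        if submit <= DEADLINE:
            on_time += 1
    return {
        "students_registered": len(records),
        "avg_minutes_to_register": round(sum(minutes) / len(minutes), 1) if minutes else 0,
        "pct_on_time": round(100 * on_time / len(records), 1) if records else 0,
    }


if __name__ == "__main__":
    print(review(load_records()))
```

Metrics like these could be reviewed weekly while registration is still open, long before any end-of-term survey goes out.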

Data results are value-free. People are not penalized or judged for poor results. Instead, negative assessment results and feedback are regarded as opportunities for improvement. Failure is encouraged and celebrated as a natural step toward success. Individuals are encouraged to learn from failure by failing fast, failing often and failing forward.

A learning organization can be large or small. It can be a one-person office, a 25-person department, a division of offices, or even an entire college or university. Using the Five Disciplines and the additional characteristics of learning organizations as a foundation, specific steps can be identified to build an organization that is constantly learning and improving.

Here are 13 helpful steps …

  1. First, the organization must be viewed as a system of interconnected parts similar to the human body. Each part of the body depends on the others.
  2. Through a process that includes everyone in the organization from custodian to president, a shared vision must be created.
  3. The team has to be developed with the 4C’s: connection, community, cooperation, and collaboration. Team members must build relationships and trust so that they can work together effectively and efficiently.
  4. Intentional spaces and opportunities for inquiry, reflection and learning must be constructed to foster a culture of learning, but also to allow learning to actually occur. These opportunities may include group brainstorming and problem-solving sessions, personal learning and individual reflection time, or reading and discussion groups.
  5. Mental models must be challenged. These unconscious assumptions regarding how the organization should operate and how issues should be addressed must be made explicit without judgment so that ways of collaborating and problem-solving can be implemented.
  6. A shared language must also be created. The organization needs a common vocabulary with a glossary agreed upon by all members so they can communicate effectively and efficiently. Collective learning is difficult if words, terms and concepts are not understood by everyone.
  7. Developmental failure must be encouraged. Failure that leads to learning should be seen as an essential step toward success. To be most useful, failure must happen quickly and often to reap rewards.
  8. Prototyping is one approach to intentional, developmental failure. This form of pilot testing allows an organization to learn ways to address needs and solve problems without fully scaling up a product or service. This approach also provides the opportunity to understand interoperability and the impacts of the new or revised product or service on other units in the organization. An example of prototyping would be an institution that wishes to implement a new educational program for all incoming students regarding alcohol abuse. Before implementing such a program for an entire incoming class, an office for alcohol and other drugs would develop and pilot test the new program with a small group of students. This pilot testing would provide an opportunity to ensure fidelity as well as to understand any concomitant issues with resources and relationships with other campus offices.
  9. While closing the loop, or making improvements, is the most important step in assessment, not much thought is typically put into it. For recommendations from assessment to be implemented effectively, making improvements should be viewed and treated as a change management process. Improvements do not happen easily. This type of change requires resource allocation or reallocation, changes in processes and practices, and shifts in priorities.
  10. Collaborative capacity-building must be provided to staff across the organization. This training should be aligned with the shared vision and goals. It may be helpful to task a professional development committee with the development of a curriculum for the unit striving to become a learning organization.
  11. Feedback loops must be integrated within products and services. Assessment cannot be an activity that is completed at the end of a program or service. Feedback must be incorporated into the delivery of that program or service to provide ongoing data.
  12. Knowledge must be generated. Feedback provides data, but aggregated feedback needs to be synthesized into information that can be used by the organization.
  13. Individual and collective learning should be celebrated to reinforce the activity.

In their work A Garbage Can Model of Organizational Choice, scholars Michael D. Cohen, James G. March and Johan P. Olsen called colleges and universities “organized anarchies.” In times of constant change like today, these organized anarchies become even more chaotic with multiple external pressures, multiple priorities and multiple stakeholders who hold multiple interests. Colleges and universities that are learning organizations, with individuals and departments working in synchrony and using data to make improvements toward a shared vision, are more likely to succeed in a competitive market and achieve their goals. The product of colleges and universities is learning, and now they must use that product for their own success.


Assessment as Learning

NOTE: This was originally posted here.

About 16 months ago, our institution started an institutional assessment steering committee (IASC) as a vehicle to improve assessment on campus. Composed of representatives from administrative and academic units across campus, the committee has been engaging in a number of activities to help increase confidence and capacity in assessment and to begin building a culture of assessment.

While we are providing training activities, including a successful regional assessment conference in May 2017 and fun assessment games during an all-campus meeting, building assessment processes and systems, and even investing in technology to help support this work, a culture of assessment is not emerging. Assessment still seems to be ad hoc and isolated. There appears to be some resistance to integrating assessment into practice. I believe this is due to three things. First, faculty and staff perceive assessment as a burden imposed by others, either the college administration or NEASC, our regional accrediting body. Assessment is something we MUST do, not something we WANT to do. Another reason for resistance is a lack of understanding of the concept. Many believe that assessment is all about numbers and statistics, not about stories regarding how effective activities are. A final reason is an aversion to the term assessment itself, which likely relates to the feeling that assessment is being imposed.

To address these issues, the IASC has stopped using the term “assessment” and begun using “learning” in its place. We’ve realized that assessment really is just learning – learning if goals and outcomes are achieved and if the strategies being used are working.

Assessment isn’t rocket science. It is an inquiry process, beginning with a question. Questions could cover topics such as: Are students in my class learning how mitosis works? What are students learning by being resident assistants? Are the retention strategies working to increase first-year to second-year retention? Once the question is asked, data can be gathered and analyzed to answer it, as in the brief sketch below. Once the question is answered, further action steps can be developed and implemented to address any shortcomings.
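
As a minimal sketch of that gather-and-analyze step, the snippet below computes first-year to second-year retention for two hypothetical cohorts; the counts are invented for illustration, not real institutional data.

```python
# Illustrative only: hypothetical cohort counts, not real data.
cohorts = {
    2021: {"entered": 500, "returned": 370},  # before the new retention strategies
    2022: {"entered": 510, "returned": 398},  # after the new retention strategies
}

for year, c in sorted(cohorts.items()):
    rate = c["returned"] / c["entered"]
    print(f"{year} cohort: {rate:.1%} first-to-second-year retention")
```

A change in the rate does not by itself prove the strategies caused it, but it answers the question well enough to prompt the next action step.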

Thinking about assessment as learning helps demystify the concept, lessens aversion to the word, and decreases resistance, paving the way for assessment to become an activity freely engaged in on a daily basis as it is embedded into everyday practice.

Reframing the concept can make all of the difference.


Managing Change to Close the Loop

While there are variations, the basic assessment cycle comprises four steps: identify outcomes, develop strategies to achieve the outcomes, gather and interpret data related to outcome achievement, and close the loop. A great deal of emphasis is placed on the first three steps, but less on the final and perhaps most important step: closing the loop. Assessment is meaningless without putting the findings into action.

Closing the loop is relatively easy when significant resources aren’t needed to make the change or when the person making the change has control over the implementation of recommendations. It’s not difficult for me to make a change in one of the programs I direct based on an assessment that I’ve done. Program improvements typically don’t take resources, and I have power over the change process. However, implementing change is much more challenging when the recommended improvement is an adjustment to organizational processes. For assessment to have impact at an organizational level, change management needs to be part of closing the loop.

Below are five ways for managing change from assessment.

  1. Involve key stakeholders from the beginning of the assessment process. Change is much easier when individuals critical to implementing the change have a voice in the assessment process.
  2. Anticipate resistance. Reasons people resist change include perceived diminishment of power, disruption to the status quo, or a supposed lack of resources, among others. Anticipating that resistance enables counterarguments to be proposed.
  3. Identify positive impact of the changes. Change will be easier to implement if positive outcomes can be identified. It is especially advantageous to articulate benefits to students, resource savings, and other improvements related to organizational mission.
  4. Identify strategies that will be employed to make change. Change must be an intentional and thoughtful process. Identifying strategies once the recommendation has been made will ease the process and demonstrate attentive commitment to improvement.
  5. Consider phasing in the change over time. Improving process requires time and energy (in addition to other resources). Change may be made easier if it is implemented over time in stages.

Assessment is more than collecting and analyzing data. The most important step is implementing the recommendations. Considering closing the loop as change management may improve the probability that improvement will occur and be sustained over time.


Assessment and Improvement Science

(NOTE: This blog post was also posted on http://www.cas.edu/blog_home.asp?display=73)

Recently, I attended the Assessment Institute in Indianapolis. During a discussion, a friend introduced me to the concept of improvement science. I had no idea there was such a field of study, so I immediately searched Google to learn more. After reading a few online articles, I was intrigued enough to purchase, for my flight home, a copy of New Directions for Evaluation (2017), number 153, as the issue was dedicated to improvement science.

To those implementing assessment, the concept is familiar. According to Moen, Nolan, and Provost (as cited in Christie, Lemire, & Inkelas, 2017, p. 11), “improvement science is an approach to increasing knowledge that leads to an improvement of a product, process, or system.” That certainly sounds like the continuous improvement purpose of assessment. What is a little different is that improvement science explicitly focuses on systems thinking. Because of this systems approach, a central principle is change management, as it is critical for improvement (Christie, Lemire, & Inkelas, 2017).

The journal issue goes into detail regarding the components of improvement science, an operational model, and case studies to illustrate what the authors are describing. But one discussion point jumped out at me. A cycle of testing and learning is foundational to improvement science (Lemire, Christie, & Inkelas, 2017). This cycle includes four steps: Plan, Do, Study, Act (PDSA).

[Figure: The Plan, Do, Study, Act (PDSA) cycle]

In this cycle, improvement occurs in small steps to see how effective implementation is and to better understand concomitant issues. The PDSA cycle is critical to change management as there are many issues that impact implementation of an improvement including user buy-in, resources, and sometimes even politics.

The PDSA cycle can easily be applied to assessment in higher education. Once the assessment is complete and recommendations for change are made, those changes should be implemented in small steps starting first with planning out the execution. After the planning takes place, a small-scale version of the improvement is implemented. Third, there would be assessment of that small-scale improvement. With this information, the improvement is scaled up.

Here is a basic example of how this might look in assessment practice. The orientation office at a small college completed a CAS self-study and learned that its program on sexual assault prevention did not achieve the intended learning outcomes. A recommendation in the self-study report is to contract with a professional organization that has developed a program called “Consent4U.” However, the program costs almost half of the entire orientation budget. While the benefits could be great, the orientation director wants to test the program before making the significant financial investment. The director ran a pilot of the program with one of the residence hall learning communities. To understand change in learning, the director developed a pre- and post-test regarding sexual assault. In the first step of the cycle, “plan,” the director collaborated with the residence hall director to schedule a time and partnered with a faculty member in sociology to create the pre- and post-test. The program was implemented as part of step 2, “do.” After the program was done, the director administered the post-test and “studied” the results. Based on the data, the director determined that, given the amount of learning students obtained from this program, the office would implement Consent4U and requested additional funding from the provost for the program.
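
To make the “study” step concrete, here is a minimal sketch of how the director might summarize the pilot’s pre- and post-test results. The file name, column names, and scoring are hypothetical assumptions; the real analysis would depend on how the instrument was built.

```python
# Illustrative sketch of the PDSA "study" step: summarizing pilot pre/post scores.
# "consent4u_pilot.csv" and its columns (student_id, pre_score, post_score) are assumed.
import csv
from statistics import mean


def load_scores(path="consent4u_pilot.csv"):
    pre, post = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pre.append(float(row["pre_score"]))
            post.append(float(row["post_score"]))
    return pre, post


def study(pre, post):
    gains = [after - before for before, after in zip(pre, post)]
    return {
        "students_in_pilot": len(gains),
        "pre_mean": round(mean(pre), 1),
        "post_mean": round(mean(post), 1),
        "mean_gain": round(mean(gains), 1),
        "pct_improved": round(100 * sum(g > 0 for g in gains) / len(gains), 1),
    }


if __name__ == "__main__":
    pre, post = load_scores()
    print(study(pre, post))
```

Results like these would then inform the “act” step: scale up, adjust, or abandon the program.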

Some of the tenets of improvement science mirror those of assessment. However, the Plan, Do, Study, Act model may provide a way to manage the change that comes with making an improvement.

References
Christie, C., Lemire, S., & Inkelas, M. (2017). Understanding the similarities and distinctions between improvement science and evaluation. New Directions for Evaluation, 153, 11-22.

Lemire, S., Christie, C., & Inkelas, M. (2017). The methods and tools of improvement science. New Directions for Evaluation, 153, 23-34.

 


Essential Assessment Resources (Updated September 2017)


When I talk with colleagues new to assessment or work with institutions, I am often asked what resources are available. I have aggregated the list below to answer some of those questions. Each resource serves a different purpose, so I encourage you to review each item on the list to see how it may be helpful to you.

Web Resources

ACPA – Commission for Assessment and Evaluation

Association for Higher Education Effectiveness

Association for the Assessment of Learning in Higher Education

Council for the Advancement of Standards in Higher Education

Internet Resources for Higher Education Outcomes Assessment

NASPA – Assessment, Evaluation, and Research Knowledge Community

National Institute for Learning Outcomes Assessment

Student Affairs Assessment Leaders

 

Journals

Journal of Research and Practice in Assessment

Journal of Student Affairs Inquiry

Texts

Allen, K. R., Elkins, B., Henning, G. W., Bayless, L. A., & Gordon, T. W. (2013). Accreditation and the role of the student affairs professional. Washington, DC: ACPA-College Student Educators International. Available from http://www.myacpa.org/accreditation-and-role-student-affairs-educator

American College Personnel Association (ACPA). (2007). ASK standards: Assessment skills and knowledge content standards for student affairs practitioners and scholars. Washington, D. C.: Author.

Angelo, T. & Cross, K. (1993). Classroom assessment techniques: A handbook for college teachers. San Francisco: Jossey-Bass.

Banta, T. W., Lund, J. P., Black, K. E., Oblander, F. W. (Eds.) (1996). Assessment in practice: Putting principles to work on college campuses. San Francisco: Jossey-Bass.

Bingham, R. P., Bureau, D., & Garrison Duncan, A. (Eds.). (2015). Leading assessment for student success. Sterling, VA: Stylus.

Blimling, G. S. (2013). Challenges of assessment in student affairs. New Directions for Student Services, 2013(142), 5-14.

Bloom, B. S. (1968). Learning for mastery. UCLA: Center for the study of evaluation of instructional programs, (1)2. Los Angeles, CA.

Bourke, B. (2017). Advancing toward social justice via student affairs inquiry. Journal of Student Affairs Inquiry, 2(1).

Bresciani, M. J., Zelna, C. L., and Anderson, J. A. (2004). Assessing student learning and development: A handbook for practitioners. Washington, D.C.: National Association of Student Personnel Administrators.

Bresciani, M. J., Moore Gardner, M., & Hickmott, J. (2010). Demonstrating student success: A practical guide to outcomes-based assessment of learning and development in student affairs. Sterling, VA: Stylus.

Bresciani, M. J. (2011, August). Making assessment meaningful: What new student affairs professionals and those new to assessment need to know (NILOA Assessment Brief: Student Affairs). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Council for the Advancement of Standards in Higher Education. (2015, August). Professional standards in higher education (9th ed.). Washington, D.C.: Author.

Council for the Advancement of Standards in Higher Education (2006). Frameworks for assessing learning and development outcomes. Washington, D.C.: Author.

Culp, M. M. & Dungy, G. J. (Eds.). (2012). Building a culture of evidence in student affairs: A guide for leaders and practitioners. Washington, D.C.: National Association of Student Personnel Administrators.

Ewell, P. T. (2009, November). Assessment, accountability, and improvement: Revisiting the tension (NILOA Occasional Paper No.1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Heiser, C., Prince, K., & Levy, J. (2017). Examining critical theory as a framework to advance equity through student affairs assessment. Journal of Student Affairs Inquiry, 2(1).

Henning, G. & Roberts, D. (2016). Student affairs assessment: Theory to practice. Sterling, VA: Stylus.

Keeling, R. (Ed.). (2004). Learning reconsidered. A campus-wide focus on the student experience. Washington, D.C.: National Association of Student Personnel Administrators & American College Personnel Association.

Keeling, R. P (Ed). (2006). Learning reconsidered 2: A practical guide to implementing a campus-wide focus on the student experience. Washington, D.C.: American College Personnel Association, Association of College and Housing Officers—International, Association of College Unions—International, National Academic Advising Association, National Association of Campus Activities, National Association of Student Personnel Administrators, & National Intramural—Recreational Sports Association.

Keeling, R. P., Wall, A. F., Underhile, R., and Dungy, G. J. (2008). Assessment reconsidered. Washington, D. C.: International Center for Student Success and Institutional Accountability.

Kuh, G. D., Ikenberry, S. O., Jankowski, N. A., Cain, T. R., Ewell, P. T., Hutchings, P., & Kinzie, J. (2015). Using evidence of student learning to improve higher education. San Francisco, CA: Jossey-Bass.

Maki, P. L. (2010). Assessing for learning: Building a sustainable commitment across the institution (2nd edition). Sterling, VA: Stylus.

Maki, P. L. (2017). Real-time student assessment: Meeting the imperative for improved time to degree, closing the opportunity gap, and assuring student competencies for the 21st century needs. Sterling, VA: Stylus.

Miller, M. A. (2012, January). From denial to acceptance: The stages of assessment (NILOA Occasional Paper No.13). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Montenegro, E., & Jankowski, N. (2017, January). Equity and assessment: Moving towards culturally responsive assessment (Occasional Paper No. 29). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).

Palomba, Catherine A., & Banta, Trudy W. (1999). Assessment essentials: Planning, implementing, improving. San Francisco: Jossey-Bass.

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research. San Francisco: Jossey-Bass.

Roberts, D., & Bailey, K. (Eds.). (2016). New Directions for Student Leadership: No. 151.  Assessing student leadership. San Francisco, CA: Jossey-Bass.

Schuh, J. (Ed.). (2013, Summer). Selected contemporary issues in assessment. New Directions for Student Services, 2013(142).

Schuh, J. M., Upcraft, M. L. and Associates. (2001). Assessment practice in student affairs: An applications manual. San Francisco: Jossey-Bass. 

Schuh, J. M., & Associates (2008). Assessment methods for student affairs. San Francisco: Jossey-Bass.

Schuh, J. H., & Gansemer-Topf, A. M. (2010, December). The role of student affairs in student learning assessment (NILOA Occasional Paper No.7). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Schuh, J., Biddix, J. P., Dean, L. A., & Kinzie, J. (2016). Assessment in student affairs (2nd ed.). San Francisco: Jossey-Bass.

Suskie, L. A. (2009). Assessing student learning: A common sense guide (2nd Ed). San Francisco: Jossey-Bass.

Suskie, L. A. (2014). Five dimensions of quality: A common sense guide to accreditation and accountability.  San Francisco: Jossey-Bass.

Timm. D. M., Davis Barham. J., McKinney, J., & Knerr, A. R. (2013). Assessment in practice: A companion guide to the ASK standards. Washington, DC: ACPA-College Student Educators International. Available from http://www2.myacpa.org/publications

Upcraft, M.L and Schuh, J.H. (1996). Assessment in student affairs: a guide for practitioners. San Francisco: Jossey-Bass.

Urdan, T. (2001). Statistics in plain English. Mahwah, N.J.: Lawrence Erlbaum Associates.

Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco: Jossey-Bass.

Yousey-Elsener, K., Bentrim, E., & Henning, G. (Eds.). (2015). Coordinating student affairs divisional assessment: A practical guide. Sterling, VA: Stylus.

Zerquera, D., Pender, J., & Beruhen, J. (2017). Participatory action research as social justice framework for assessment in student affairs. The Journal of College and University Student Housing, 43(3), 14-27.


Assessment as a Tool for Organizational Change

NOTE: This blog post is inspired by Margaret Leary. During the ACPA16 Convention, she and I discussed ideas she had regarding how assessment can foster organizational change. That conversation led me to learn more about organizational change and how assessment might be related to it. This blog post is a result of that research and contemplation.

Higher education is in a constant state of flux. This change is natural and to be expected. In the past, learning to adapt was sufficient to weather change. But given the current state of endless unpredictability, that is no longer true. To survive today’s and tomorrow’s ever-evolving higher education landscape, colleges and universities cannot simply adapt reactively to manage change or even proactively to facilitate it – they need to be designed for change (Kezar, 2014).

The structure of colleges and universities makes designing for change challenging. Institutions of higher education are intricate, multi-layered systems. The larger they are, the more complex. A concerted effort is required to reshape a college or university.

Organizational learning is one model for change management. This concept posits that for organizations to be designed for change, they must continually learn what is working and what is not. Thus, by providing staff and faculty with data, information, and inquiry methods, organizations equip them to solve problems and achieve organizational effectiveness (Kuk, Banning, & Amey, 2010).

This is where assessment comes in. The two main purposes of assessment are accountability and improvement, with an overall goal of enacting change to increase effectiveness and efficiency. The last step in the assessment cycle, closing the loop, is critical. If data is not utilized for improvement, then assessment is not really being done (Henning & Roberts, 2016).

Building a culture of assessment fosters the shared beliefs, values, and behaviors that support using data for decision making. When such a culture exists, infrastructure, systems, and practices are in place that enable assessment to be easily embedded into daily practice. Staff then have tools to use data to understand how effective and efficient their programs and services are, producing an organization that is constantly learning what is working and what is not. And this is an organization designed for change.

Assessment can be a catalyst for organizational change.

References
Henning, G. & Roberts, D. (2016). Student affairs assessment: Theory to practice. Sterling, VA: Stylus.

Kezar, A. (2014). How colleges change: Understanding, leading, and enacting change. New York, NY: Routledge.

Kuk, L., Banning, J., & Amey, M. (2010). Positioning student affairs for sustainable change: Achieving organizational effectiveness through multiple perspectives. Sterling, VA: Stylus.

 


Four Stages of Assessment Competence

Originally published November 3, 2015 on The Student Affairs Collective

Assessment isn’t an activity. It’s a state of mind.

The statement above has been my mantra for the past five years. Too often, assessment is seen as an afterthought rather than an integral part of the program or service planning and implementation process.

Interestingly, this statement didn’t occur to me when I was in the midst of an assessment project, teaching class, facilitating a workshop, or consulting with a division of student affairs. It occurred to me in a grocery store. Yes, in the deli aisle waiting on maple and brown sugar ham, to be exact. I won’t bore you with the details here, but you can read more regarding this “evaluative epiphany” here: http://bit.ly/1MSqT2x.

In this post, I don’t want to talk about the statement itself, but what it represents. Recently, I echoed this refrain during a webinar sponsored by ACPA’s Commission for Assessment and Evaluation. The topic was “Assessing Cultures of Assessment.” (You can access the webinar here by entering your name and email.) At the outset, I suggested that this statement, “Assessment isn’t an activity. It’s a state of mind,” was a definitive sign that a culture of assessment existed in an organization. When assessment is a state of mind, it is infused into every aspect of individual or organizational practice, including planning, implementation, and – of course – evaluation. At this stage, assessment becomes an unconscious, embedded element of everyday work.

So, how does one arrive at this destination where assessment is a state of mind?

I think the four stages of competence, originally described as the four stages of learning, and outlined by Linda Adams (2011), can be a helpful guide. These stages are

  • Unconscious incompetence
  • Conscious incompetence
  • Conscious competence
  • Unconscious competence

In the unconscious incompetence stage, people are not engaged in assessment and don’t know why it’s important. They are unable to articulate its value or purpose. This stage described most of our field 10-15 years ago. The prevailing attitude was that assessment was simply a fad and we just needed to “wait it out.”

Once awakened to both the necessity and the benefit of engaging in assessment, people realize they need to do it but are not sure how. This is conscious incompetence. The first step is to identify the actual skills and knowledge needed to perform assessment. The 2nd edition of the ACPA/NASPA Professional Competencies (released in August 2015) provides a framework for skill and knowledge development regarding assessment, evaluation, and research. A great way to develop these skills is to seek out books, workshops, and other resources. Fortunately, more and more resources are available. There are four new student affairs assessment books coming out this academic year. How exciting! One centers on leading assessment for student success and another focuses on coordinating divisional assessment. Two more covering more general assessment practice will come out in early March. Workshops and conferences are another way to build assessment competence. Each summer, ACPA holds its Student Affairs Assessment Institute and NASPA sponsors its Assessment and Persistence Conference. Both associations have special interest groups regarding assessment for networking and professional development (ACPA’s Commission for Assessment and Evaluation and NASPA’s Assessment, Evaluation, and Research Knowledge Community). In addition, many master’s level preparation programs have assessment courses to help students move from this stage of competence to the next.

As people gain skill and knowledge, they become consciously competent. They value the assessment process and are continually attentive when performing it. Assessment takes effort and concentration at this stage, but it continues to become more comfortable and more frequent.

At the apex of this competence hierarchy is unconscious competence. At this stage, assessment has been performed so often that it becomes habit, or second nature, and is integrated into daily practice and processes. It is important to note that learning should still take place. Even in assessment, lifelong learning is important as new scholarship is constantly being created.

As I mentioned at the beginning of this post, the statement Assessment isn’t an activity. It’s a state of mind came to me in a grocery store. In the deli aisle, I recognized that I was shopping in a way that mirrored the assessment process. My goal was to be both effective and efficient, and those goals shaped my actions from planning to execution. Assessment had become so unconscious that it was integrated into other parts of my life. While you may not be as much of a geek as I am and want assessment to infiltrate your personal life, I do hope you develop your assessment skill and knowledge to the point that you are unconsciously competent.

At what stage are you? What can you do to get to the next level?

References
Adams, L. (2011). Learning a new skill is easier said than done. [Blog]. Retrieved from http://www.gordontraining.com/free-workplace-articles/learning-a-new-skill-is-easier-said-than-done/
