
Linking Learning and Work

Excerpt: 2016 CAEL Forum and News

EXPERIENTIAL LEARNING AND ASSESSMENT FOR TODAY’S LEARNER: The Link between Theory and Practice

December 2016 | www.cael.org

Page 2: EXPERIENTIAL LEARNING AND ASSESSMENT FOR TODAY’S …degreeprofile.org/wp-content/uploads/2017/02/Kuh-2016_CFN_Asses… · universities to incorporate data about the quality of the
Page 3: EXPERIENTIAL LEARNING AND ASSESSMENT FOR TODAY’S …degreeprofile.org/wp-content/uploads/2017/02/Kuh-2016_CFN_Asses… · universities to incorporate data about the quality of the

CONTENTS

GETTING ASSESSMENT BACK ON THE RIGHT TRACK
By George D. Kuh, Director, National Institute for Learning Outcomes Assessment, Adjunct Research Professor, University of Illinois, Chancellor’s Professor of Higher Education Emeritus, Indiana University

GETTING ASSESSMENT BACK ON THE RIGHT TRACK… WITH ADULTS
Discussing George Kuh’s article on assessment with Pamela Tate, CAEL’s President and CEO, and Donna Younger, CAEL Senior Fellow


©The Council for Adult & Experiential Learning, 2016 www.cael.org

GETTING ASSESSMENT BACK ON THE RIGHT TRACK
By George D. Kuh, Director, National Institute for Learning Outcomes Assessment, Adjunct Research Professor, University of Illinois, Chancellor’s Professor of Higher Education Emeritus, Indiana University

Most educators find the phrase “student learning outcomes assessment” less than inspirational. Few outside the academy know what it means or why it matters. And yet the information assessment work produces—when done well—is essential in addressing some of the greatest challenges the country currently faces. Indeed, there are few processes in postsecondary education better suited to ensure that college graduates are prepared to contribute effectively to a robust national economy, to participate actively in civic affairs, and to live a self-sufficient, satisfying life.


With so much at stake, it is both surprising and disappointing that student learning outcomes assessment has had, by most accounts, limited influence in informing education policy and guiding institutional decision making. We can and must do better.

In this paper, I first briefly summarize the circumstances that inadvertently made assessment as a quality assurance mechanism less relevant to accomplishing its primary purposes—documenting student learning and improving student and institutional performance. Then, I discuss three promising trends that are refocusing efforts to enhance learning, trends that will both increase student engagement in educationally purposeful tasks and generate information to guide improvement efforts and satisfy the quality assurance interests of external authorities. These ideas are based in large part on the work of the National Institute for Learning Outcomes Assessment (NILOA) and are explicated in greater detail in Using Evidence of Student Learning to Improve Higher Education (Kuh, Ikenberry, Jankowski, Cain, Ewell, Hutchings, & Kinzie, 2015a).

How the Assessment Movement Lost Its Way: The CliffsNotes Version

When the definitive history of the assessment movement is written, it will almost certainly explain how and why student learning outcomes assessment took a wrong turn shortly after its importance was declared as part of a series of education reform reports in the mid-1980s. The first alarm was sounded in the 1983 landmark report from the National Commission on Excellence in Education, A Nation at Risk, which was aimed at mobilizing policy makers and educators to make dramatic improvements in the quality of educational institutions.

The following year, the postsecondary community specifically was called out by Involvement in Learning (National Institute of Education, 1984), a report that featured assessment of student learning as one of the three major areas demanding immediate attention. The other two themes were establishing higher expectations for student performance and involving students more actively in their education. As the Involvement in Learning authors emphasized, assessment was especially important because it was key to documenting progress in the other two priority areas and to pointing to where improvements were most needed:

…assessment can be used to increase student involvement and to clarify expectations if it is designed to measure improvements in performance, and if the information so gathered is fed back to students, faculty, and administrators as the basis for making changes in individual effort, program content, and instructional methods (p. 33).

Note that the main goal of assessment was to guide improvement:

We thus believe assessment to be an organic part of learning. The use of tests to sort and screen is legitimate for professional certification and licensing and, indeed, for any operation where selection is necessary. But assessment has even greater potential as a tool for clarifying expectations and for increasing student involvement when it is used to measure improvements in performance (p. 53).

Within a few years, though, the intended animating purpose of assessment put forth in Involvement in Learning—to improve learning—was supplanted by doing something under the name of assessment to respond to various compliance requirements of external agencies, such as regional accreditors, state agencies, and other entities with a legitimate accountability agenda (National Institute for Learning Outcomes Assessment [NILOA], 2016). Other forces also pulled assessment away from improvement, not the least of which was the absence of institutional infrastructures to systematically collect and figure out how to effectively use information about student learning.

The market was quick to respond, as for-profit vendors developed tools for measuring student and institutional performance. Soon after, the regional accreditors began to require colleges and universities to incorporate data about the quality of the student experience in institutional self-studies, a point of increased emphasis that escalated over time and continues today. As a result, the number of institutions doing assessment spiked, as did the number of assessment tools and approaches from which to choose. For example, Borden and Kernel (2013) reported that only 26 assessment tools were available in 2000, a number that mushroomed to 250 by 2009. Today, institutions must not only regularly collect data about collegiate quality, they must also demonstrate how they are using assessment results. So it is no surprise that accreditation continues to be the most important factor driving assessment work on college campuses (Kuh, Jankowski, Ikenberry, & Kinzie, 2014).

Despite its inauspicious start, the assessment movement has made progress. For example, the vast majority—at least 85%—of regionally accredited colleges and universities now publicly state the intended learning outcomes of their degree programs, and dozens of schools can point to how assessment data have stimulated changes in policies and programs.

Yet, most observers agree that far too little has been achieved in terms of demonstrable improvements in teaching and learning. Granted, there are plausible explanations for the limited progress, not the least of which is that documenting learning and using that evidence to improve student and institutional performance is a challenging, complicated process.


Another non-trivial mitigating factor is that the two important groups who have the most to gain from doing assessment right (faculty and students) are often under-involved in the approaches many institutions use to gather evidence about student performance. This is the opposite of what the Involvement in Learning (1984) authors recommended:

The best way to connect assessment to improvement of teaching and learning is to ensure that faculty have a proprietary interest in the assessment process. Unlike practice in some other nations, the American system does not rely on third-party testing. In our system, the person who teaches is the same person who certifies student learning. It makes sense to ask that person to be involved in every step of an assessment program. We believe that such involvement will help faculty to specify far more precisely than they do at present the outcomes they expect from individual courses and academic programs. And the more precisely they can specify the outcomes, the more likely they are to match teaching approaches to those ends. (p. 58)

Herein lies the rub. According to provosts, what their institutions need most to move the assessment agenda forward is to engage more faculty actively in collecting meaningful, actionable information about student accomplishment (NILOA, 2016). Equally important, provosts also say the most useful information for improving teaching and learning comes from classroom-based assessments. Indeed, as Hutchings (2010) put it, authentic faculty engagement is the “gold standard” for good assessment work (p. 6).

Unfortunately, too many faculty members have come to view assessment as an obligatory compliance activity to appease the interests and demands of external entities, such as accreditors. In addition, the procedures and instruments many colleges and universities use to represent student learning are not well suited to adequately capture the knowledge and skills faculty want their students to acquire. While many of these tools are psychometrically sound, they tend to reduce complex, creative processes to context-free numerical indicators that are difficult to understand, even by faculty members who are expert in the subject matter and proficiencies the indicators are intended to represent.

Among the more important goals of the postsecondary enterprise is to engage students in ways that foster a genuine love of inquiry and ensure that students acquire the analytical reasoning and other higher order proficiencies demanded by the 21st century. The assessment challenge is to gather relevant, actionable evidence of the extent to which students have, indeed, acquired those proficiencies in ways that preserve their complexity and present this information to faculty, staff, students, employers, and others in meaningful, understandable forms.

One of the reasons the impact of the student learning outcomes assessment movement has been limited is that too often the information produced by such efforts is based on national or local surveys about students’ experiences or on standardized test scores from small, random samples of students, with results aggregated and reported at the institutional level. Thus, the data do not represent student performance in specific classes or assignments crafted by individual faculty. As such, these kinds of findings too often do not answer questions or speak to issues that faculty and staff consider relevant for working with their students or yield information that they or others can use to be more effective. Students themselves are often puzzled when given tests seemingly unrelated to their program of study and about which their teachers exhibit little confidence or enthusiasm.

Assessment: Back to the Future

The good news is that many colleges and universities are digging out of this quagmire by endorsing and supporting both faculty members and students to enact what the Involvement in Learning Study Group urged three decades ago (1984). That is, put students at the center of assessment work by conceiving it as an engaging pedagogical strategy embedded in the regular work faculty do as they assemble evidence of learning from student performance on assignments intentionally designed to elicit the desired outcome(s). Such a strategy requires that students systematically reflect on what they are learning through dealing with problems and tasks that make them more thoughtful and self-aware and that help them connect their experiences over time and across contexts.

Among the various engaging pedagogies gaining traction are three promising approaches that put students at the center of assessment work and promise to get assessment back on the right track: assignment design, rubrics, and portfolios.

Assignment Design

As noted earlier, the most useful evidence of student learning comes from course-based assessments embedded in regular assignments. Indeed, our NILOA surveys show that faculty-designed assignments are the primary vehicle through which students demonstrate that they know and can do what the institution or program specifies; they are used far more frequently than standardized tests. Assignments and the opportunities for feedback they create are among a teacher’s most powerful pedagogical tools, signaling to students the content and skills that matter most to what they are expected to know and be able to do. Indeed, no other transaction between students and teachers has as much promise to help students attain essential 21st century learning proficiencies as an intentionally designed assignment aligned with one or more desired outcomes (Hutchings, Jankowski, & Schultz, 2016).

Insofar as those skills and content align both with course goals and program and institutional learning outcomes, assignments are also the primary mechanism for building the foundation for higher-level outcomes, making it possible to scaffold the learning experience across the curriculum and creating more coherent pathways to student success. Moreover, student responses to well-constructed assignments provide direct evidence of learning. Put simply, if assignments do not elicit demonstrations of intended learning outcomes, efforts to improve student learning will be for naught.

To encourage faculty to get practice in crafting assignments directly tied to one or more specified outcomes, NILOA has hosted seven (so far) assignment design workshops. The assignments developed or refined during these working sessions are linked to one or more Degree Qualifications Profile (DQP) proficiencies, peer reviewed, and posted to an online library (http://assignmentlibrary.org/), which now contains more than 40 papers, projects, demonstrations, reports, and other tasks from a variety of different subject areas. The Association of American Colleges & Universities (AAC&U), drawing on its Essential Learning Outcomes, is sponsoring parallel work on “signature assignments” through its partnership with the State Higher Education Executive Officers coordinating the Multi-State Collaborative (MSC). Faculty participating in such efforts design assignments for individual courses and also work to sequence assignments and incorporate high-impact practices across a program of study and across transfer pathways (Kuh, 2008). Individual campuses are now beginning to implement their own assignment design workshops.

Rubrics

The assignment design experience is a reminder that most instructional staff—whether teaching online, on campus, or in some blended form—routinely perform certain kinds of instructional tasks that are also common to a good assessment program. That is, faculty members establish and articulate learning objectives on the course syllabus or outline, select relevant subject matter resources with which students should be familiar, assign tasks that require students to demonstrate what they have learned, and evaluate student performance and progress by giving feedback on assignments through grades or other mechanisms.


So it would seem that most instructors, to varying degrees, use formal and informal practices in their classrooms, laboratories, and studios that generate some of the best evidence of authentic student learning. Some observers argue these kinds of faculty judgments about student performance are subjective and cannot adequately represent and assure quality learning across a program of study or at the institutional level. Indeed, this was one of the concerns that fueled the resurgence of so-called standardized student learning outcomes measures in the 1980s and 1990s. But recent developments have renewed interest in credible, trustworthy approaches to documenting authentic student achievement.


The best example is the 16 AAC&U VALUE (Valid Assessment of Learning in Undergraduate Education) rubrics, developed by teams of faculty from across the country to help gauge performance on a wide range of outcomes (from writing to integrative learning, quantitative literacy to teamwork). They have been used in individual courses and institutions as well as in proof-of-concept experiments across dozens of colleges and universities in multiple states. According to provosts, the use of rubrics has risen significantly, from 23% in 2009 to almost 70% in 2013, across different types of two- and four-year institutions (Kuh, Jankowski, Ikenberry, & Kinzie, 2014).

For faculty, rubrics and scoring guides are vehicles for more clearly explicating and raising awareness of the different levels of performance students may move through from beginning to more advanced proficiency. When explained to students, rubrics can also clarify expectations, provided the language used avoids educational jargon so that they are understandable to students.

The VALUE rubrics as well as other locally developed authentic assessment tools and scoring guides are most powerful when they are adapted to the respective campus educational purposes and related circumstances as well as closely aligned with essential 21st century proficiencies, such as those articulated on the Degree Qualifications Profile (Lumina Foundation, 2014).

So, rubrics and scoring guides are examples of how to standardize the manner in which authentic student learning is measured without standardizing the what, where, or how of that learning.

ePortfolios

More than three-fifths of all undergraduate students today attend two or more institutions on their way to the baccalaureate degree—often starting, stopping out for a time, and starting again, sometimes at a different place. These swirling patterns of enrollment underscore the need for giving students practice in connecting and making meaning of their various learning experiences—across courses and disciplines, between what they do in the classroom and in their lives outside, and over time. Bridging and integrating these learning experiences are precisely what ePortfolios are designed to do.


Student ePortfolios are web-based, curated, student-generated collections of learning artifacts (papers, multimedia projects, speeches, images, etc.) and related reflections that demonstrate learning and growth inside and outside the classroom. As with classroom-based assessments, the use of student portfolios has grown, with about three-fifths (57%) of campuses now using this approach for some purpose (Kuh, Jankowski, Ikenberry, & Kinzie, 2014; Eynon & Gambino, in press).


When well designed and implemented, ePortfolios are much more than a record-keeping platform. With the assistance of prods and comments from faculty, staff, and peers, the ePortfolio helps make learning visible to students themselves, to their peers and faculty, and to external audiences. As a result, the process induces students to develop deeper, more meaningful understandings of how and why their learning has changed how they think and how they can transfer what they have learned to different settings and circumstances.

The evidence contained in portfolios has several attractive advantages. As with classroom-based assessments, portfolios represent authentic student accomplishment that draws on and integrates the work students do in regular course activities and assignments. As such, they connect assessment to the ongoing work of teaching and learning and reflect directly and indirectly the work of faculty. In addition, because they contain artifacts of student performance over time (not just a snapshot), they can add coherence to and capture more complex and nuanced evidence of learning (Banta, Griffin, Flateby, & Kahn, 2009).

Recent evidence from a 24-campus national project shows that ePortfolios are positively associated with student persistence and provide students with structured opportunities to step back from their work and reflect on and talk with peers and mentors about how their thinking, skills, and dispositions have evolved over time (Clark & Eynon, 2011/2012; Eynon & Gambino, in press). Additionally, ePortfolios are portable, in that students can incorporate new accomplishments during and after college, whenever and wherever the significant learning occurs.

Employers, too, find it easier to grasp and determine whether a student knows and can perform the range of tasks common to those they will encounter in the workplace when portfolios include detailed reports about a student’s actual performance in an internship or co-op, or when they can read how a student conceptualized a project and carried it through to the end. Indeed, virtually everyone prefers these forms of evidence of authentic student learning to a single test score or even a combination of test scores.

It is hard to imagine a more effective way to demonstrate to students what they have gained from college and for students to show evidence to others of what they know and can do.

Final Thoughts

Documenting student learning and using this information to improve student and institutional performance are essential to ensure that students are prepared to meet the challenges of the 21st century. Too often, though, students—the group that could benefit the most from an effective assessment program—are left out of the loop. At the same time, too few faculty members are actively involved in what have become commonplace approaches to assessment, resulting in a substantially constricted understanding about the quality of teaching and learning campus wide.

As assessment authorities emphasize, no single tool or approach is adequate to capture the range and depth of what students need to know and be able to do to survive and thrive in a dynamic global economy. Individually, the three approaches described in this paper hold great promise for putting the responsibility and authority for assessment back in the hands of faculty where it rightly belongs. But their greatest value lies in faculty and staff using them in complementary ways along with other approaches to document the outcomes of engaging pedagogies.

Relatively few faculty members have experience designing clear, explicit course and program outcomes or assignments that directly elicit desired outcomes. Not surprisingly, campuses that have made the most progress have invested in serious, sustained professional development and have hosted venues where faculty and others come together to formulate and explore questions about their students’ learning. Indeed, in many instances it may be better to spend less time talking about “assessment” per se and to focus instead on the activities that achieve its purposes: interactions between students and those who work directly with them—faculty, student affairs staff, librarians, and others in educative roles—undertaken in a spirit of inquiry, evidence-gathering, reflection, deliberation, planning, and action. This kind of inquiry can move outcomes assessment from what many have come to consider a compliance-laden “add-on” to the real work of the academy to what assessment was originally intended to be—a process that is integral to ensuring effective teaching and learning.


REFERENCES

Association of American Colleges and Universities (AAC&U). (2008). Our students’ best work: A framework for accountability worthy of our mission (2nd ed.). Washington, DC: Author.

Banta, T. W., Griffin, M., Flateby, T. L., & Kahn, S. (2009, December). Three promising alternatives for assessing college students’ knowledge and skills (NILOA Occasional Paper No. 2). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Borden, V. M. H., & Kernel, B. (2013). Measuring quality in higher education: An inventory of instruments, tools, and resources. Retrieved from http://apps.airweb.org/surveys/Default.aspx

Clark, J. E., & Eynon, B. (2011/2012). Measuring student progress with e-portfolios. Peer Review, 13(4)/14(1), 6–8.

Ewell, P. (2013, January). The Lumina Degree Qualifications Profile (DQP): Implications for assessment (Occasional Paper No. 16). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Eynon, B., & Gambino, L. (Eds.). (in press). High impact ePortfolio practice: A catalyst for student, faculty, and institutional learning. Sterling, VA: Stylus.

Hutchings, P. (2010, April). Opening doors to faculty involvement in assessment (Occasional Paper No. 4). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Hutchings, P. (2016, January). Aligning educational outcomes and practices. (Occasional Paper No. 26). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Hutchings, P., Jankowski, N. A., & Ewell, P. T. (2014). Catalyzing assignment design activity on your campus: Lessons from NILOA’s assignment library initiative. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Hutchings, P., Jankowski, N. A., & Schultz, K. E. (2016, January-February). Designing effective classroom assignments: Intellectual work worth sharing. Change, 48(1), 6–15.

Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Washington, DC: Association of American Colleges and Universities.

Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning outcome assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Kuh, G. D., Ikenberry, S. O., Jankowski, N., Cain, T. R., Ewell, P. T., Hutchings, P., & Kinzie, J. (2015a). Using evidence of student learning to improve higher education. San Francisco: Jossey-Bass.

Kuh, G. D., Ikenberry, S. O., Jankowski, N., Cain, T. R., Ewell, P. T., Hutchings, P., & Kinzie, J. (2015b). Making assessment matter: Moving beyond compliance. Change, 47(5), 6–14.

Kuh, G. D., Jankowski, N., Ikenberry, S. O., & Kinzie, J. (2014). Knowing what students know and can do: The current state of student learning outcomes assessment in U.S. colleges and universities. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Lingenfelter, P. L. (2015). “Proof,” policy, and practice: Understanding the role of evidence in improving education. Sterling, VA: Stylus.

Lumina Foundation. (2014). The degree qualifications profile 2.0. Indianapolis, IN: Author.

National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Washington, DC: Author.

National Institute of Education. (1984, October). Involvement in learning: Realizing the potential of American higher education (Final report of the Study Group on the Conditions of Excellence in American Higher Education). Washington, DC: Author.

National Institute for Learning Outcomes Assessment. (2016, May). Higher education quality: Why documenting learning matters. Urbana, IL: University of Illinois and Indiana University, Author.

Rhodes, T., & Finley, A. (2013). Using the VALUE rubrics for improvement of learning and authentic assessment. Washington, DC: Association of American Colleges and Universities.


GETTING ASSESSMENT BACK ON THE RIGHT TRACK… WITH ADULTS
Discussing George Kuh’s article on assessment with Pamela Tate, CAEL’s President and CEO, and Donna Younger, CAEL Senior Fellow

George Kuh’s article, “Getting Assessment Back on the Right Track,” discusses the history of assessment in higher education and the growing recognition that assessment can and should be “an engaging pedagogical strategy embedded in the regular work faculty do.” CAEL’s Becky Klein-Collins recently sat down with CAEL President and CEO Pamela Tate and CAEL Senior Fellow Donna Younger to discuss the article and its relevance for the use of assessment in the adult learner context.

Assessment can help the person value what they already know in a new way. Ultimately, this process helps with overall student success because self-awareness, reflection, and motivation all meld for the adult learner.

Assessment and Learning Process

Question: Kuh says early on that many decades ago, the main goal of assessment was “to guide improvement” and that assessment was seen as an organic part of learning. That focus changed dramatically in the intervening years, and now we are seeing a return to those concepts. Why do you think the role of assessment is important for adults who are pursuing postsecondary credentials? Why is connecting assessment to the learning process important for the student who is coming to higher education with many years of work and life experience?

Donna Younger: Adults often come back to college either really rusty or they’ve never experienced college and don’t know what to expect. It can feel like a foreign country to them. One of the things that assessment does—in all of its manifestations, including admissions, placement, prior learning assessment, and in the context of a course—is communicate to the student what higher education values and pays attention to. So, assessment is not just about giving information to the institution or the faculty member about the student. It’s also about giving information to the student. They learn what higher education values, and they also learn about their own capabilities. If they’re trying to figure out if they fit and what their prospects are for success or where they are likely to shine and where they might not, assessment helps them plant their feet. It tells them what they’ll need to do to ensure their success or to capitalize on their strengths.

Pamela Tate: Right, and that gets to Kuh’s point about how assessment helps the student with self-awareness in terms of connecting their experiences over time and across contexts. This idea makes me think about the role of reflection in learning. When assessment is done well, it should propel the adult learner to reflect on what they already know and then on what they still don’t know or would like to learn. When assessment is done well, it’s connected to the learning process and isn’t just some after-the-fact, check-the-box thing. Kuh’s comment also reminds me of the importance of reflection to the student’s motivation, especially in the case of prior learning assessment. It is a motivating prospect for the student to realize that what they know is college-level learning, not just something they happened to learn on the job that has no real value. In this case, assessment can help the person value what they already know in a new way. Ultimately, this process helps with overall student success because self-awareness, reflection, and motivation all meld for the adult learner.

DY: Yes, the connection between reflection and learning in the assessment process is really profound, Pam. It’s probably obvious to those who live in the world of adult learning, but not everyone knows this. In a lot of institutions, higher ed is still very instructor driven and instructor focused. So, I think it’s important to make the point about the power of student reflection in the assessment process and to give students ownership of it. The institution not only needs to create the opportunity for reflection through assessment, it must also be proactive in urging students to reflect by giving them either the space to do it or the questions that guide it. Some students are self-starters when it comes to being reflective and to making meaning out of things, but some aren’t. Institutions really need to be players in making this kind of deep learning happen.


PLA and CBE

Question: Why do you think assessment is such an important part of how CBE and PLA change the way higher education is experienced by the adult learner?

PT: CBE and PLA are both learning-centered and student-centered processes. They’re not faculty centered. In CBE and PLA, there is much more emphasis on guiding the students and the students’ learning than there typically seems to be in higher education, at least to me.

DY: I spend a lot of time with faculty, and the thing that CBE and PLA both do is demonstrate that higher education doesn’t exist in a vacuum, that learning is meaningful when it resonates with contexts outside of higher ed, and that success doesn’t need to be measured only against standards and measures generated by the institution but also against standards and measures that come from beyond its walls. I’m reminded so often of how powerful Change magazine’s “From Teaching to Learning” article was (Barr & Tagg, 1995). It’s over 20 years old now, but there’s still a great deal in it to pay attention to. It explained how so much of higher ed is predicated on creating a structure for teaching that puts the reins in the hands of the faculty and the institution. In contrast, CBE and PLA expand that responsibility to the student. Both PLA and CBE are more student driven, and that’s really profound.

Take, for example, portfolio assessment. In that context, the voice of the student that comes through a portfolio is more important than the framing of test questions by a faculty member. The student makes choices about what they highlight and what they don’t. When they take a test or they perform some other form of assessment, they’re responding to what a faculty member or someone else in the institution has framed from their perspective. I think that CBE and PLA, while they’re often presented as creating “efficiencies” in higher ed, also have such amazing potential in reframing the role of both faculty and students in education and in the learning enterprise.

Rubrics

Question: Kuh talks about the importance of using rubrics and how they “are vehicles for more clearly explicating and raising awareness of the different levels of performance students may move through from beginning to more advanced proficiency.” How do you see rubrics, or rubric-like tools, playing a role in new approaches to credentials for adults? Why is it that they have come to be so important for PLA and CBE?

PT: In our experience with LearningCounts, we have seen rubrics as a critical way to guarantee a rigorous standardized approach to assessment. They provide the structure within which an assessor is to look at a student’s learning. I see rubrics as a structuring device that helps an assessor understand what level of proficiency the student has achieved. Different faculty can then use the rubric to evaluate students in a consistent way.


DY: Rubrics are definitely a quality assurance tool for assessment. They have amazing capacity to respond to some of the biggest hurdles we face in advancing PLA. A lot of comments about the reasons not to do PLA come from a concern about inconsistent practice and weakening the validity of the credit that’s awarded through it. So, the use of rubrics—and the process of constructing them—does a lot to ensure that there’s equity in the way PLA is done. But even in course settings, the rubrics faculty create are there to make sure they hold themselves accountable for consistent and equitable assessment and that students know what the expectations are with enough detail to become active stakeholders in their performance. With rubrics, faculty are committing to a discipline in which they are not simply reacting to what a student presents but also thinking in advance about what that credit should represent.


ePortfolios

Question: Kuh talks about ePortfolios and how they help to bridge and integrate learning experiences, especially for students who may have swirling enrollment patterns. He says, “When well designed and implemented… the ePortfolio helps make learning visible to students themselves, to their peers and faculty, and to external audiences.” Many colleges have used a portfolio approach for PLA for decades, and portfolio assessment research is what originally launched CAEL in 1974. What similarities did you see between the ePortfolio that Kuh describes and the experience that colleges have had with the prior learning portfolio?

DY: Many years ago, the portfolios that were used for PLA were framed in a very similar way. They were intended to be comprehensive and to provide a really multi-dimensional look at what a student’s experience had been and what they took away from that experience. For the most part, schools now define the portfolio as a course equivalency submission. So you get a student’s narrative and related artifacts that provide their case for why they should earn credit for a particular course or element of the curriculum. That is more efficient for sure, but you don’t get a comprehensive look at what the student has done. There are some institutions that are starting to think about using the technology of ePortfolios but not in ways that will reach the potential for what Kuh is talking about. They’re not seeing the developmental perspective of it as much as they are the efficiency in electronic transmission and archiving. So, it feels to me like PLA is in danger of being behind the learning curve of ePortfolios, especially if institutions still tend to think about portfolios as a transaction rather than a learning tool.

I would hope that the increased use of ePortfolios in the context of instruction that Kuh is talking about changes the practice of PLA to use portfolio in the same way, or at least that they’re cross-purpose portfolios. One of the things we say in our assessor training is that PLA is not just a set of transactions, not just the exchange of paper to make sure that students earn the credit they should earn in order to advance more quickly to completion. That’s a part of what happens, but PLA is an educational process because it can be developmental: it can develop skill, it can develop awareness, it can develop motivation—if it’s done correctly. And the comprehensive nature of ePortfolios really helps with that development.

PT: It really depends on what is required as part of the portfolio. For example, if the college requires that a student’s portfolio include a learning narrative that must incorporate reflection on their learning and integration of their learning experiences, then the learning becomes more visible to the student—they become more self-aware of their own learning. Unfortunately, I don’t think the PLA portfolio is always used in that way.

DY: It’s the student’s ownership of their learning that is critical here. When the student gets feedback on their portfolio that is nuanced and that focuses on making meaning for the sake of their lives beyond the classroom, then it really should be theirs. The institution doesn’t own the learning; the student does.

PT: When I talk to younger students who are working on their ePortfolios in their undergraduate career, they often say, “I’m building up almost like a resume, a record of what I have done and what I know so that I can use it when I need to and that includes if I go look for a job.” So Kuh is right that the ePortfolio has potential if done well. But in many cases, it’s definitely seen as a tool for evidence gathering and documentation, rather than a developmental learning tool. It’s similar for the PLA portfolio—when done well, the learning narrative can be an important developmental tool.

Learning Outcomes

Question: In Kuh’s conclusion, he says, “Relatively few faculty members have experience designing clear, explicit course and program outcomes or assignments that directly elicit desired outcomes.” He adds that it may be better for institutions to focus more on the activities that achieve the purpose of assessment. How has CAEL seen this play out with CBE and PLA? Can institutions change the way they think about learning outcomes? And if they do, what will that mean for assessment and for outcome-based strategies for degree completion?

PT: Kuh says in his article that 85% of institutions are working on being explicit about the outcomes of their degree programs. In our experience, though, the focus on outcomes hasn’t quite made it down to the course level. With PLA—and also for CBE—it’s really hard for a student to document learning outcomes if the ones that are written for the course are so vague or brief that the student doesn’t have a complete understanding of the scope of the course. Our LearningCounts experience has found that a good percentage of learning outcomes for existing courses can be problematic in that regard. There is a need to teach faculty how to be more explicit about learning outcomes so that students and assessors will know them when they see them. Unclear learning outcomes create problems for students and can leave the judgment of the faculty assessor without any mooring.

DY: I agree. It would be really helpful if faculty were clear about what is meant by the course learning outcomes—and then be willing to hold themselves accountable to those outcomes. I think the importance of learning outcomes for the adult learner may not be fully appreciated by faculty if they spend most of their time in a traditional classroom setting. One of the things I have realized from my experience in PLA training is that course outcomes are seen as set in stone for the purpose of PLA assessment but may be seen as a bit more fluid from the perspective of the faculty instructor. When I myself write a course and when I write course learning outcomes, if I don’t get them quite right at the beginning of the course, I’ve got 10 weeks with students in the classroom to correct it. I can add meaning to the learning outcomes from what I teach, and I can adjust the activities that we engage in together and the assignments I give. I can enrich my learning outcomes when I’m teaching. But when you’re using course learning outcomes as the basis for assessment, you’ve got to get them right because all you’ve got is what you put on the page. So those words have to really reflect what you mean and what you want to hold yourself accountable to as you hold the students accountable for learning.

PT: Assessment is about guiding students’ thinking and guiding improvement. And when it comes to both PLA and CBE, the assessment must be linked to clear learning outcomes that are stated at the start. So attention to learning outcomes, and making sure that they are clear, comprehensive, and measurable, is really important.

REFERENCES

Barr, R. B., & Tagg, J. (1995, November/December). From teaching to learning: A new paradigm for undergraduate education. Change, 27(6), 13–25.


Linking Learning and Work | www.cael.org

We advocate and innovate on behalf of adult learners to increase access to education and economic security. We provide adults with career guidance and help them earn college credit for what they already know. We equip colleges and universities to attract, retain, and graduate more adult students. We provide employers with smart strategies for employee development. We build workforce organizations’ capacity to connect worker skills to employer demands.

©2016 CAEL | 55 E Monroe, Suite 2710, Chicago, IL 60603 | Ph: 312-499-2600 | Fax: 312-499-2601 | www.cael.org