Paper ID #9806
Practical, authentic and sustainable development and assessment of critical thinking in engineering through model eliciting activities
Dr. James A. Kaupp, Queen’s University
Jake Kaupp, Ph.D. is an Engineering Education Researcher at Queen’s University, Kingston, Ontario, Canada in the Faculty of Engineering and Applied Science. His primary research interests include: course and program assessment, critical thinking & problem solving development, performance-based assessment, model eliciting activities and data analytics in higher education.
Prof. Brian M Frank, Queen’s University
Brian Frank is an associate professor in Electrical and Computer Engineering, where he has taught courses in electronics and wireless systems. He is the DuPont Canada Chair in Engineering Education Research and Development, and the Director of Program Development in the Faculty of Engineering and Applied Science, where he works on engineering curriculum development, program assessment, and developing educational technology.
© American Society for Engineering Education, 2014
Practical, authentic and sustainable development and assessment of critical thinking in engineering through model eliciting activities
1. Introduction

Higher order skills such as problem solving or critical thinking are key attributes for graduates of any engineering program, are amongst the skills most desired by industry in new employees, and are considered a hallmark of a university education1-5. The application of critical thinking helps students solve ill-defined, open-ended, complex problems through the analysis and evaluation of information, the evaluation of arguments, and the development of conclusions resulting from sound reasoning. These complex problems are typical of those encountered in professional engineering practice, and require the reflective, self-regulatory judgment exemplified by critical thinking. While most programs claim to develop critical thinking in some manner, deliberate development and direct assessment of critical thinking using a conceptual framework is less common and quite challenging3,6. This is due to a multitude of factors: the lack of consensus on a definition of critical thinking, debate on whether critical thinking skills are generic or domain specific, the large number of available frameworks describing the elements, skills, traits and attributes of critical thinking, and the difficulty of assessing a complex cognitive and metacognitive skill. Within engineering, critical thinking instruction is typically provided through the adoption of a previously established framework. While this method can be successful, there are concerns regarding alignment between the application of critical thinking skills according to the model and the application of critical thinking skills to solve complex engineering problems. A similar misalignment exists in the assessment of critical thinking skills (CTS), and is of considerable concern. CTS are typically assessed through the use of standardized instruments.
These tests are developed according to a guiding framework, which may not reflect the application of critical thinking skills in a manner consistent with how they are applied within the engineering discipline. These tests are typically divorced from course activities, provide little formative information and feedback, and are viewed by students as extraneous and disruptive, bringing into question their practicality and sustainable use in engineering undergraduate programs. Recently, there have been significant efforts to develop valid means of assessing key competencies, such as critical thinking, at the program or institutional level7. While this approach shows much promise, these are only assessment materials and do not provide an all-in-one approach for developing and assessing critical thinking in engineering using discipline-specific performance tasks. The primary objective of this paper is to provide a framework for the practical, authentic and sustainable simultaneous development and assessment of critical thinking skills in engineering using model-eliciting activities (MEAs). Model-eliciting activities are performance-based, realistic classroom problems that require learners to document their solutions using mathematical models, and to document the processes used to solve them. Studies have shown MEAs to be valuable in helping students develop conceptual understanding, knowledge transfer, and problem-solving skills8-11. The common principles on which MEAs are based are interwoven with aspects of critical thinking: the assessment and valuation of information; formulating justified assumptions and arguments; generating a valid, defensible model; presenting conclusions and recommendations resulting from analysis; and meta-cognitive reflective self-assessment to test and revise thinking. These elements can be carefully structured into a discipline-specific framework for the development of critical thinking in engineering, as well as organized into a rubric for the assessment of critical thinking skills. Establishing an instructional framework and developing such a rubric will provide instructors with a practical, authentic, rigorous and sustainable means to simultaneously develop and assess CTS that is better aligned with educational objectives and course experiences than standardized instruments. In the following sections, popular models of critical thinking used in engineering education and the corresponding assessments for each model will be presented, along with the critical thinking framework constructed from the common principles of MEAs and the MEA as an assessment instrument. Discussion will pertain to the suitability and alignment of the frameworks and the practicality, accuracy and sustainability of the corresponding assessments, along with the advantages and disadvantages of using MEAs as a method for the consequent development and assessment of students’ critical thinking skills.

2. Applying Critical Thinking in Engineering

In order to provide a common point for evaluation, comparison and discussion, we present our view of the application of critical thinking in the context of solving complex engineering problems. It is the reflexive, self-regulated, and reflective application of a structured manner of thinking to:
• Identify and accurately describe a problem or issue
• Determine the key issues, be they technical, environmental or social
• Research, analyze and evaluate information pertaining to the problem, assessing credibility, relevance, uncertainty and bias
• Evaluate supporting, conflicting and alternate arguments
• Develop solutions, conclusions or recommendations supported by data and analysis
• Consider the technical, environmental and social implications of their conclusions and recommendations
For the purpose of solving ill-defined, open-ended, complex problems.

3. Critical Thinking Frameworks & Assessments

There are numerous critical thinking frameworks and assessments available for use. For the sake of brevity and the purposes of this paper, three frameworks and companion instruments (Cornell-Illinois Model, CLA Model and Paul-Elder Model) were selected
due to their use within engineering education12-14,11,15,16, the accepted validity and reliability of the companion assessments17,18, and the authors’ previous work. The selected critical thinking frameworks each describe a different viewpoint on the complex construct of critical thinking. Each model is based on a working definition of critical thinking and provides a framework for the component skills, attributes, standards and dispositions according to that definition. Many of these frameworks do not contain an explicit pedagogical strategy or developmental sequence for students; they simply provide a succinct definition of the construct and its components. However, a definition and framework form the basis of, and are essential to, the infusion of critical thinking into course curriculum. Each framework presented has a companion assessment constructed according to its respective definition and framework of critical thinking. This leads to a wide variety in the format and tasks presented in the assessments, each with their own strengths and weaknesses. In the following sections, each framework will be presented and its companion assessment reviewed, discussing general critiques of the assessment, the alignment between the assessment task and complex engineering problems, how suitable the assessment is for use in engineering, and any additional concerns regarding the sustainable use of the assessment.

3a. Cornell-Illinois Model & The Cornell Critical Thinking Test: Level Z

The Cornell-Illinois model of critical thinking was developed and refined by Robert Ennis based on the following working definition of critical thinking:
Critical thinking is reasonable and reflective thinking focused on deciding what to believe or do19
The model, illustrated in Figure 1, is divided and sub-classified based on three modes of critical thought (induction, deduction and value judging) and four methods on which they are based: the results of inferences, observations, statements and assumptions. Lastly, the model is connected by a common thread of attention to meaning which is interwoven throughout the four methods and three modes19.
Figure 1. The Cornell-Illinois Model
The companion assessment, the Cornell Critical Thinking Test Level Z (CCTT), is a 52-item multiple-choice test. The CCTT measures aspects of critical thinking consistent with the Cornell-Illinois model, organized into five categories19:
1) Induction
2) Deduction
3) Observation & Credibility
4) Assumptions
5) Meaning & Fallacies
3b. Critique of the Cornell-Illinois Model & The Cornell Critical Thinking Test: Level Z

The Cornell-Illinois framework presents a vague picture of critical thinking as a set of cognitive skills that are applied to form a course of action. However, the type of thinking required in solving complex engineering problems is not linear in nature, requiring continual assessment, reflection and monitoring. These concerns have been addressed by Ennis in subsequent work20, but raise important questions about the alignment and suitability of the Cornell-Illinois model for use within engineering. There are some potential issues with using a multiple-choice assessment of CTS, arising from the fact that the test does not assess dispositional aspects of critical thinking, or how individuals choose to engage in critical thinking. Multiple-choice CT tests have been criticized as assessments of verbal and quantitative knowledge rather than critical thinking, since the format prevents test-takers from applying CTS to develop their own solution to the problem21,22. Additionally, multiple-choice tests can only narrowly assess a single concept of thought in a question22,23. This is opposed to the real-world application of critical thinking to solve complex engineering problems, in which an individual employs a wide variety of concepts and skills to provide a comprehensive solution to a complex, interconnected problem. There is also a significant misalignment between the tasks presented in the CCTT and the tasks involved in solving a complex engineering problem: engineering problems will seldom be as simple as selecting the appropriate response out of a list of possibilities. While such a selection may occur at some point in solving engineering problems, it is the result of careful and well-reasoned analysis.

3c. Paul-Elder Model & The International Critical Thinking Test

The Paul-Elder model was developed originally by Richard Paul and further refined by Paul and Elder24. It is based on the following working definition of critical thinking:
that mode of thinking — about any subject, content, or problem — in which the thinker improves the quality of his or her thinking by skillfully analyzing, assessing, and reconstructing it. Critical thinking is self-directed, self-disciplined, self-monitored, and self-corrective thinking. It presupposes assent to rigorous standards of excellence and mindful command of their use. It entails effective communication and problem-solving abilities, as well as a commitment to overcome our native egocentrism and sociocentrism.24
The Paul-Elder model divides critical thinking into three key components: elements of reasoning, intellectual standards and intellectual traits. The elements of reasoning are universal elements that inform and describe all reasoning or thought. The intellectual standards are standards applied to elements of reasoning or thought to interpret or assess quality. Lastly, the intellectual traits are desired traits or characteristics of a skilled practitioner of critical thinking. These three components are interrelated and each contributes to the development of a critical thinker. In the Paul-Elder model, critical thinkers apply the intellectual standards to the elements of reasoning in order to develop intellectual traits (Figure 2). There are two essential dimensions of thinking that students need to master in order to learn how to upgrade their thinking. They need to be able to identify the component parts of their thinking, and they need to be able to assess their use of these parts of thinking. These two essential dimensions, in concert with the intellectual standards, elements of thought and intellectual traits, can be organized into a rubric for the evaluation of critical thinking.
Figure 2. The Paul-Elder Model
The companion assessment, the International Critical Thinking Test (ICTT) is an essay-style test designed to provide an assessment of the fundamentals of critical thinking. The ICTT has two areas of focus. The first is to provide a reasonable way to measure CTS, while the second is to provide a test instrument that stimulates the faculty to teach their discipline in a manner that fosters critical thinking in the students25. The ICTT is divided into two separate forms: an analysis of a writing prompt and an assessment of the writing prompt. In the analysis segment (Form A) of the test, the student must accurately identify the elements of reasoning within a prompt. In the assessment segment of the test (Form B), the student must critically analyze and evaluate the reasoning used in the original prompt. Student responses are graded according to a rubric based on the elements of reasoning that comprise Paul’s model of critical thinking24:
1) Purpose
2) Questions
3) Information
4) Conclusions
5) Concepts
6) Assumptions
7) Implication
8) Point of view
The ICTT was authored to have high consequential validity, such that the consequences of using the test would be significant and highly visible to instructors26. This encourages discipline-specific adoption of critical thinking and the redevelopment of curricula that “teach to the test.”

3d. Critique of the Paul-Elder Model & The International Critical Thinking Test

The Paul-Elder framework presents a discipline-neutral view of critical thinking, and provides a comprehensive cognitive and meta-cognitive view of critical thinking through the standards, elements and traits24. This model is a well-aligned, suitable framework for use in engineering: it has been adapted specifically for engineering27, has been used to form a rubric for the evaluation of critical thinking in engineering12,13,28, and has been used as a framework within MEA instruction11. There are a few potential challenges that may be encountered with this style of test. First, the prompts task students with the recall-based identification and evaluation of the elements of thought. While these skills are of vital importance within critical thinking, the specific prompts cannot evaluate how students apply CTS in a real-world setting23. This highlights a misalignment between the task presented in the ICTT and what is expected in solving complex engineering problems. While the application of CTS to solve complex engineering problems requires correctly identifying the specific elements involved in critical thinking, the task on the ICTT does not require students to apply these skills in concert to generate a solution or develop a conclusion. Ultimately, the specificity of the questions may limit the breadth of response in test-takers, leading to a reduced inclination to engage in critical thinking29. Lastly, as with any essay-style or rubric-evaluated test, inter-rater reliability (IRR) is a potential issue that should be considered when administering the test on a large scale30.

3e. CLA Model & The Collegiate Learning Assessment

The description of the CLA model and the Collegiate Learning Assessment is based on the original CLA and subsequent versions used since 2000, and not the newest version, the CLA+. The CLA+ offers improvements over the original CLA, addressing many critiques and concerns of the instrument. However, there appears to be little difference between the two versions regarding the manner in which they measure critical thinking. The CLA model was developed for the holistic evaluation of critical thinking through problem solving. The CLA model holds that critical thinking assessment is best approached holistically, arguing that critical thinking cannot be broken down into component parts and measured. Instead, the CLA views the larger construct of critical thinking as being closely connected to and represented by several criteria or skills that students utilize in their responses on the test, as shown in Figure 3.
Figure 3. The CLA Model
The CLA model relies on a criterion sampling approach that is relatively straightforward and seeks to determine the abilities of a student by sampling tasks from the domain in which the student is to be measured, observing their response and inferring performance and learning on the larger construct. Shavelson (2008) explains criterion sampling by using the example of driving a car:
For example, if you want to know whether a person not only knows the laws that govern driving a car but also if she can actually drive a car, don’t just give her a multiple-choice test. Rather, also administer a driving test with a sample of tasks from the general driving domain such as starting the car, pulling into traffic, turning right and left in traffic, backing up, and parking. Based on this sample of performance, it is possible to draw valid inferences about her driving performance more generally.
The CLA follows the criterion sampling approach by presenting students with holistic, real-world problems. Through these problems, it samples tasks and collects students’ responses, which are then graded according to rubrics formed from a set of generic skills. In order to generate a successful response to the task, students must apply problem solving successfully, reason analytically, and write convincingly and effectively. Since these are all underlying components of critical thinking as defined by the CLA model, critical thinking ability can thus be inferred from student responses to test questions. The Collegiate Learning Assessment (CLA) was developed and is administered by the Council for Aid to Education (CAE). The CLA is constructed using the CLA model of critical thinking and problem solving as a foundation. Student responses are graded using a series of rubrics, and are scored by an automated system on the following scales31:
1) Analytic reasoning
2) Problem solving
3) Writing mechanics
4) Writing effectiveness
3f. Critique of the CLA Model & The Collegiate Learning Assessment

The CLA model is not an explicit framework, unlike the Paul-Elder or Cornell-Illinois models, which reduce critical thinking into constituent parts. Rather, the CLA views critical thinking in the broadest sense, as summarized by1:
The ability to think critically—ask pertinent questions, recognize and define problems, identify arguments on all sides of an issue, search for and use relevant data and arrive in the end at carefully reasoned judgments—is the indispensable means of making effective use of information and knowledge.
This is consistent with the definition of critical thinking as applied to solve complex engineering problems, but lacks a defined structure to be used as an instructional strategy for critical thinking development. The CLA consists of two distinct tasks, of which students generally complete one: a “performance task” and an “analytic writing task” containing two subtasks, “make an argument” and “critique an argument.” There has been some concern raised about the holistic assessment methods of the test not accurately measuring the component cognitive skills of critical thinking, and some critique of the grading method of the CLA32. There has also been some concern that CLA results are not suitable for comparison at the individual student level, with testing results suitable only for institution-level measures33. A final concern is that the CLA is typically used to assess longitudinal development of CT and is not recommended for measurement across a course experience, which affects the sustainable use of the instrument. Despite these potential challenges, the CLA is a comprehensive assessment, with tasks requiring the identification, integration and use of multiple skills and critical thinking concepts. The CLA is well aligned with the application of critical thinking skills to solve complex engineering problems, and the tasks presented within the CLA are similar in nature to complex engineering problems: given a scenario and supporting information of varying pedigree on which to base analysis, students must provide a well-reasoned solution, conclusion or recommendation. While these tasks may not be fully representative of the scale and complexity of engineering problems, they require the application of the same skills involved in the more comprehensive engineering problems. Overall, any of the reviewed frameworks would be suitable for use in developing CTS in engineering, due to their generic nature.
However, explicit instruction in how to apply the elements of the framework towards solving complex engineering problems should be provided. The adoption of the companion assessments is more complex, as there is a distinct difference between the application of CTS in the assessment tasks and the application of CTS to solve complex engineering problems. This misalignment raises
concerns regarding the accuracy and suitability of standardized instruments such as the CCTT and ICTT. The CLA, being a holistic, performance-based assessment of CTS, is well aligned with the application of CTS to solve complex engineering problems. Despite this alignment, the CLA does not offer a means for the development of CTS, nor does it provide a suitable means for the sustainable assessment of CTS in a course experience. With this in mind, in order to provide a valid, authentic and sustainable means to simultaneously develop and assess critical thinking within a course experience, a realistic, contextually relevant, performance-based intervention such as the MEA is ideal.

4. Model Eliciting Activities

MEAs have been used in engineering education at the university level for the past decade10,11,34-36. MEAs have shown promising results in developing students’ topical conceptual understanding, information fluency, problem solving and communication skills10. MEAs require students to draw upon prior knowledge, often help to identify and address misconceptions in the course of learning, and promote connections between information. There is no explicit framework of thinking skills embedded within the MEAs, leaving the instructor free to carefully adapt and align a selected framework to provide scaffolding, structure and guidance for the process by which students solve the MEA. MEAs are designed according to a set of six principles outlined below35,37:
1) Model construction: The activity requires the construction of an explicit description, explanation or procedure for a mathematically significant situation.
2) Reality: Requires the activity to be posed in a realistic engineering context and to be designed so that the students can interpret the activity meaningfully from their different levels of mathematical ability and general knowledge.
3) Self-assessment: The activity contains criteria that students can identify and use to test and revise their current ways of thinking.
4) Model documentation: Students are required to create some form of documentation that will reveal explicitly how they are thinking about the problem situation.
5) Construct share-ability and re-usability: Requires students to produce solutions that are shareable with others and modifiable for other engineering situations.
6) Effective prototype: Ensures that the model produced will be as simple as possible yet still mathematically significant for engineering purposes.
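The self-assessment principle above describes an iterative generate-test-revise cycle. As a minimal sketch of that cycle, the following illustration is not part of any published MEA protocol; the function names, error threshold and iteration cap are all hypothetical placeholders for whatever model, test criteria and revision strategy a given MEA involves.

```python
def iterative_model_development(build_model, test_model, revise_model,
                                acceptable_error=0.05, max_iterations=10):
    """Illustrative sketch of the generate-test-revise cycle.

    build_model, test_model, revise_model and the thresholds are
    hypothetical stand-ins, not part of the MEA design principles.
    """
    model = build_model()                   # generate an initial model
    error = test_model(model)               # test it against the criteria
    iterations = 0
    while error > acceptable_error and iterations < max_iterations:
        model = revise_model(model, error)  # revise the current thinking...
        error = test_model(model)           # ...and test the revised model
        iterations += 1
    return model, error


# Toy usage: a "model" that is just a number being revised toward a target.
model, error = iterative_model_development(
    build_model=lambda: 0.0,
    test_model=lambda m: abs(2.0 - m),
    revise_model=lambda m, e: m + e / 2,
)
```

The structure, not the arithmetic, is the point: the loop terminates either when the students' self-assessment criteria are satisfied or when further revision is no longer practical.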
MEA instruction places considerable emphasis on the process used to solve the problem and the reasoning and thinking students used to develop their solutions, rather than on the product of that methodology. The solution of an MEA requires participants to apply and combine multiple engineering, physics or mathematical concepts drawn from their educational experience and previous background to formulate a general mathematical model that can be used to solve the problem. Students typically employ an iterative approach to the MEA, first generating a model, then testing and revising it to develop a suitable solution38. The students’ solutions to the MEA typically take the form of a comprehensive report outlining the process used to generate their solution to the problem. There have been several studies investigating the impact of MEA instruction on student learning outcomes and general skill development. These studies have shown that MEAs:
1) Encourage a different perspective regarding the use of engineering concepts, with students applying concepts to achieve a broad, high-level solution rather than a low-level formulaic, rote approach10.
2) Encourage students to work collaboratively and cooperatively as a group, honing teamwork and interpersonal skills and delivering a higher quality solution than individual submissions39.
3) Encourage integration and synthesis of information and concepts spanning engineering and other disciplines9.
4) Encourage reasoning and higher-order thinking skills through the ill-structured and complex nature of MEA instruction40.
These benefits lead to a more meaningful learning experience for students by engaging them in an exercise that reflects professional engineering practice. This meaningful learning experience helps foster both higher-level skills and desired outcomes of complex problem solving, communication, information literacy and critical thinking, and provides a developing framework for the assessment of critical thinking through solving complex engineering problems, outlined in Figure 4.
Figure 4. Critical Thinking Assessment Using MEAs
A rubric for the assessment of CTS was developed, guided by the developed framework, our definition of CTS in solving complex engineering problems, the design principles of the MEAs and the results of previous MEA studies. Each outcome of the rubric measures a particular facet of critical thinking skills applied to solve complex engineering problems. Taken together, these form an overall measure for critical thinking skills used in solving engineering problems.
Outcome: Description

Information Summary: Accurately summarizes relevant information pertaining to the problem (background, contextual, content and methodological information), and includes an assessment of the credibility, uncertainty and biases of the information and its source.

Solution Generation: Creates, compares and contrasts quantitative models using approximations and assumptions generated from a justified problem solving process supported by information.

Interpreting Results: Evaluates the validity of both the model and its results for error and uncertainty, drawing well-supported conclusions to support and strengthen the solution.

Critical Evaluation: Critically assesses conclusions on the basis of the intellectual standards of clarity, precision, accuracy, relevance, logicalness, breadth, depth, significance, completeness and fairness.

Argumentation: Rationally supports claims and conclusions with data and a comprehensive description of the context in which they apply.

Communication: Information is clearly and concisely presented, demonstrating consistent use of important engineering and technical reporting conventions, including organization, content, presentation and stylistic choices.
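To illustrate how per-outcome scores under such a rubric might be combined into the overall measure described above, the following sketch assumes a hypothetical 0-3 scale per outcome and equal weighting; neither the scale nor the weighting is specified by the rubric itself.

```python
# Hypothetical scoring sketch for the critical thinking rubric above.
# The 0-3 per-outcome scale and the equal weighting are illustrative
# assumptions, not part of the authors' rubric.
OUTCOMES = [
    "Information Summary", "Solution Generation", "Interpreting Results",
    "Critical Evaluation", "Argumentation", "Communication",
]

def overall_ct_score(scores):
    """Combine per-outcome scores into a single overall CT measure."""
    missing = [o for o in OUTCOMES if o not in scores]
    if missing:
        raise ValueError("missing outcomes: %s" % missing)
    return sum(scores[o] for o in OUTCOMES) / len(OUTCOMES)


# Usage: a submission scored 2 ("proficient", hypothetically) everywhere.
submission = {o: 2 for o in OUTCOMES}
print(overall_ct_score(submission))  # prints 2.0
```

Requiring every outcome to be present before aggregating mirrors the paper's point that the outcomes are only an overall measure when taken together.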
5. Discussion

Each of the frameworks for critical thinking presented in the previous sections was developed from a sound and reasonable definition of critical thinking. Each of these definitions reflects the application of critical thinking skills in solving complex engineering problems, although some are considerably more vague than others. Ultimately, the adoption of a critical thinking framework to provide explicit instruction within engineering is suitable, as long as the framework can be adapted or modified for use in an engineering context, and significant instruction is provided in how to apply the aspects of critical thinking defined by the framework in a manner consistent with those expected in solving engineering problems. Being able to reliably and validly assess critical thinking is paramount. Using standardized instruments may not be the best way to assess CTS in a course environment for a variety of reasons, including alignment, accuracy, practicality and sustainability. Standardized instruments are typically external to course activities and are viewed as superfluous by students, which can lead to disengagement, motivational issues, and questionable assessment results16. The majority of these instruments must be purchased, which can be a barrier for courses without significant resources. In the case of essay-based testing, time and resources for grading, along with training and establishing inter-rater reliability, add to the potential costs22. The use of standardized instruments to assess critical thinking provides scant formative feedback to students, which is essential for the successful instruction and development of thinking skills such as critical thinking41. These factors severely limit the practicality and sustainability of standardized instruments for the course-based assessment of critical thinking. Maintaining alignment between standardized instruments and instructional objectives is another area of concern.
The prompts in standardized instruments are crafted to assess the dimensions of the framework on which they are based. An assessment structured in this fashion may only measure how well the student applies CTS according to a framework, and does not measure the CTS required to solve complex engineering problems22. This calls into question the accuracy of the test for measuring critical thinking in an engineering context. It parallels the criterion sampling example presented by Shavelson: if you want to know whether someone understands the rules and mechanics and can also drive a car, don't give them a multiple-choice test; have them perform a task that demonstrates the skills you want to assess. In the case of engineering, if we want to assess how engineers apply critical thinking skills to solve complex problems, we should provide a task that requires them to demonstrate those skills to solve an engineering problem. The authors believe that MEAs are such a task, and represent a promising approach for measuring critical thinking in engineering. The assessment framework and rubric presented earlier provide a practical, course-embedded means for the authentic, rigorous and sustainable development and assessment of critical thinking skills. The tasks presented in the MEAs are drawn from professional practice and require
students to create and use a mathematical model of a physical system using a numerical computation tool (MATLAB) and to deal with professional issues including ethical dilemmas, conflicting information and incorrect or missing information. While each MEA requires students to employ different areas of subject knowledge, students are taught to approach all three MEAs using critical thinking skills. For example, students are guided to draw concept maps, question the credibility of information sources, incorporate a range of factors into their decision-making and consider the implications of their conclusions. These skills are what Paul calls “elements” of critical thinking – invaluable thinking processes involved in any complex problem-solving activity24. The embedded MEAs are virtually indistinguishable from regular coursework to students, mitigating motivation and engagement issues. Submissions are graded by teaching assistants and course personnel using the rubrics, with training and inter-rater reliability sessions provided, requiring no additional resources. Formative feedback is provided to students with each MEA submission to help them improve their application of CTS. These are clear benefits of MEAs over standardized instruments as a practical and sustainable means of assessing critical thinking in engineering. Expectations are clearly presented to the students through the rubric; explicit instruction in the application of critical thinking skills to solve complex engineering problems is provided, alongside formative feedback to assist in CTS development. Like any assessment, the MEAs are far from perfect. Developing and integrating MEAs and critical thinking instruction in a course requires a substantial commitment of effort by the course instructor. While sample MEAs are available, they are not applicable to all disciplines, which requires the instructor to develop an MEA of appropriate context and difficulty for their course. 
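The grading scheme above depends on establishing inter-rater reliability among the teaching assistants scoring each submission. As a hedged illustration (not the authors' actual procedure), one common chance-corrected agreement statistic for two raters is Cohen's kappa; the rater scores below are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal score counts
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[s] * counts_b[s] for s in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical scores from two TAs rating the same ten MEA submissions
# on a four-level rubric (1 = marginal ... 4 = exemplary)
ta1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
ta2 = [3, 2, 4, 2, 1, 2, 3, 4, 3, 3]
print(f"kappa = {cohens_kappa(ta1, ta2):.2f}")  # kappa = 0.71
```

Tracking a statistic like this across training sessions gives a concrete target for when graders agree well enough (beyond raw percent agreement, which ignores chance) to begin scoring independently.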
The inclusion of critical thinking instruction requires that the instructor be very familiar with the framework and continually reflect upon, evaluate and revise their own thinking to best instruct their students. MEAs are challenging for students, and providing effective formative feedback in an efficient and timely manner to help future submissions is difficult. Relatedly, accuracy and reliability in grading are a concern, and careful consideration should be paid to establishing and maintaining inter-rater reliability. In conclusion, the authors believe that MEAs provide a platform for the practical, rigorous, authentic and sustainable development and assessment of critical thinking within a course experience. The MEAs provide a real-world engineering scenario in which students can practise the thinking skills that will be required of them by the profession, employers and society, and simultaneously provide stakeholders with an accurate and authentic measure of student performance. Future work includes establishing rubric validity and reliability, further developing the MEAs and the assessment framework, and investigating the use of alternate frameworks for critical thinking instruction. Lastly, it should be noted that while this approach is an improvement over standardized testing at the course level, program- and institution-level assessment can still benefit from the use of standardized instruments or other approaches for the generic assessment of higher order skills.
Bibliography
1. D. Bok, Our Underachieving Colleges. Princeton University Press, 2006.
2. D. Kuhn, “A Developmental Model of Critical Thinking,” Educational Researcher, vol. 28, no. 2, pp. 16–46, Mar. 1999.
3. R. Arum and J. Roksa, Academically Adrift: Limited Learning on College Campuses. Chicago, IL: University of Chicago Press, 2011.
4. Hart Research Associates, It Takes More Than a Major: Employer Priorities for College Learning and Student Success. Washington, DC: American Association of Colleges and Universities and Hart Research Associates, 2013.
5. M. S. Roth, “Beyond critical thinking,” The Chronicle of Higher Education, 2010.
6. R. W. Paul, L. Elder, and T. Bartell, “California Teacher Preparation for Instruction in Critical Thinking: Research Findings and Policy Recommendations,” 1997.
7. A. P. Finley, “How Reliable Are the VALUE Rubrics?,” Peer Review, vol. 13, no. 4, 2012.
8. L. J. Shuman, “AC 2012-3847: CCLI: Model Eliciting Activities,” presented at the Proceedings of the ASEE Annual Conference, 2012.
9. T. P. Yildirim, L. Shuman, M. Besterfield-Sacre, and T. Yildirim, “Model eliciting activities: assessing engineering student problem solving and skill integration processes,” International Journal of Engineering Education, vol. 26, no. 4, pp. 831–845, 2010.
10. L. J. Shuman and M. Besterfield-Sacre, “The model eliciting activity (MEA) construct: moving engineering education research into the classroom,” presented at the 9th Biennial ASME Conference on Engineering Systems Design and Analysis, Haifa, Israel, 2008.
11. J. A. Kaupp and B. Frank, “Investigating the Impact of Model Eliciting Activities on Development of Critical Thinking,” presented at the 120th ASEE Annual Conference & Exposition, Atlanta, 2013, pp. 1–22.
12. P. A. Ralston and C. L. Bays, “Refining a Critical Thinking Rubric for Engineering,” presented at the Proceedings of the ASEE Annual Conference and Exposition, 2010, pp. 1–16.
13. P. A. Ralston, A. E. Larson, C. L. Bays, Philosophy Documentation Center, “An Assessment of Undergraduate Engineering Students’ Critical Thinking Skills Guided by the Paul-Elder Critical Thinking Framework,” Inquiry: Critical Thinking Across the Disciplines, vol. 26, no. 3, pp. 25–32, 2011.
14. R. H. Ennis and E. E. Weir, “The Ennis-Weir Critical Thinking Essay Test: An Instrument for Teaching and Testing,” 1985.
15. I. D. Clark and K. Norrie, “Research and Reluctance in Improving Canadian Higher Education,” 2012.
16. J. A. Kaupp, B. Frank, and A. Chen, “Investigating the Impact of Model Eliciting Activities on Development of Critical Thinking,” presented at the Proceedings of the Canadian Engineering Education Association, Montreal, 2013, pp. 1–7.
17. C. L. Frisby, “Construct Validity and Psychometric Properties of the Cornell Critical Thinking Test (Level Z): a Contrasted Groups Analysis,” Psychological Reports, 1992.
18. R. Benjamin and M. Chun, “A New Field of Dreams: The Collegiate Learning Assessment Project.,” Peer Review, vol. 5, no. 4, pp. 26–29, 2003.
19. R. H. Ennis, J. Millman, and T. N. Tomko, “Cornell Critical Thinking Tests Level X & Level Z: Manual,” 1985.
20. R. H. Ennis, “Critical thinking assessment,” Theory Into Practice, 1993.
21. P. C. Abrami, R. M. Bernard, E. Borokhovski, A. Wade, M. A. Surkes, R. Tamim, and D. Zhang, “Instructional Interventions Affecting Critical Thinking Skills and Dispositions: A Stage 1 Meta-Analysis,” Review of Educational Research, vol. 78, no. 4, pp. 1102–1134, Dec. 2008.
22. K. Ku, “Assessing students' critical thinking performance: Urging for measurements using multi-response format,” Thinking Skills and Creativity, vol. 4, no. 1, pp. 70–76, 2009.
23. D. A. Bensley and M. P. Murtagh, “Guidelines for a Scientific Approach to Critical Thinking Assessment,” Teaching of Psychology, vol. 39, no. 1, pp. 5–16, Jan. 2012.
24. R. Paul and L. Elder, A Guide for Educators to Critical Thinking Competency Standards: Standards, Principles, Performance Indicators, and Outcomes with a Critical Thinking Master Rubric, vol. 8. Foundation for Critical Thinking (www.criticalthinking.org), 2006.
25. R. Paul and L. Elder, International Critical Thinking Test. Foundation for Critical Thinking (www.criticalthinking.org), 2010.
26. R. Paul and L. Elder, “Consequential validity: using assessment to drive instruction,” Foundation for Critical Thinking (www.criticalthinking.org), 2007.
27. R. J. Niewoehner, “Critical Thinking in the Engineering Enterprise,” PE Magazine, no. November, pp. 16–17, 2008.
28. P. A. Ralston and C. L. Bays, “Enhancing Critical Thinking Across The Undergraduate Experience: An Exemplar From Engineering,” American Journal of Engineering Education (AJEE), vol. 4, no. 2, pp. 119–126, Jan. 2014.
29. K. T. Taube, “Critical thinking ability and disposition as factors of performance on a written critical thinking test,” The Journal of General Education, vol. 46, no. 2, pp. 129–164, 1997.
30. R. J. Shavelson, G. P. Baxter, and X. Gao, “Sampling variability of performance assessments,” J Educational Measurement, vol. 30, no. 3, pp. 215–232, 1993.
31. R. J. Shavelson, “The collegiate learning assessment,” Ford Policy Forum, 2008.
32. K. Possin, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test,” The Critical Thinking Lab, Winona, MN, 2013.
33. S. Klein, O. L. Liu, and J. Sconing, “Test Validity Study (TVS) Report,” 2009.
34. H. A. Diefes-Dux, T. Moore, J. Zawojewski, P. K. Imbrie, and D. Follman, “A framework for posing open-ended engineering problems: model-eliciting activities,” presented at the 34th Annual Frontiers in Education Conference (FIE 2004), 2004, pp. –460.
35. T. Moore and H. Diefes-Dux, “Developing model-eliciting activities for undergraduate students based on advanced engineering content,” FIE, 2004.
36. B. Frank and J. A. Kaupp, “Evaluating Integrative Model Eliciting Activities in First Year Engineering,” presented at the Proceedings of the Canadian Engineering Education Association, Winnipeg, MB, 2012.
37. R. Lesh and H. M. Doerr, “Symbolizing, communicating, and mathematizing: Key components of models and modeling,” Symbolizing and communicating in …, 2000.
38. R. Lesh and H. M. Doerr, “Foundations of a model and modeling perspective on mathematics teaching, learning, and problem solving,” in Beyond Constructivism: Models and Modeling Perspectives on Mathematics Problem Solving, Learning, and Teaching, pp. 3–33, 2003.
39. A. A. Gokhale, “Collaborative Learning Enhances Critical Thinking,” Journal of Technology Education, vol. 7, no. 1, 1995.
40. S. A. Chamberlin, “Analysis of interest during and after model eliciting activities: A comparison of gifted and general population students,” 2002.
41. S. Bailin, R. Case, J. R. Coombs, and L. B. Daniels, “Common misconceptions of critical thinking,” Journal of Curriculum Studies, vol. 31, no. 3, pp. 269–283, May 1999.
Paper ID #9382
A thematic analysis on critical thinking in engineering undergraduates
Miss Amy Elizabeth Bumbaco, University of Florida
Amy Bumbaco is a PhD candidate in the Materials Science and Engineering Department at University of Florida, USA. She is working on engineering education research as her focus. Her current research interests include first year engineering education, critical thinking, qualitative methodologies, and peer review. She received her BS in Materials Science and Engineering at Virginia Tech. She founded an ASEE student chapter at University of Florida and is currently an officer of the chapter and continues sharing engineering education research with fellow members.
Dr. Elliot P. Douglas, University of Florida
Elliot P. Douglas is Associate Professor of Materials Science and Engineering, Dean’s Fellow for Engineering Education, and Distinguished Teaching Scholar at the University of Florida. He conducts research in the areas of engineering problem-solving, critical thinking, active learning, and qualitative methodologies.
© American Society for Engineering Education, 2014
A thematic analysis on critical thinking in engineering undergraduates
Abstract
This qualitative research paper examines the meaning and enactment of critical thinking for
engineering undergraduate students. Though critical thinking is considered an important topic in
the engineering community, research on the topic has mostly been limited to measuring critical thinking
in the classroom, and the definitions used are not empirically based. Thus, in this paper we seek to
provide an initial exploration of what critical thinking is in engineering. We address the
following research question: How do undergraduate engineering students perceive and enact
critical thinking? Semi-structured interviews were conducted on the enactment of critical
thinking and analyzed using a thematic analysis. Main themes that arose from the interviews
included: difficulty articulating critical thinking ideas, relating critical thinking to engineering
course concepts (especially problem solving), communicating with others, disposition to think
critically, metacognition, challenges of critical thinking in the classroom, and critical thinking
varying in other disciplines and majors. Problem solving concepts prevailed in many of the
themes. Although themes connected with many of the ideas present in current definitions of
critical thinking, most students did not mention concepts of clarification, credibility,
generalization, or recognizing assumptions. Participants also emphasized a broader idea of
communication and stronger reliance on real world context in critical thinking than previously
established by critical thinking definitions.
Introduction
Academics value critical thinking in the development of any student.
However, in their book Academically Adrift, Richard Arum and Josipa Roksa argued
that critical thinking may not be learned by students in undergraduate programs.1 After
emphasizing how little students gain in the four years of college, Arum and Roksa stated more
generally that: “While [students] may be acquiring subject-specific knowledge or greater self-
awareness on their journeys through college, many students are not improving their skills in
critical thinking, complex reasoning, and writing.” (p. 36) Before their book created a renewed
interest in critical thinking, ABET EAC criteria and the NAE report The Engineer of 2020
created criteria and attributes that focused on what engineering students needed to do for the
future. The NAE outlined the following important attributes: “strong analytical skills, creativity,
ingenuity, professionalism, and leadership.”2,3
These attributes and the EAC criteria do not
identify critical thinking directly; however, the listed skills relate to common ideas of critical
thinking. Due to this drive for improvement, many engineering programs and departments have
begun to incorporate critical thinking into their goals for student outcomes or into their mission
or vision statements.4–11
It is important to study how these goals can be met and what is being
done in higher education to achieve them.
Many definitions of critical thinking exist.12–24
One of the commonly used definitions is that of
the “Delphi Report” which defines critical thinking as “purposeful, self-regulatory judgment
which results in interpretation, analysis, evaluation, and inference, as well as explanation of the
evidential, conceptual, methodological, criteriological, or contextual considerations upon which
that judgment is based.”14,15
Mason’s simplified framework of critical thinking, based on many
of the existing philosophical approaches to critical thinking, includes the following aspects21:
- The skills of critical reasoning (such as the ability to assess reasons properly);
- A disposition, in the sense of:
  o A critical attitude (scepticism [sic], the tendency to ask probing questions) and
    the commitment to give expression to this attitude, or
  o A moral orientation which motivates critical thinking;
- Substantial knowledge of particular content, whether of:
  o Concepts of critical thinking (such as necessary and sufficient conditions), or of
  o A particular discipline, in which one is then capable of critical thought. (pp. 343-344)
Most definitions of critical thinking are not empirically based and they are rarely specific to
engineering. Recently, critical thinking experts have created guides on critical thinking. Paul and Elder
created a ‘mini’ guide for critical thinking in the classrooms25
and expanded their work to a
guidebook for critical thinking in engineering called Thinker’s Guide to Engineering Reason.26
Following up on Paul et al.26
and the work of the critical thinking foundation, Van Gyn et al.
also created a guideline for critical thinking for engineering.27
These were created based on
common definitions and engineering concepts, but not based on empirical research. Although
these guides allowed operationalization in some recent engineering studies,28–30
critical thinking
research in engineering is generally conducted without a structured definition of critical thinking
and most commonly aims simply to measure critical thinking in the classroom.
The few definitions or structured guides of critical thinking that are empirically based are built
from input received from a variety of faculty, including engineering faculty.31–34
However,
research on the meaning or ideas of critical thinking for students is not present in the literature.
Critical thinking is often incorporated without gaining student input, especially on how they best
enact critical thinking. How do we know if students are enacting and understanding the critical
thinking that faculty members and departments intend to instill? Are students understanding and
learning the same critical thinking as defined by these guides, definitions, or faculty members?
Do students enact critical thinking only in particular environments or situations? Understanding
students’ perception and enactment may create a foundation for more efficient implementation of
critical thinking in the future. To answer these questions and help students learn more
effectively, gaining student input and understanding student perspectives is necessary.
Thus, in this paper we seek to provide an initial exploration of what critical thinking is in the
engineering classroom. This research paper examines the meaning and enactment of critical
thinking for engineering undergraduate students. We address the following research question:
How do undergraduate engineering students perceive and enact critical thinking?
Methodology
This study is the pilot phase of a larger project aiming to understand critical thinking for students
and faculty in humanities and engineering. Since this is part of a larger study, only
one discipline and five students were examined. Further disciplines will be studied as part of the
full study. In this paper, we examine the enactment and meaning of critical thinking for materials
science and engineering students. Students were selected by requesting participants in a required
senior materials science and engineering course and asking students to email back if interested.
The first five students to respond were selected. Semi-structured interviews were conducted with
these five students. These interviews focused on the ways in which students used critical
thinking in their engineering classes and what critical thinking means to them. Interviews were
analyzed using a thematic analysis. Statements in the interview transcriptions were coded with
descriptive labels. These codes were then categorized with similar concepts. When more than
half the students expressed similar concepts, the category became a major theme. This includes
concepts with opposing views. For instance, in the theme critical thinking varying in other
disciplines and majors, three students mentioned some aspect of critical thinking in different
disciplines but not all had the same view. One student said critical thinking was the same for
everyone with just a different knowledge base, another student felt there is a difference in
thinking between engineering majors, and a third student had a view somewhat in between these
two. Based on these related or opposite aspects, the general idea was supported by more than two
students and a theme was created. However, if a concept was only mentioned by two students
and no related or opposing ideas were mentioned, the concept was not considered a theme at this time.
With further data collection and analysis of other students more themes may arise.
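The majority-of-participants threshold described above can be sketched in a few lines. This is a simplified illustration, not the authors' analysis tooling: the student labels and codes are hypothetical, and the sketch does not model the grouping of opposing views on the same concept into one theme.

```python
def major_themes(codes_by_student, threshold=0.5):
    """Return concepts coded for more than `threshold` of the students."""
    n = len(codes_by_student)
    counts = {}
    for codes in codes_by_student.values():
        for concept in set(codes):  # count each student at most once per concept
            counts[concept] = counts.get(concept, 0) + 1
    return sorted(c for c, k in counts.items() if k / n > threshold)

# Hypothetical descriptive codes for the five interviewed students
coded = {
    "S1": ["problem solving", "metacognition", "background knowledge"],
    "S2": ["problem solving", "communication"],
    "S3": ["problem solving", "metacognition", "communication"],
    "S4": ["problem solving", "background knowledge", "communication"],
    "S5": ["problem solving", "metacognition"],
}
print(major_themes(coded))
# → ['communication', 'metacognition', 'problem solving']
```

Here "background knowledge" appears for only two of five students, so it falls below the more-than-half threshold and is not promoted to a theme, mirroring the rule stated above.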
Findings
Many themes appeared in the data. The main types of themes include the following categories:
difficulty articulating critical thinking ideas, relating critical thinking to engineering course
concepts, communicating with others, disposition to think critically, using metacognition,
challenges of critical thinking in the classroom, and critical thinking varying in other disciplines
and majors.
Difficulty articulating critical thinking ideas
Many of the students expressed confusion and felt unsure when addressing thoughts on critical
thinking. They showed poor ability to articulate how they used or viewed critical thinking.
Interviews included contradicting thoughts and direct statements about not knowing how to
express their views. For example when one student discussed his way of reasoning, he stated,
“Okay. Um, (pause) well, I mean, you have to, well, I mean, I consider the multiple aspects that,
um, are, it’s hard to phrase, let’s see.” Another student struggled to verbalize her thoughts when
discussing teaching critical thinking. “So I think, I think they’re teaching something in a fashion
that makes sense to people. I’m not sure how to explain what I mean. Um. (pause)…”
Relating critical thinking to engineering course concepts
Although students struggled with articulating their thoughts, students did tend to relate ideas of
critical thinking to engineering concepts they deal with in the classroom. These engineering
course concepts include: applying a framework/plan; weighing, selecting, and testing options
(selection and design); using background knowledge; and using problem solving. For instance
one student explained the critical thinking process in a design course as:
There’s a coach but no one tells you what to do or how to solve the problem. You’re
expected to understand the problem, come up with possible solutions, select those
solutions, or select the best couple solutions, test them and you know, at the end of the
year design the products.
Every student directly mentioned needing a knowledge base or background. For example, “And
then start working off your basis of knowledge, you know, what do you know for the material
selection process, you know, what materials just come to mind when you think of things, start
looking into those.”
Many of these course concepts confirm the students’ idea that critical thinking was similar or
equivalent to problem solving. Though students believed in many different styles of problems
and answers, including a right answer and an open-ended problem with many or no answers,
problem solving in general was mentioned by every student. As one student explained, “Um, I
mean, it seems like I said to me critical thinking is just like, you know, problem solving. It’s
taking everything that you know and applying it to narrow down to your solution.” Problem solving included five sub-concepts: figuring out what the problem is, figuring out why
something is happening, solving in an orderly way, applying to a real world context, and
reaching a conclusion/solution. Students found defining the problem as the first step:
I think like again this is at least in my mind the way I think about it, um, the first thing is
to understand the, the problem or whatever it is that you have observed uh and then when
you’re developing um, the plan on how to fix it you want to address whatever the root
cause is. So that’s why understanding the problem is so important.
Many students mentioned figuring out why something occurred: “So you already have some idea
of where you’re going. You already may know what you’re looking for but something might
shock you or surprise you and then you have to go back and figure out okay, why did this
occur?” Participants also valued applying problems to the real world: “Like because it’s a real
world problem and I think the only way you can train people to solve real world problems is by
giving them real world problems.” Students also shared a similar idea of the final step to critical
thinking, reaching a conclusion or solution. As one student stated, “Well, I would define critical
thinking as the employment of reason in order to reach a conclusion especially in regards to
problem solving.”
Disposition to think critically
Beyond what was taught in the classroom, many students saw the necessity of particular
personality traits. Some students mentioned needing curiosity or interest or desire to think
critically. One student mentioned: “Um, like practical hands-on projects because it would get
someone interested in it and you like make them feel like they’re actually doing something to
motivate them to actually think critically.”
Using metacognition
A few students also discussed having the ability to think about their thinking or using self-
checking as an important trait towards critical thinking: “Metacognition, like um, like is, is my
way of thinking correct? Like is this, is this process actually effective and um, will lead to
correct answer or that kind of thing.”
Communicating with others
Students believed communication to be important for practicing and using critical thinking. For
example one student stated: “I mean, I guess it’s always good to explain to someone why you’re
thinking the way you think because like you could just make an informed decision, but if you
can’t tell someone why then that decision probably won’t become a reality.” Not only is
explaining to others important, but discussion with others and learning from them also holds
value to the participants. For example, one student explained this interaction:
So I think yeah, pulling in people from other disciplines, both engineering and even non-
engineering. They might say something that you think why wouldn’t that work but you
just would’ve never thought of it because you’re, you know, your mind is already kind of
going through the steps that you’re used to …
Critical thinking varying in other disciplines and majors
Learning from others was often achieved by interacting with students from different disciplines
and majors. A few students believed other engineering disciplines and non-engineering majors to
have differences in the process of thinking. For example:
Well, I think that engineers in general when they’re all grouped together, think in a very
mechanical way, I mean mechanical engineering, a pun intended I guess. But um, but you
know you think of okay what equation can I use or what mechanism should I apply to
this. Or um, you know, things like that. Um, but I think when you start to get into the
things that people don’t know as much about which is materials in general, then you start
to run into, okay, let’s think about this more in a simplistic way, or let’s think about this
more in a scientific way, but for a lot of engineering disciplines, it’s more of, you know,
the answer’s out there and we just have to figure it out.
However, others believed that critical thinking was the same for everyone with just a different
knowledge base. As mentioned by one student:
I don’t, I don’t know if there are necessarily critical, you know, critical thinking or
thinking differences between the two skill sets. Like I think the skills to answer you know,
like a psychology exam and answer an engineering exam, not like knowledge-wise but
like how to do it are maybe the same. It’s just that people’s background and you know,
influences them to see them differently.
Whether the students saw that critical thinking was the same or different between majors, many
agreed with the previous quote that background had an influence on how a student thinks.
Challenges of critical thinking in the classroom
The students also expressed views on the way critical thinking should be taught, indicating that
critical thinking seems to be challenging for their instructors to teach but that it is important,
especially for ‘real world’ work after academia.
They believed that the way to teach critical thinking was through engagement: “Um, well, like
for example, this is a really simple one but if they have a, ah, they’re teaching and they, they
kind of ask the question like okay, so what comes next …” Also this can be done through faculty
guidance or coaching as one student explained:
Exactly, exactly. And so it’s important to kind of help them along. I think they—I guess
what I’m trying to say is um, if you give someone up there a mechanical problem they’ll
solve it. They may gain very little, not lose really anything so it’s kind of a net zero loss.
Um, if you give someone a really abstract thing and they just kind of shut down then
that’s a loss and if you give them something abstract but they actually kind of work at it
and you help them along then that’s a gain.
However participants found that explicit teaching and assessing of critical thinking was
challenging in the classroom:
…um, so it’s, I think [implementation of critical thinking is] pretty rare because
assignments, it seems like they’re hard to design with like critical thinking embedded in
them. It’s definitely, like even just using the textbooks like most textbook questions are
just the, more of the equation type. Some are not so much. I’ve seen some that aren’t but
ah, typically, I think that critical thinking questions like the ones that really make you
think, ah, I mean, they tend to take longer.
Conclusions
As stated, students had difficulty expressing their views on critical thinking. Ahern et al.
similarly found that engineering faculty had difficulty with articulating critical thinking.34
Ahern
et al. also indicated that critical thinking is not explicitly discussed or addressed in engineering
classes by these faculty.34
The fact that faculty do not, or possibly cannot, make critical thinking
explicit may explain why the students in this study experienced difficulty expressing their
thoughts. Lack of direct exposure and direction by faculty may hinder students’ ability to
articulate and understand their own ideas on critical thinking.
Students coped with this gap by relating critical thinking to engineering concepts, including
using background knowledge and resources, criteria selection, and the engineering process of
approaching problems. These concepts, though usually based around class context and exercises,
connect to some general ideas of critical thinking including: identifying problems, comparing
ideas, evaluating, discovering alternatives, drawing conclusions, supporting with relevant and
adequate evidence, and involving content knowledge.14,15,20–22,24,31,33,35
It is important to mention
that many of these critical thinking concepts from the participants and the related concepts from
literature connect to problem solving as defined in the literature. Problem solving based on
Woods, for instance, includes the stages engage (“I want to and I can”), define the stated problem, explore,
plan, do it, and look back.36
There are criteria useful for critical thinking beyond engineering concepts and problem solving.
Non-engineering concepts that exist in the literature that were confirmed by the students
included metacognition,12
usually indirectly mentioned in the literature as part of the ability to
reason, and disposition.16,24,37
However, though present for a majority of the participants, these
concepts were not unanimously mentioned. Also Mason’s idea of a moral disposition was much
less apparent, not appearing as a theme from these interviews.21
Communication with others also helped participants solidify how they learned to think and broadened their ability to approach problems. The two forms of communication, learning from others and explaining to others, are not often included in formal definitions of critical thinking. The Delphi report does include the idea of explanation, and some other experts mention argumentation or presentation.15,20,22,24,31 Willingham's definition includes argumentation, mentioning "seeing both sides of an issue" and "demanding backing of claims."13 However, argumentation and presentation are not necessarily equivalent to discussion or learning from others.
As discussed, many main themes connected to the critical thinking concepts in the literature; however, some concepts were not prevalent in the data. Missing critical thinking concepts include clarification, credibility, and recognizing assumptions. Only one or two participants discussed these ideas, not a majority. Most students did not discuss checking the credibility of sources or recognizing assumptions during their problem-solving process or design project work.
Participants highly valued certain concepts beyond those previously discussed in the literature. From the students' perspective, faculty teaching through a real-world context and coaching the students promoted learning critical thinking. Existing literature does not emphasize these concepts when discussing the critical thinking process. Critical thinking in academia and in practice is valuable to these students, but to learn and understand critical thinking better, they need engagement from faculty and an emphasis on critical thinking from faculty and in the curriculum. Connecting critical thinking more explicitly to engineering concepts may help future students better understand the meaning of critical thinking and learn it more effectively. Future work will expand upon this study to build a grounded theory of critical thinking in engineering for both students and faculty. This pilot study was limited to one discipline; future work will include interviews with students in other engineering disciplines and with those disciplines' faculty members.
Bibliography
1. Arum, R. & Roksa, J. Academically Adrift: Limited Learning on College Campuses. (University of Chicago
Press, 2011).
2. National Academy of Engineering. Educating the engineer of 2020: adapting engineering education to the new
century. (National Academies Press, 2005).
3. National Academy of Engineering. The engineer of 2020: visions of engineering in the new century. (National
Academies Press, 2004).
4. Mission, Vision, and Values | Industrial and Systems Engineering | Virginia Tech. at
<http://www.ise.vt.edu/About/MissionVisionValues/MissionVisionValues.html>
5. Mission Statement | Harvard University. at <http://www.harvard.edu/faqs/mission-statement>
6. Calvin College. at <http://www.calvin.edu/academic/engineering/about/mission.html>
7. Mission Statements | Michigan Engineering. at
<http://www.engin.umich.edu/college/academics/bulletin/depts/cee/mission>
8. cms. Mission Statement — UCLA Mechanical and Aerospace Engineering. at
<http://www.mae.ucla.edu/about>
9. Mission & Vision Statement | Department of Electrical & Computer Engineering | Daniel Felix Ritchie School
of Engineering & Computer Science | University of Denver. at
<http://www.du.edu/rsecs/departments/ece/missionvision.html>
10. Murray State University > Engineering and Physics Department Mission Statement. at
<http://www.murraystate.edu/academics/CollegesDepartments/CollegeOfScienceEngineeringandTechnology/C
ollegeOfSciencePrograms/EngineeringPhysics/EPHYmission.aspx>
11. CEE Department Mission Statements | The Charles E. Via, Jr. Department of Civil and Environmental
Engineering | Virginia Tech. at <http://www.cee.vt.edu/academics/academics_mission_statements.html>
12. French, J. N. & Rhoder, C. Teaching Thinking Skills: Theory and Practice. (Garland Publishing, 1992).
13. Willingham, D. T. Critical Thinking: Why Is It So Hard to Teach? Arts Educ. Policy Rev. 109, 21–32 (2008).
14. Facione, P. A. Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and
Instruction. Research Findings and Recommendations. (1990). at
<http://www.eric.ed.gov/ERICWebPortal/detail?accno=ED315423>
15. Facione, P. A. Critical thinking: What it is and why it counts. Millbrae, CA: California Academic Press (2011).
16. Norris, S. P. Synthesis of research on critical thinking. Educ. Leadersh. 42, 40–45 (1985).
17. Norris, S. P. The generalizability of critical thinking: Multiple perspectives on an educational ideal. (Teachers
College Press, 1992).
18. Yinger, R. J. Can we really teach them to think? New Dir. Teach. Learn. 1980, 11–31 (1980).
19. Paul, R. W. Critical Thinking: Fundamental to Education for a Free Society. Educ. Leadersh. 42, n1 (1984).
20. Walsh, D. & Paul, R. W. The Goal of Critical Thinking: from Educational Ideal to Educational Reality. (1986).
at <http://eric.ed.gov/ERICWebPortal/recordDetail?accno=ED295916>
21. Mason, M. Critical thinking and learning. Educ. Philos. Theory 39, 339–349 (2007).
22. Ennis, R. H. A taxonomy of critical thinking dispositions and abilities. (1987). at
<http://psycnet.apa.org/psycinfo/1986-98688-001>
23. Watson, G. B. & Glaser, E. M. Watson-Glaser Critical Thinking Appraisal: Manual. (Psychological
Corporation, 1980).
24. Beyer, B. K. Practical strategies for the teaching of thinking. (ERIC, 1987). at
<http://eric.ed.gov/ERICWebPortal/recordDetail?accno=ED288824>
25. Paul, R. & Elder, L. The Miniature Guide to Critical Thinking - Concepts and Tools. 2, (Foundation for Critical Thinking, 2001).
26. Paul, R., Niewoehner, R. & Elder, L. The thinker's guide to engineering reasoning. (Foundation for Critical Thinking, 2006). at
<http://books.google.com/books?hl=en&lr=&id=AyoaIIIypxMC&oi=fnd&pg=PT5&dq=The+Thinker%27s+G
uide+to+Engineering+Reasoning+&ots=dYZ8Gh_Yac&sig=4kWzoM1IXUnqdrCZQBCLQwsUyC4>
27. Van Gyn, G., Ford, C. & Society for Teaching and Learning in Higher Education. Teaching for critical
thinking. (Society for Teaching and Learning in Higher Education = Société pour l’avancement de la pédagogie
dans l’enseignement supérieur, 2006).
28. Lewis, J. E. & Bays, C. Undergraduate Engineering Students and Critical Thinking: A Preliminary Analysis. in
Am. Soc. Eng. Educ. 2011 Annu. Conf. Expo. (2011).
29. Lewis, J. E., Hieb, J. & Wheatley, D. Introducing Critical Thinking to Freshman Engineering Students. in 2010
Annu. Conf. Expo. (2010). at
<http://search.asee.org/search/fetch?url=file%3A%2F%2Flocalhost%2FE%3A%2Fsearch%2Fconference%2F3
2%2FAC%25202010Full1278.pdf&index=conference_papers&space=129746797203605791716676178&type
=application%2Fpdf&charset=>
30. Romkey, L. The development and assessment of critical thinking for the global engineer. in Proc. 2009 Am.
Soc. Eng. Educ. Conf. (2009).
31. Stein, B., Haynes, A., Redding, M., Ennis, T. & Cecil, M. Assessing critical thinking in STEM and beyond.
Innov. E-Learn. Instr. Technol. Assess. Eng. Educ. 79–82 (2007).
32. Stein, B. et al. Faculty Driven Assessment of Critical Thinking: National Dissemination of the CAT Instrument.
Technol. Dev. Netw. Educ. Autom. 55–58 (2010).
33. Jegede, O. J. & Noordink, P. The Role of Critical Thinking Skills in Undergraduate Study as Perceived by
University Teachers across Academic Disciplines. (1993). at
<http://www.eric.ed.gov/ERICWebPortal/recordDetail?accno=ED362122>
34. Ahern, A., O’Connor, T., McRuairc, G., McNamara, M. & O’Donnell, D. Critical thinking in the university
curriculum – the impact on engineering education. Eur. J. Eng. Educ. 37, 125–132 (2012).
35. Mayhew, L. & Dressel, P. General Education: Explorations in Evaluation. (American Council on Education, 1954).
36. Woods, D. R. An Evidence-Based Strategy for Problem Solving. J. Eng. Educ. 89, 443–459 (2000).
37. Facione, P. A. Critical thinking: What it is and why it counts. Millbrae, CA: California Academic Press (1998).
Paper ID #8586
Advanced Student-Centric Learning Practices in Applied Engineering Programs
Prof. Ben D Radhakrishnan, National University
Prof. Ben Radhakrishnan is currently a full-time faculty member in the School of Engineering, Technology and Media (SETM), National University, San Diego, California, USA. He is the Lead Faculty for the MS Sustainability Management Program. He develops and teaches Engineering Management and Sustainability Management graduate level courses. Ben has taught Sustainability workshops in Los Angeles (Army) and San Diego (SDGE). His special interests and research include teaching methods (specifically Student-Centric Learning) and promoting leadership in sustainability and management practices. He is also an Affiliate Researcher at Lawrence Berkeley National Laboratory, Berkeley, CA, focusing on the energy efficiency of IT equipment in data centers. Before his teaching career, he had a very successful corporate management career working in R&D at Lucent Technologies and as the Director of Global Technology Management at Qualcomm. He initiated and managed software development for both companies in India. He holds MS in Engineering and MBA degrees.
© American Society for Engineering Education, 2014
Advanced Student-Centric Learning Practices in Applied Engineering
Programs
Abstract
Student-Centric Learning (SCL) has long been recognized as an effective methodology for engaging and motivating students. Research conducted by the National Training Laboratories and other leading universities has identified several benefits of SCL, including deeper learning, motivation, and teamwork. With the advent of high-speed internet technology, the availability of large databases brimming with information, and the rapid growth of online course offerings, SCL has been taken to a new level, empowering students like never before.
Different types of SCL techniques have been successfully implemented in the MS Sustainability Management, MS Engineering Management and MS Environmental Engineering programs. These graduate level applied engineering programs are offered through both onsite live classes and online classes. The specific SCL techniques used include:
- Knowledge-centric SCL, promoting development of critical thinking by applying learned outcomes to real-world problem solving
- Learner-centric SCL, encouraging students to be more creative and to use prior knowledge
- Assessment-centric SCL, providing opportunities for feedback and improvement
- Community-centric SCL, promoting interactions among learners for sustained learning
This paper presents specific examples of the advanced SCL techniques and exercises used in different online and onsite courses, along with the associated assessments. Charts with students' feedback on the SCL practices will also be presented; that feedback has been positive. The paper will also discuss the future of SCL, the opportunities it provides for student empowerment, and the rapid shift from passive to active learning with technology as a key enabler.
Introduction
The National Research Council (NRC), in its 2012 report to Congress, identifies the need to instill twenty-first century knowledge and skills - problem solving, critical thinking and communication - in order to navigate the rapidly changing world, and holds that these skills should promote "deeper learning" [1]. Deeper learning helps students better master the subject matter by going beyond routine learning, and thus helps them develop the knowledge and skills to solve problems in today's workplace. Students armed with these tools have a competitive edge.
Student-Centric Learning (SCL) practices are tools that promote students' deeper learning, empowering and engaging students (cf. the T4SCL Report by the European Students' Union and Education International, 2010 [12]). Universities have a unique responsibility to teach twenty-first century skills that are immediately applicable in workplaces. Bloomberg [13], in her many publications and research, has written about and trained faculty in 'learner-centered teaching', which results in SCL. Two key traits of SCL - empowerment and engagement - put the learning and teaching of these skills on center stage.
Technology, with its vast advancement and capabilities (e.g. the Internet, high-tech learning tools), has become a key enabler in today's teaching and learning environment, helping both students and instructors. The learning environment has gone from classes of tens of students to hundreds in 'onsite live' classes, while the online learning environment with Massive Open Online Courses (MOOCs) is registering tens of thousands of students. An online class, in addition to posing many challenges to instructors and students, by design provides an SCL environment.
SCL learning and teaching practices have been implemented in graduate level programs in the Applied Engineering department. This paper will discuss several of the practices used in three programs (MS Sustainability Management, MS Environmental Engineering and MS Engineering Management). Student feedback from several courses will also be presented and discussed. These SCL techniques were practiced in both onsite (live) classes and online classes. This paper will also discuss some of the faculty questions being raised regarding SCL and the future impacts of technology.
Student Centric Learning Practices Background
The literature credits the concept of SCL to Hayward and the writings of Dewey (1956), with wider recognition of the methodology coming during the 80's and 90's [2]. Early discussions focused on shifting power from the teacher to the student: empowering students, expanding and encouraging interaction among students, and changing the dominant information flow away from one-to-many (traditional instruction). In other well-known research, Craik and Lockhart showed that learning and retention are related to the depth of mental processing [3]. The practices and techniques of SCL engage students in a very active manner in which continuous mental processing is required, leading to higher retention of the subject matter.
The Council on Science and Technology at Princeton University has identified several methods of Student-Centered Teaching [4] (also referred to as Student Centric Instruction, SCI). These methods range from small group discussions to case studies to computer simulations and games (or gamification: the process of learning through games, referring to the design/creation, play and demonstration of a game in support of course learning outcomes). The objective of each of these practices or techniques is to get the student to engage and participate, making the learning 'active' rather than 'passive' (the traditional lecture with practically no interaction with the instructor or among the students). Others have also stated that when SCL is properly implemented, it can lead to increased motivation to learn, greater retention of knowledge, deeper understanding, and more positive attitudes towards the subject being taught [5].
One of the key reports released by the National Research Council recommended that learning environments can be organized into different categories or focus areas, namely knowledge-centered, learner-centered, assessment-centered, and community-centered [6]. Knowledge-centered learning mainly refers to students' ability to transfer their learning into critical thinking and problem-solving skills; the learner-centric approach refers to the ideas and information students bring on their own from prior learning or other experiences; the assessment-centered approach gives students quick feedback on how they are meeting the course learning outcomes, to help them improve (or turn around); community-centered learning refers to opportunities that promote interactions among students so they learn from each other. This category also gives each student the opportunity to become a teacher, fostering communication and teamwork.
Each of the SCL practices implemented and discussed in this paper can be identified with one of the above four categories. The environment and exercises provided in the different classes were designed to help students fully participate and be engaged - in other words, to be engaged in active mental processing. Table 1 summarizes some of the key differences between instructor-centric (passive learning) and student-centric (active learning) approaches. It is clear that the active learning process, or SCL, involves a high degree of mental processing - both individually and through interactions.
Passive Learning Process (Instructor-Centric) | Active Learning Process (Student-Centric)
Focus is on the instructor | Focus is on both students and instructor; high interaction between students and the instructor
Students work alone | Students work in teams or alone depending on the purpose of the activity
Instructor answers students' questions, if any | Students actively participate in class discussions and the instructor facilitates; the instructor initiates a class dialogue with an open-ended topic or question
Instructor chooses topics | Students choose or initiate topics for discussion

Table 1 – Summary of Differences between Passive and Active Learning Processes (SCL)
Another widely referenced learning pyramid, published by the National Training Laboratories, assigns an average learning retention rate to each type of learning method or practice - see Figure 2 [7]. Across the different methods, retention rates are much higher with active learning using SCL-type practices.
Figure 2 – Retention Rates and Learning Methods [7]
SCL in Technical Education
The literature is full of research on SCL practices implemented for teaching subjects from languages to science to engineering (or technical) education. At our university, SCL practices have been introduced in two departments - Computer Science and Applied Engineering. This paper covers the details of advanced SCL practices in the Applied Engineering department in the following graduate programs - MS Engineering Management (online and onsite), MS Sustainability Management (online and onsite) and MS Environmental Engineering (onsite). Although each of these programs addresses a different major, the SCL techniques or practices were similar and to some extent tailored to the courses in each program. The general theme is to facilitate high levels of student engagement and assess students on that engagement (feedback). Since technical education deals with more 'quantifiable' subject matter, SCL practices fit engineering courses really well.
Implementation of Advanced Practices of SCL
The different SCL practices implemented are framed within the four categories or areas noted earlier, and the specific practices implemented for each category are discussed. Note that the implementation of these practices is limited to the three specific programs in our school and is still in its initial stages at the graduate level - not yet implemented in all courses. The different SCL activities are discussed first, followed by students' feedback.
Knowledge-Centric SCL
The main learning focus here is on developing critical thinking and data synthesis. Two practices have been implemented for this SCL - 'News of the Day' and 'Debates'.
For 'News of the Day', onsite students in the different courses participate in this activity. The objective is to identify a recent development or event reported within the last 30-45 days in any relevant and reputable magazine, newspaper or journal. The news item or event should be directly related to, and meet, one or more course objectives, and a URL link should be available. Each student posts the URL link to the course home page (Wibliography) and speaks on the topic for 5 minutes, followed by a brief Q&A session. The activity has a grade allocation associated with it for class participation. Each student identifies the source, the event (or news), the location, how it ties to the current course, and its importance. The student also discusses quantification as applicable to the subject matter being taught. Students are also expected to interpret the news or event as they see it and give their opinion (positive or negative). As noted, this is an exercise in critical thinking and data synthesis (students can discuss other related events). The instructor publishes a schedule for all students, and the News of the Day discussion takes place at the beginning of each class (two or three students present in each class). This approach engages the students to be active and attentive from the start of the class.
As an example, in the Engineering Management Concepts course, students look for a management-related news item or event and present it; topics could include human resources management, engineering management, engineering ethics, etc. The student opens the URL for the class and starts the discussion. In the Sustainability courses, students bring news items (or events) directly relating to sustainability - energy, water, environment, policy, equity or economics. In the Globalization class, with so much going on in the world today, students bring the latest global events (as they impact the economy, culture, trade, environment, etc.). This SCL practice gets away from the expectation that the instructor comes in and starts a lecture while all the students have to do is listen passively. In the News of the Day SCL activity, students are engaged and active right from the beginning of the class.
The 'Debate' SCL activity is implemented in the Globalization class, where teams (or individuals) take a position on whether globalization is 'good' or 'bad' (a huge topic in today's world) and present their case to the class with a current example and quantification, followed by a brief Q&A session. Again, students engage quickly, add to the discussions, and refer to events from their own companies.
Although textbooks these days come out in new editions more often than before, this SCL activity keeps students informed about the latest happenings in the subject they are learning. Students' feedback comments reflect that they are eager to hear about the latest developments as they come to each class - they have a different expectation at the beginning of each class.
Learner-Centric SCL
The main learning focus here is on learners becoming 'creators' of their own ideas, which advances creativity and innovation and draws on any prior knowledge the learners bring to the table. This is a team activity: student teams design, create, play and demonstrate a game that is relevant to the subject matter and supports the course learning outcomes. In this SCL practice the course team project is quantified and demonstrated through the game. Teams are given full freedom (empowered) to create and innovate, and even to adapt an existing game to suit the project under study. This SCL has been successfully implemented in more than one course in the MS Sustainability Management program. It is perhaps one of the most advanced SCL practices used in the graduate classes, since real-world problems need to be well understood before they can be distilled into game objectives, rules and a winning strategy (the use of games in school education is well documented). Sustainability topics are relatively new in higher education, and some of the concepts (such as equity) are difficult for students to understand; making them practical and quantified through gamification brings the concepts home, and deep learning happens. Many students bring prior gaming experience to bear on the gamification process and help others who may not have that experience by teaching them.
In a gamification approach, student engagement, teamwork, innovation and competitiveness come into play. With a game, the subject matter is no longer just theory; it becomes practical and aids students' understanding. Students have to do significant research to come up with game rules and to quantify the subject matter. Everyone is actively involved or engaged in one or more aspects of a game, resulting in high mental processing and thus higher subject matter retention. This is also an opportunity to demonstrate leadership and teamwork skills. The approach has been implemented in both onsite and online courses. In addition, this SCL makes learning fun and provides opportunities for students to learn additional tools (e.g. game board design).
Assessment-Centric SCL
In addition to the standard assessment tools (e.g. quizzes and exams), this SCL approach centers around one or more course homework assignments leading to a specific deliverable on the real-world implementation of an advanced topic, such as the ISO 14001 Environmental Management System or renewable energy alternatives. Students present both the theory and an implementation.
This is a team-oriented SCL in line with the course learning outcomes. The students have to come up to speed on many of the ISO 14001 standard's details on their own (from a secondary textbook for the course), research an industry implementation that meets the standard, and present it to the whole class. The students are empowered to critique the implementation and recommend changes or improvements. This again is an advanced SCL at the graduate level that helps further develop critical thinking skills, and students are motivated since it is assessment-centric. The activity also broadens their careers through the nature of the work and learning involved. As most students would agree, just reading a standards textbook can be boring, but tying it to a real-world implementation brings the theory to life for presentation and discussion with their cohort.
Community-Centric SCL
The focus here is on learners themselves becoming teachers to some extent, which raises the level of mental processing (or learning) and increases interaction among learners. This can be an individual or team SCL activity. As noted earlier in the learning pyramid, the highest retention (90%) occurs when teaching others is involved. This approach puts students in a different mind-set - to explain, to answer questions, etc. A teaching SCL activity helps graduate students improve not only their preparation and presentation skills but also their concept-articulation skills, which is very important for a leadership position or for presenting to a company's C-level management. When a student team prepares a presentation with the idea of teaching it to their classmates, the preparation is thorough and team interactions are strong; idea exchanges and teaching happen, all of which inherently increase learning and retention.
Three SCL activities are practiced in this category: teaching sustainability principles with an example, an ice-breaker discussion activity, and students selecting textbook chapters to present/teach.
In the first SCL activity, student teams (or individuals) present the theoretical concepts of a sustainability principle (from a secondary course textbook): its official name, definition, origin, its specific principles, and an application or implementation that demonstrates the principle. Students need to think deeply about the sustainability concept and be able to explain it as if they were teachers. This content is included in their exams.
The second SCL activity, icebreaker discussions, is implemented primarily in the MS Engineering Management course, where issues in management and customer relationships, and their importance, are discussed at length. From a process perspective, the class is divided into two sections: one section is the management and the other is the customer (role playing). A real-life issue is presented by the instructor, and students discuss the issue from their role's point of view. Each section presents a potential solution to the other and negotiates. The roles are then switched for another real-life issue, so the students get to think beyond theory and put forward practical solutions. This live simulation, facilitated by the instructor, promotes management and negotiating skills at the graduate level.
In the third SCL activity in this category, student teams pick a chapter of their choice from the primary textbook and present it to the class (teaching it, of course), along with a real-world example highlighting the theme or core concept of the chapter (e.g. engineering ethics). The students also vote to pick the subset of these chapters that will be included in the exam - a truly democratic process of student empowerment: they decide what they are most interested in learning and being assessed on.
Student Feedback on SCL Activities
As part of the initial implementation, it was very important to get students' feedback on these graduate level advanced SCL practices. Surveys were taken at the end of the courses in which one or more of these SCL practices were implemented. Students were informed about the activities right at the beginning, as part of the course outline, along with the assessment points allocated to specific SCL practices (e.g. News of the Day). In these graduate level courses most of the students are working adults, so their feedback is significant and unique. The following charts show the feedback from different courses - responses to the specific questions asked in each course as they related to the SCL practices.
The charts in Figures 3-6 present the student feedback on the various categories of SCL practices, shown with 1 standard deviation error bars and sample sizes in each case (5 being the most favorable rating). It should also be noted that the statistical mode for all of this feedback was 5.
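The summary statistics behind these charts (average rating, one standard deviation for the error bars, and the mode) are straightforward to compute from raw Likert responses. The sketch below is illustrative only; the ratings list is hypothetical, since the paper does not publish its raw survey responses.

```python
from statistics import mean, mode, stdev

# Hypothetical 1-5 Likert ratings for one survey question
# (illustrative only; not the paper's raw data)
ratings = [5, 5, 4, 5, 3, 5, 4, 5, 5, 4, 5, 5, 4, 5, 5]

avg = mean(ratings)           # average rating: the bar height in the charts
sd = stdev(ratings)           # sample standard deviation: the error bar
most_common = mode(ratings)   # modal rating (the paper reports a mode of 5)

print(f"n={len(ratings)}, mean={avg:.2f}, sd={sd:.2f}, mode={most_common}")
```

For the 15 hypothetical ratings above this yields a mean of 4.60 and a mode of 5, mirroring the scale used in Figures 3-6.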
Figure 3 – Knowledge-Centric: Student feedback on 'News of the Day' (average rating 4.61) and 'Debates' (average rating 4.57); ratings out of 5, sample size = 15.
Figure 4 – Learner-Centric: Student feedback on an array of questions for gamification (learning through games); Game Design Methodology (GDM) is the gamification process. SUS 601 Introduction to Sustainability student feedback survey on GDM; sample size = 25; ratings 1 to 5, with 5 being most favorable. Average ratings (standard deviations) for the game design and creation activity:
a. Increased student motivation: 4.16 (1.03)
b. Increased student depth of learning: 4.34 (0.90)
c. Increased student engagement: 4.20 (1.04)
d. Increased team collaboration and communication: 4.32 (0.85)
e. Increased 'inter' and 'intra' team positive competition: 4.32 (0.90)
f. Increased student creativity and imagination: 4.56 (0.77)
g. Learning was fun through 'game creation and play': 4.32 (0.80)
h. Expect better retention of learned material through game creation and playing: 4.28 (0.98)
i. Games help to demonstrate difficult sustainability concepts: 4.16 (0.90)
j. Increased tools knowledge and skills (PowerPoint, spreadsheet, game website research, etc.): 4.44 (0.90)
k. Game 'creation and play' improved student critical thinking skills: 4.32 (0.95)
l. Team presentation & game demo is an effective method to communicate with the class: 4.60 (0.58)
m. The game theory videos and tools at www.nucatalyst.com were effective and useful: 3.63 (1.41)
n. Game creation, playing and demonstration is an effective method of learning: 4.12 (0.95)

Figure 5 – Assessment-Centric: Student feedback on 'ISO 14001 and Its Implementation' company presentations (average rating 4.5 out of 5; sample size = 20).
Figure 6 – Community-Centric: Feedback on chapter presentations. 'SCL is an interesting and practical learning tool and methodology' (average rating 4.85); 'SCL empowers students and helps towards higher retention' (average rating 4.92); ratings out of 5, sample size = 13.
Specific questions about the various traits of SCL (e.g. motivation, innovation, etc.) were asked in the gamification example (Figure 4); student feedback on these traits was overwhelmingly positive (all ratings above 4).
It is clear from the charts that, overall, the SCL practices were very well received by the students. They see the value in the practices, as reflected not only in very good survey ratings but also in their comments. As noted earlier, these practices are not yet widely implemented, and these are the results of our initial implementations. The results cannot be generalized to all programs, since other factors could have an impact: instructors differ in style and in facilitation and mediation skills, and how the practices are presented may differ by course as well. More research and data collection that accounts for these factors would be required. Another key item to research is whether students overall get better grades when these practices are implemented.
Online Education - A Catalyst for SCL Practices
The advent of online education in higher learning, and its rapid growth over the last 10 years, is
based primarily on the premise that students want to learn on their own time (asynchronously),
thus demonstrating their empowerment. Online learning by design puts more responsibility on
students’ shoulders for self-learning, since there is no face-to-face lecture or class time per se
(except for some synchronous chat sessions, depending on the course and the university). Some
blended (or hybrid) classes may have some face-to-face lecture time. Nevertheless, online
learning is by default a catalyst for self-learning.

[Figure 6 chart residue: Community-Centric average ratings (scale 0–6, sample size 13): ‘SCL is
an interesting and practical learning tool and methodology’ – 4.85; ‘SCL empowers students and
helps towards higher retention’ – 4.92.]
According to Palloff & Pratt [8], the following are specific attributes of online education and
online student responsibilities:
- Online learning is learner-centered and learner-focused
- Learner-focused online teaching needs a community amongst the learners
- Learners collaborate with one another
- Instructors empower learners to take charge
The practices and techniques of SCL meet all of the above attributes of online education and
have the potential to go beyond them, with technology as the enabler.
With the help of Internet technology, a huge step forward for online learning is the development
and deployment of Massive Open Online Courses (MOOCs). MOOCs have been championed by
leading US universities (e.g. MIT, Stanford, Harvard, UC Berkeley) with the goal of expanding
free online education from the best universities and professors [9, 10]. Registrations for MOOC
classes run into tens of thousands of students from around the world. This mode of education is
another stage, or platform, for SCL practices to develop worldwide, with self-motivation and
student empowerment being the key attributes. Although MOOCs initially started only with
engineering or technical classes, they are moving towards offering humanities-related subjects,
such as Sociology and Anthropology.
More Thoughts on the Impact of Technology on SCL
It is no secret that the advancement of technology has had a huge impact on higher learning, both
in the US and around the world. The impact of Internet technology was discussed earlier. It is
not uncommon to hear of social media’s potential to one day become a medium of instruction,
not to mention the arrival of ebooks and their impact on traditional textbooks and the associated
industry, a disruptive technology indeed. It can be argued that technology will make education
more open, mobile, social and analytical.
Figure 7: Ubiquitous Access to Information – Technology-Enabled SCL Practices
There is more here than meets the eye: the overwhelming development of wireless networks
seamlessly integrated with wired technology (e.g. cable modems, routers) and the adoption of
wireless devices (e.g. cell phones, tablets, home Wi-Fi, hot spots) are further changing the
logistics of teaching and learning. IT departments in higher education are at a crossroads,
deciding on the best approaches to support faculty and students with high availability and
security across a multiplicity of devices. This adoption further reinforces and empowers students
with ubiquitous access to information (see Figure 7). In the recent past, the advancement of
cloud computing has taken access to, and sharing of, information to another level, furthering this
ubiquity around the world. The X and millennial (or Y) generations, who will form the bulk of
adult higher education in the near future, have settled on these wireless technologies (and social
media) and are very comfortable with them. They are also looking for quick and timely feedback
on their class performance, and SCL practices can help provide it. This requires further changes
in how faculty work under these new norms, which will be new to many current faculty. They
need to be prepared and trained; university IT departments and Learning Management Systems
(LMS) will play a huge part in this transition.
Although MOOCs are in their early adoption stage, one can predict that they will lead the way in
making higher education more open, mobile, social and analytical, further pushing the
student-empowerment envelope with SCL practices.
Addressing Questions on SCL
As discussed earlier, although student feedback on SCL practices has been positive, more
research, data collection and analysis (with larger sample sizes) are needed before the practices
can be generalized as effective and adopted at scale in engineering education. Concerns and
questions will come from faculty themselves before they implement the practices.
Texas A&M University professors [11] addressed various faculty questions related to SCL and
documented several examples and practices that help faculty. They addressed questions such as:
‘Can the content in the syllabus be covered using SCL learning approaches? Can this approach
be used for small and large classes?’ Their research also suggested solutions and resources for
concerns such as how to respond to students who might resist this approach and how to support
better teamwork. Again, implementing many of these for both online and onsite classes would
require tailoring to each type of course offering.
It was observed during this initial implementation that students generally expect to be passive
when they come to an onsite class. It is a challenge to draw online students into highly
interactive discussions, mainly due to time limitations for synchronous activities (online classes
are by default asynchronous). The online Discussion Board activity (an assessment vehicle)
provides one method of drawing online students into active participation, with open-ended
questions and the empowerment to initiate new topics for discussion. Other team SCL practices
as discussed above can also be implemented for online classes.
The instructor’s skill in articulating and mediating the specific SCL practices also plays a big
role in students’ buy-in and practice of SCL. Further, a one-size-fits-all approach may not work
in certain programs, and the SCL practices may need to be tailored through the instructor’s
initiative and innovation.
Conclusion
Universities can do more to instill deep learning among their students in order to prepare them
for twenty-first-century jobs. This is even more important for engineering education. Practicing
active learning through different SCL activities is a very practical way to instill critical thinking,
innovation, participation, and collaboration or teamwork. Several advanced SCL practices
implemented in three Master’s-level engineering program courses were presented and discussed
in this paper. It is clear from student feedback that the SCL practices were well received and that
students felt empowered and better able to understand the course concepts.
The practices discussed here were an initial implementation, and the feedback is from a limited
number of courses in three programs; as such, the results cannot yet be generalized into a formal
template for all engineering programs. More research needs to be done on other factors, such as
the direct impact of the instructor and the mediation involved. Another area that needs further
research (for tools and techniques) is the ability to measure whether overall student performance
(e.g. individual and class-average course grades) improves with consistent SCL practices and is
higher than in courses where SCL is not practiced.
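A first step toward the grade comparison described above would be a simple two-sample test. Below is a minimal sketch using Welch's t statistic on entirely hypothetical grade data (none of these numbers come from the courses studied in this paper); a full analysis would also need effect sizes and controls for instructor, cohort, and course content:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    va, vb = variance(a), variance(b)          # sample variances (n-1 denominator)
    se = (va / len(a) + vb / len(b)) ** 0.5    # standard error of the mean difference
    return (mean(a) - mean(b)) / se

scl = [78, 85, 90, 72, 88, 81]      # hypothetical final grades, SCL section
non_scl = [70, 75, 82, 68, 79, 73]  # hypothetical final grades, traditional section

print(round(welch_t(scl, non_scl), 2))  # → 2.24
```

A positive t value here indicates a higher mean grade in the SCL section; significance would be judged against the t distribution with Welch-Satterthwaite degrees of freedom.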
Higher education will continue to experience big changes driven by fast-changing technology,
whose influence will further empower students as ubiquitous data becomes available through the
Internet and other high-tech media.
References
1. National Research Council Report to Congress, http://www.nationalacademies.org/annualreport/, accessed December 15, 2013.
2. O’Neill, G., & McMahon, T. (2005). Student-Centred Learning: What Does It Mean for Students and Lecturers?
3. Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671-684.
4. Student-Centered Teaching Methods (PDF), http://www.princeton.edu/cst/teaching-resources/methods/ModifiedPCASTTable.pdf, accessed December 17, 2013.
5. Collins, J. W., 3rd, & O’Brien, N. P. (2003). Greenwood Dictionary of Education.
6. National Research Council. (1999). How People Learn: Brain, Mind, Experience and School. Washington, DC: National Academy Press.
7. National Training Laboratories, Average Learning Retention Rates, www.ntl.org.
8. Palloff, R., & Pratt, K. (2005). The Role and Responsibility of the Learner in the Online Classroom.
9. MOOC Campus, World’s First Residential Campus for Education, http://mooccampus.org/?gclid=CLXGgN-V2bsCFUZxQgodfiAAmQ, accessed December 30, 2013.
10. Massive Open Online Course Platform, edX, https://www.edx.org/about-us, ‘We’re empowering learning in the classroom and around the globe’, accessed December 23, 2013.
11. Froyd, J., & Simpson, N. (2010). Addressing Faculty Questions about Student-Centered Learning, http://ccliconference.org/files/2010/03/Froyd_Stu-CenteredLearning.pdf.
12. T4SCL Report by the European Students’ Union and Education International (2010), Stakeholders Forum, Leuven: ‘Student-Centred Learning – An Insight Into Theory and Practice’.
13. Blumberg, P. (2008). Developing Learner-Centered Teaching: A Practical Guide for Faculty. San Francisco: Jossey-Bass.
Acknowledgement
The author wishes to express his sincere thanks and appreciation to all of his students, who were
an important part of these SCL activities and their implementation in the various courses.
Paper ID #10737
Critical Thinking, Reflective Practice, and Adaptive Expertise in Engineering
Nathan Hicks, University of Florida
Current graduate student in materials science and engineering at the University of Florida. Spent three years teaching high school math and science before returning to graduate school for an advanced degree.
Amy Elizabeth Bumbaco, University of Florida
Dr. Elliot P. Douglas, University of Florida
Elliot P. Douglas is Associate Professor of Materials Science and Engineering, Dean’s Fellow for Engineering Education, and Distinguished Teaching Scholar at the University of Florida. He conducts research in the areas of engineering problem-solving, critical thinking, active learning, and qualitative methodologies.
© American Society for Engineering Education, 2014
Critical thinking, reflective practice, and adaptive expertise
in engineering
Abstract
This synthesis paper examines the concepts of critical thinking, reflective practice, and adaptive
expertise as represented throughout academic literature. The academic community generally
considers each of these skillsets to be desirable attributes of engineering graduates and
practitioners. Despite the trend of engineering programs across the country to embrace critical
thinking, reflective practices, and adaptive expertise through mission and vision statements, the
development of these qualities through education may be falling short. Lack of explicit exposure
to and discussion of each concept may be contributing to the common inability of engineering
students and educators to effectively communicate their understanding of each. In an attempt to
contribute to the improvement of the situation, this paper aims to provide an individual
evaluation of each topic as represented in the literature, a review of current operationalization
techniques, and the current state of each topic within the field of engineering. Additional
discussion builds connections by exploring relationships among the three topics, considers issues
related to the topics within engineering, and offers possible areas of future exploration.
Introduction
Mission and vision statements for universities and colleges across the country underline the
importance of critical thinking and related skills in higher education today.1-8 Without explicitly
using the phrase, sources such as ABET EAC and the National Academy of Engineering assert
the need for engineers to be well trained in critical thinking skills.9,10 However, a number of
researchers11-14 argue that many students show little to no gain in “critical thinking, complex
reasoning, and writing skills”11 over the course of their undergraduate educations. Despite
consensus that one of the primary goals of college faculty should be to promote critical thinking,
many professors fail to express a clear understanding of critical thinking or how to convey its use
to students.15 This represents a glaring roadblock on the path to producing effective engineers.
The difficulty in expressing a coherent understanding of critical thinking likely stems from the
variability present amongst its numerous descriptions.16-30 The definitions that exist lack an
empirical basis, but a review and analysis of the various concepts may provide a foundation for
discussion. Further, two additional topics may contribute significantly to the exploration of
critical thinking: reflective practice and adaptive expertise. Critical thinking, reflective practice,
and adaptive expertise have each received considerable attention individually in the academic
literature; however, there appears to be a strong and deep connection between these
topics. Typically, each topic has been discussed in isolation or only in passing with respect to
one another, so previous instances attempting to relate and link the concepts remain limited at
best.
Ultimately, the goal of this paper is to begin a conversation about how a more thorough
understanding of critical thinking, reflective practice, and adaptive expertise in conjunction with
one another might contribute to the improved development of engineering students. To most
effectively construct these relationships and their importance to the field, the paper shall be
organized using the following structure: first, as the current literature typically considers each
topic in isolation, the standard definitions of each will be presented individually; next, because
the only way to determine the efficacy of our attempts to foster these abilities within our students
necessitates an ability to measure, the existing operationalization techniques for each concept
will be provided; subsequently, since improvements rarely occur without knowledge of the
present state of affairs, a review of each concept in the context of engineering and engineering
education will be considered; finally, all of the aforementioned content will be collectively
analyzed to explore the relationships between each topic, the potential shortcomings of
engineering education to sufficiently develop desirable skills, and how these shortcomings may
be addressed, as well as additional questions this analysis may have aroused.
Definitions
Critical thinking lacks a clear, exact, and consistent definition due primarily to its highly
philosophical nature. Some experts16-28 attempt to give broad definitions, ranging from a problem
solving methodology,19 to an information filtration process,28 to a simple ‘frame of mind.’20
Meanwhile, others define critical thinking through lists of specific skills related to reasoning,
logic, and strategies.18,29,30
While each individual’s definition and terminology differs, general trends tend to emerge.18 This
is perhaps best illustrated by Facione’s Delphi report31 in which 46 participants produced a
collaborative definition of critical thinking as “purposeful, self-regulatory judgment” for
“interpretation, analysis, evaluation, and inference,” leading to a set of six main skills with
corresponding sub-skills.
In addition to skills, several experts recognize that critical thinking involves a component of
disposition or spirit, which leads an individual to approach all phases of life with reason and
inquisitiveness.20,21,32 The generalizability of critical thinking generates a greater degree of
contention, but may be the most important consideration.18,22 Some experts believe critical
thinking cannot be developed in the absence of context23,29,33 and may vary in form by subject.23
Alternatively, others claim that while background knowledge may facilitate the process, critical
thinking may be taught in a neutral context.34,35
Reflective practices relate closely to the disposition component of critical thinking.36 Aristotle
began discussions of reflective practices, but Dewey, Heidegger, and Schön receive the most
credit for developing the theory.37-44 The most important concepts involve transforming an
unfamiliar or unexpected situation or surprise into something familiar by improvising a response
using a ‘reflective conversation’ – ‘reframing’ the situation, considering possible actions, and
‘listening’ to the situation’s ‘backtalk’ in an iterative loop. Schӧn suggested that individuals
participate in non-reflective thought (or knowing-in-action), post-mortem reflection (or
reflection-on-action), and in situ reflection (or reflection-in-action).42
The well-known foundations laid by Dewey, Heidegger, and Schön, however, apparently lack
critical analysis and ironically fail to reflect upon themselves.45,46 The call for a more reflective,
critical analysis of reflective practice produced both practical and philosophical developments.47
These developments have painted a clearer picture of levels of awareness, forms of surprise,
bases of improvisation,48 modes of reflection (based on levels of engagement,48,49 temporal
aspects,50 epistemic purpose,47,51 ‘images’,52 and needs for extensions53), and differences
between reflectivity and the deeper, more self-aware reflexivity.51,53-55
Developments have also unearthed potential limitations related to the use and study of reflective
practices: people tend to avoid error and suppress negative feelings;56 different cultures may
possess different viewpoints toward reflection;47 the standard utilitarian mentality potentially
prevents anything beyond a practical grounding of reflection;53 and finally, reflection may be
fruitless if practiced individually rather than through discourse in a group setting.47 Despite these
limitations, reflective practices are still considered extremely useful in research and professional
development,51 as long as the practitioner employs it appropriately based on experience and
background knowledge.46,51
Adaptive expertise consists of two core concepts: expertise and transfer. General or routine
experts have extensive domain-specific knowledge and experience,57 making them efficient,
accurate, and fast with specific types of problems.58 Transfer represents the ability of an
individual to apply concepts learned in one context to a different, usually similar, context.59
Adaptive experts, therefore, are like routine experts, but with the ability to transfer their skills.
While general experts possess strong procedural knowledge, adaptive experts also possess strong
conceptual knowledge.60 Thus, adaptive experts utilize their understanding to flexibly adapt
previous mental models to new situations.61
Adaptive experts are both highly efficient and highly innovative, while routine experts are
merely highly efficient.62 This difference derives from the adaptive expert’s use of multiple
perspectives and metacognition, as well as a disposition toward more rigorous learning.63
Unfortunately, time constraints within the learning environment may lead to preferential
adoption of procedural knowledge over conceptual knowledge, significantly hindering the
development of an adaptive expert.58,60
Operationalization
In order to determine the degree to which engineers utilize critical thinking, reflective practices,
and adaptive expertise and, more importantly, how engineering students develop these skills,
measurement techniques for each prove necessary. A number of methods currently exist, though
the lack of an empirical basis for what constitutes each topic imparts a degree of imprecision and
uncertainty. Nonetheless, a quick review of the present operationalization techniques follows to
provide background for subsequent discussion.
Researchers and practitioners have developed a variety of operationalization techniques for
critical thinking over the years. Several groups have developed guides, frameworks, rubrics, and
models to represent the skills of critical thinking – some based on expert opinions,64-66 others
derived through surveys of faculty perception of students.29,67,68 Studies frequently measure
student critical thinking through pre- and post-tests of relevant critical thinking skills,69-76
though others use qualitative interviews and observations to determine the students’ learning and
perceptions.70,77-79
Though some of these methods consider critical thinking specifically within the context of
engineering,65,68-79 other methods attempt to measure general critical thinking skills, which can
presumably predict future academic successes. These techniques are often based on multiple
choice tests, the most common of which are the California Critical Thinking Skills Tests
(CCTST),80 the Watson-Glaser Critical Thinking Appraisal (Watson-Glaser CTA),81 and the
Cornell Critical Thinking Test (CCTT).82 These tests have the potential to be useful for the
general college population,83 but critics claim the norm groups were insufficient.83-90 Still, the
Watson-Glaser CTA is generally the most accepted,84-87 but the ultimate value of each instrument
depends on the user’s agreement with the corresponding operationalization items.89
Reflective practices often occur internally, thereby posing a challenging obstacle to measuring
the attribute, but fortunately, verbalization facilitates reflection. The first attempt91 to measure
reflective practices consisted of a pencil-and-paper based test, analyzing a number of related
attributes, and while a few currently applied techniques are test-like questionnaires,92-94 the more
accurate methods rely on interviews49,95 and participant journals.96 The interviews and journal
analysis methods can determine an individual’s reflective maturity fairly well, but require
significant amounts of time to transcribe and code, and therefore lack large scale applicability,
which is not an issue for the questionnaire style methods.
Adaptive expertise has received less attention regarding operationalization. The How People
Learn (HPL) Star Legacy Cycle establishes a framework that includes expertise and transfer, and
therefore serves as a decent template for adaptive expertise.97 Hatano’s work formed the basis for
a variety of rubrics to measure adaptive expertise in classroom settings.98-100 Additional
techniques compare pre- and post-tests99-104 or devise equations.104,105 Qualitative
interviews with students106 and surveys with students or faculty107 can also provide an indication
of a student’s knowledge adaptation.
Implementation in Engineering
Exploration of these topics within the context of engineering often focuses either on how each
topic pertains to professional engineers or, more commonly, how each can be implemented into
the engineering curriculum and assessed. An emphasis on understanding how to optimally
educate engineers to be critical, reflective thinkers and adaptive experts should enable academia
to produce higher quality practitioners who can contribute to their fields earlier in their careers.
Unfortunately, the ever increasing load of content knowledge delivered to students, the current
delivery methods for that content, and large class sizes significantly limit the ability of students
to adequately develop these skills.36,108,109 Ultimately, the overall curriculum of engineering may
require drastic changes to engage and challenge students to inquire and solve problems rather
than to simply inform students what and how to think.69,110 Of course the use of active learning
techniques fosters these skills, but implementation is often conducted without explicit attention
to critical thinking and reflection, a point we come back to in the Discussion.111
For critical thinking, the fact that engineering faculty tend to lack a clear, explicit understanding
of the concept prevents students from acquiring a proper conceptualization of their own.109
Various researchers have attempted to combat this obstacle by developing models and tables or
lists of skills to teach and assess critical thinking.29,36,109,112,113 With or without assistance of these
models, academics have tried to incorporate critical thinking into the curriculum in the following
ways: inclusion of stand-alone critical thinking courses;112,114,115 emphasis on design based
learning and problem based learning;73,108-110,112-114,116,117 and infusion of writing assignments
into coursework.78,110,118,119 These attempts received mixed reviews from students in terms of
preference, efficacy, and importance.70,78,112,113,118,120
A number of studies have compared critical thinking ability to various demographic variables
and learning orientations. According to one study, a student’s cultural background strongly
impacts the expression of critical thinking skills.121 The same study reported that students at
predominantly black universities experienced more widespread development and that Asian
students struggled to think critically. Another study reported higher levels of critical thinking for
males than females.122 Other studies have indicated positive correlations between critical
thinking and information literacy,110 self-efficacy, and effort,122 no correlation between critical
thinking and problem based learning,73 and a negative correlation between critical thinking and
achievement/grade focused learning.120 Further studies indicate that individuals appear to
implement critical thinking differently,77 and critical thinking may or may not be generalizable
across disciplines.78
The results of these studies could lead to further understanding of how critical thinking develops
in engineering students. However, it should also be pointed out that the results of many studies
were inconsistent or insignificant,69,70,75,76 possibly due to failure of the testing apparatuses to
truly measure critical thinking.71 Until improved measures are developed and these
inconsistencies are resolved, the implications of these findings for classroom practice remain
unclear.
Reflective practices provide the bedrock for engineering ethics,123 but also serve as a defining
characteristic for success as an engineer due to the ambiguous and qualitative nature of problems
within the field.124,125 As these problems are often highly contextual and yet decidedly unique,
poorly structured and ill-defined (or ‘wicked’126), formal logic occasionally does not suffice, so
engineers must frequently employ reflection in their judgment.40,127-129 Additionally, the virtual
experimentation of the design process, a critical element in many engineering disciplines,
perfectly exemplifies Schön’s reflective conversation and other views of reflection.130-132 And
perhaps more importantly, a critical evaluation of reflection within engineering, as initiated by
van Gyn,66 may lead to positive changes and challenges to existing power structures.128
In addition to benefiting the practicing engineer, reflective practices appear to contribute to the
development of specific competencies and transferrable skills and to the transformation of values
and attitudes.108,128,133,134 However, debate exists regarding the need to teach reflection. Some
experts claim that reflection is a purely innate disposition that improves over time without
explicit instruction,135 while others advocate the need to foster the behavior to achieve optimum
results.36,95,128,132,136,137 Unfortunately, instead of practicing proper reflection, many engineering
students and instructors tend to focus on concrete events and often fail to improve adequately
due to the lack of established mechanisms to measure growth.132,138-140
Attempts to infuse reflection into the engineering curriculum fall into four primary categories:
verbally induced reflection; experiential reflection; retrospectively analytical reflection; and
academically emancipative reflection. Verbally induced reflection is the processing of technical
information into language or vice versa and includes the use of: journals;124,129,141,142
notebooks;138,143,144 papers, reports, and learning essays;124,129,139,145-147 reflective readings;147
group reflective discussion;147,148 question-answer-techniques;149 and direct mentorship.150,151
Over time, students value these techniques124 and show growth of engineering maturity and
epistemology,143 but tend to mirror the perceptions and values of their instructors.138,150
Experiential reflection refers to instances in which students reflect on situations experienced
directly, virtually, or vicariously, such as: games or simulations;144,151-153 problem based learning,
project oriented learning, case studies, and combinations thereof;133,140,154,155 design based
learning;132,156 service learning;141,157,158 internships;142 and development of programs and
software.159 Retrospectively analytical reflection seeks to determine relationships between
previously obtained knowledge and experiences, including: creating diagrammatic
representations of processes or concepts;145,148,160 incorporating computer-based, student
developed, or peer- and self-assessments;161-163 and creating group reconstructed representations
of experiences.146 Academically emancipative reflection questions the very foundation of the
current engineering education paradigm through modification of content, courses, and curricula.
Engineering content can be delivered through web-based systems that prompt and foster user
reflection;139,164 entire courses can be designed around reflection129,165 or taught in more
interactive or novel formats;166,167 and overall curricular design can be built on reflection.145,168
Each of these latter three groupings of reflection involves the first, or the others, and presents its
own challenges and benefits.
As technologies advance, fields become increasingly interdisciplinary, and globalization
continues, the need for engineers to be adaptive experts continues to grow.169 The majority of
educational programs develop routine expertise but fail to address adaptability.169-172 Other fields
have attempted to ameliorate this deficiency by integrating training, specifically in unpredictable
environments that offer opportunities to adapt by linking previous knowledge to current
situations.173,174 Most adaptive expertise studies within engineering have been in bioengineering
and related areas63,74,99-101,103,104,175,176 and have employed the previously mentioned HPL Star
Legacy technique,100,101,104,171,176 challenge based instruction,170 and design scenarios.95,99,175,177
Students matriculated in these courses generally showed growth in adaptive expertise during the
course98,99,102 and in longitudinal studies compared to students who did not take these
courses.98,99 Scholars suggested that growth in adaptive expertise relates to improvements in
innovative solutions,99 general knowledge,100 factual knowledge,101,104 and conceptual
knowledge,104 and that such improvements can serve as measures of that growth.
Discussion
The first matter of discussion pertains to the relationship among the three topics. The skills
associated with critical thinking, reflective thinking, and adaptive expertise are most certainly all
qualities an engineer should embody, but working through the connections may enable the
academic community to instill those skills more effectively. First and foremost, it seems that
critical thinking resides at the base of each of the other two concepts. The act of thinking
reflectively certainly represents a modality of critical thinking. Considering the level of
engagement classifications of reflective thinking provided by Schön and others,42,48 it might
appear that any thinking that is non-reflective is also non-critical and that all reflective thinking
is also critical. Logically, this distinction would also suggest that all critical thinking must be
reflective. However, an alternative classification system provided by Kember et al.,92,96 states
that ‘understanding’ occurs at a level above habitual action but below reflection, which may
coincide with King and Kitchener’s ‘quasi-reflective’ thinking.49 This quasi-reflective level may
be critical in nature but only partially reflective, so it remains unclear whether critical thinking
and reflective thinking can occur independently or are completely entwined.
Additionally, adaptive experts require conceptual knowledge beyond just procedural knowledge
and then must transfer that knowledge to new situations, and hence, reframe the situation. Thus,
even if the fuzzy distinction between critical thinking and reflective thinking is removed, an
adaptive expert very clearly must employ both to obtain areas of deep conceptual understanding
and to be capable of reframing that knowledge to fit a new circumstance. Because an adaptive
expert must be skilled at employing both critical and reflective thinking, the ultimate goal of
engineering academia should be to develop adaptive experts. Still, what it means to think
critically and reflectively should be addressed somewhere, preferably early, along the way.
There are, however, other common threads amongst the three topics. Almost every description of
critical thinking, reflective thinking, and adaptive expertise makes a point to mention some
aspect of disposition. This suggests that some people may just be more naturally inclined to be
critical thinkers, reflective thinkers, or adaptive experts. But does that mean that some
individuals are less capable? Perhaps an examination of incentives to engage in critical and
reflective thinking may be necessary to analyze this question, and more importantly, to find ways
to encourage students to practice those skills. Another common thread – lifelong learning –
clearly also requires this disposition.
A neurological approach to these comparisons could shed further light on the subject. A study
conducted by Alexiou, Zamenopoulos, and Gilbert analyzed fMRI images of people’s brains as
they solved design vs. non-design problems.178 When participants solved design problems, brain
activity occurred in different areas than for analogous problems that lacked the design element.
As noted previously, design is a significant component of engineering and is strongly associated
with critical and reflective thinking. It may be useful to study more fMRI images of individuals
as they solve problems that are expected to promote critical thinking, reflection, or transfer in
adaptive experts. Additional considerations could include comparing brain activity when
answering questions that are said to promote critical thinking in a variety of subject areas. This
information may help establish similarities and differences in each process.
Addressing how engineering programs should aim to produce adaptive experts who are strong
critical and reflective thinkers requires the discussion of current engineering education issues and
questions related to all of these topics, not all of which have easy or currently available answers.
Perhaps most importantly, as Mina, Omidvar, and Knott emphasize, advances in technology and
scientific knowledge have led to an ever increasing amount of content being taught to
students.108 Teaching students more content in the same amount of time fosters a preference for
procedural knowledge, which hinders the development of adaptive expertise.58,60 Additionally,
the increased content load discourages professors from incorporating classroom active learning
strategies that explicitly develop reflective or critical thinking. A student might have all the knowledge in the world, but
if they are never given an opportunity to learn it at a conceptual level and integrate it with
previous knowledge or to practice transferring that knowledge to new situations, that knowledge
is useless. Thus, is it better to develop skills to become adaptive experts and hope students learn
more content knowledge later in their careers, or better to deliver the content and hope students
become effective thinkers later?
This question raises another debate. If there are currently professional engineers who are
adaptive experts and thinking critically and reflectively, without having an undergraduate
curriculum that emphasizes those concepts, do they even need to be emphasized? If they
absolutely cannot be taught, as Edwards and Thomas suggest,135 then spending the time to do so
would certainly be wasteful. However, the heavy influence of disposition might suggest that
taking time to foster and encourage critical and reflective thinking could strengthen that
disposition. This approach should likely occur most heavily at earlier stages of education to most
effectively instill a stronger propensity and desire to think in these ways. Certainly, a better
balance between developing skills and delivering content must be struck.
In order to effectively promote and foster critical and reflective thinking, instructors need a
clearer idea of what constitutes each concept. An empirical approach to defining each term,
specifically within the field of engineering, could be beneficial. Other fields already hold a more
consistent and concise understanding of these concepts than engineering does.109 That stronger
understanding likely stems from increased and repeated exposure to the concepts.
Engineering programs, on the other hand, often claim to develop critical and reflective thinking,
but fail to explicitly address what each means and how they intend to achieve this outcome,
perpetuating the issue. Interestingly, an effort to do so might attract a more diverse
student body,157 which can only benefit the field as a whole.
It is understandable, however, why these qualities have not received significant previous
attention in the curriculum. The tools that exist to measure the development of each quality are
not universally adopted, may not be highly relevant for each discipline, and may be extremely
time intensive. Committing time to strengthening these qualities is undesirable when their growth cannot
be easily measured. Consequently, a concerted effort should be placed on the operationalization
of critical thinking, reflective thinking, and most importantly, adaptive expertise before any
significant advancements in implementation can be expected. If it is found that these skills vary
by subject, then the development of discipline-specific, efficient methods for measurement can
improve training significantly. If critical and reflective thinking can be strengthened in a neutral,
general setting, and efficient tools can measure growth, then each process should be emphasized
early in each student’s education. Either approach should lead to engineering graduates who are
stronger adaptive experts.
Conclusion
The importance of critical and reflective thinking in the field of engineering cannot be disputed.
Developing adaptive experts who excel at thinking critically and reflectively is an admirable and
important aim of engineering education. Engineers with training in critical and
reflective thinking should be more capable in the increasingly complex, global landscape and
will be more mindful of their impacts on society. While it appears that some individuals are more
prone to be critical and reflective thinkers than others, and the skills may develop on their own
with age and maturity, placing an emphasis on fostering those abilities can potentially attract
students who may have otherwise rejected the field and should increase the speed at which new
graduates can make meaningful contributions to their field.
Still, not all the concepts have been adequately established and developed. While engineering
programs can attempt to improve their curricula to develop adaptive experts, until the entire
subject has been fully illuminated through empirical research, such attempts will remain speculative.
Similarly, without adequately efficient and effective measurement techniques, it may not be
reasonable to pursue these goals as the results of any implementation would lack sufficient
evidence to properly support its claims.
Further, this introduces yet another element that should be part of the overall curriculum and may
seem overwhelming given the constant increase in available content to be taught. The task to
determine how to effectively incorporate opportunities to strengthen critical thinking, reflective
practice, and adaptive expertise may be difficult, but certainly needs to be addressed. Specific
courses could be taught with the pure intention of developing each skill, or the skills could be
sprinkled into other courses throughout the entire curriculum by making classes more hands-on
or interactive, by including journals kept with the explicit purpose of reflecting, or by using any
of the previously mentioned techniques. Even simple repeated exposure to the topics should produce
improvements.
It is also apparent that these considerations may produce even more questions, many of which
may be difficult or impossible to answer. This path may be arduous and fraught with growing
pains. However, no matter how these issues are addressed, the education of engineers can only
benefit from a thoughtful effort of faculty to engineer the education system.
Bibliography
1. Mission, Vision, and Values | Industrial and Systems Engineering | Virginia Tech. at
<http://www.ise.vt.edu/About/MissionVisionValues/MissionVisionValues.html>
2. Mission Statement | Harvard University. at <http://www.harvard.edu/faqs/mission-statement>
3. Calvin College. at <http://www.calvin.edu/academic/engineering/about/mission.html>
4. Mission Statements | Michigan Engineering. at
<http://www.engin.umich.edu/college/academics/bulletin/depts/cee/mission>
5. Mission Statement — UCLA Mechanical and Aerospace Engineering. at
<http://www.mae.ucla.edu/about>
6. Mission & Vision Statement | Department of Electrical & Computer Engineering | Daniel Felix Ritchie
School of Engineering & Computer Science | University of Denver. at
<http://www.du.edu/rsecs/departments/ece/missionvision.html>
7. Murray State University > Engineering and Physics Department Mission Statement. at
<http://www.murraystate.edu/academics/CollegesDepartments/CollegeOfScienceEngineeringandTec
hnology/CollegeOfSciencePrograms/EngineeringPhysics/EPHYmission.aspx>
8. CEE Department Mission Statements | The Charles E. Via, Jr. Department of Civil and Environmental
Engineering | Virginia Tech. at <http://www.cee.vt.edu/academics/academics_mission_statements.html>
9. National Academy of Engineering. The engineer of 2020: visions of engineering in the new century. (National
Academies Press, 2004).
10. National Academy of Engineering. Educating the engineer of 2020: adapting engineering education to the
new century. (National Academies Press, 2005).
11. Arum, R. & Roksa, J. Academically Adrift: Limited Learning on College Campuses. (University of Chicago
Press, 2011).
12. Crabbe, N. Study: College students fail to think critically. The Chalkboard. (2011). at
<http://chalkboard.blogs.gainesville.com/2011/01/study-college-students-fail-to-think-critically/>.
13. Jaschik, S. “Academically Adrift” | Inside Higher Ed. High. Ed. (2011). at
<http://www.insidehighered.com/news/2011/01/18/study_finds_large_numbers_of_college_students_don_t_l
arn_much>.
14. Leef, G. No Work, All Play, Equals a Job? – Room for Debate. New York Times. (2011). at
<http://www.nytimes.com/roomfordebate/2011/01/24/does-college-make-you-smarter/no-work-all-play-equals-a-job>.
15. Paul, R. The State of Critical Thinking Today. (2004). at
<http://www.criticalthinking.org/pages/the-state-of-critical-thinking-today/523>.
16. Mason, M. Critical thinking and learning. Educ. Philos. Theory 39, 339–349 (2007).
17. Ennis, R. H. A taxonomy of critical thinking dispositions and abilities. (1987). at
<http://psycnet.apa.org/psycinfo/1986-98688-001>
18. French, J. N. & Rhoder, C. Teaching Thinking Skills: Theory and Practice. New York Garland Pub. (Inc,
1992).
19. Facione, P. A. Critical thinking: What it is and why it counts. Millbrae CA Calif. Acad. Press Retrieved April
1, 2004 (2011).
20. Beyer, B. K. Practical strategies for the teaching of thinking. (ERIC, 1987). at
<http://eric.ed.gov/ERICWebPortal/recordDetail?accno=ED288824>
21. Norris, S. P. Synthesis of research on critical thinking. Educ. Leadersh. 42, 40–45 (1985).
22. Norris, S. P. The generalizability of critical thinking: Multiple perspectives on an educational ideal. (Teachers
College Press, 1992).
23. Willingham, D. T. Critical Thinking: Why Is It So Hard to Teach? Arts Educ. Policy Rev. 109, 21–32 (2008).
24. Yinger, R. J. Can we really teach them to think? New Dir. Teach. Learn. 1980, 11–31 (1980).
25. Paul, R. W. Critical Thinking: Fundamental to Education for a Free Society. Educ. Leadersh. 42, n1 (1984).
26. Walsh, D. & Paul, R. W. The Goal of Critical Thinking: from Educational Ideal to Educational Reality. (1986).
at <http://eric.ed.gov/ERICWebPortal/recordDetail?accno=ED295916>
27. Dressel, P. L. General Education: Explorations in Evaluation: the Final Report of the Cooperative Study of
Evaluation in General Education of the American Council on Education. (Greenwood Press, 1954).
28. Nickerson, R. S., Perkins, D. N. & Smith, E. E. The teaching of thinking. (L. Erlbaum Associates, 1985).
29. Jegede, O. J. & Noordink, P. The Role of Critical Thinking Skills in Undergraduate Study as Perceived by
University Teachers across Academic Disciplines. (1993). at
<http://eric.ed.gov/ERICWebPortal/recordDetail?accno=ED362122>
30. Stein, B., Haynes, A., Redding, M., Ennis, T. & Cecil, M. Assessing critical thinking in STEM and beyond.
Innov. E-Learn. Instr. Technol. Assess. Eng. Educ. 79-82 (2007).
31. Facione, P. A. Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment
and Instruction. Research Findings and Recommendations. (1990). at
<http://www.eric.ed.gov/ERICWebPortal/detail?accno=ED315423/>.
32. Facione, P. A. Critical thinking: What it is and why it counts. Millbrae Ca Calif. Acad. Press Retrieved Febr.
26, 2007 (1998).
33. McPeck, J. E. Critical thinking and subject specificity: A reply to Ennis. Educ. Res. 19, 10–12 (1990).
34. Ennis, R. H. Critical thinking and subject specificity: Clarification and needed research. Educ. Res. 18, 4–10
(1989).
35. Siegel, H. The generalizability of critical thinking. Educ. Philos. Theory 23, 18–30 (1991).
36. Romkey, L. The development and assessment of critical thinking for the global engineer. in Proc. 2009 Am.
Soc. Eng. Educ. Conf. (2009).
37. Bulman, C. An introduction to reflection. Reflective Pr. Nurs. Oxf. Blackwell 1–24 (2013).
38. Dewey, J. How we think. (D.C. Heath & Co., 1910).
39. Dewey, J. How we think: a restatement of the relation of reflective thinking to the educative process. (D.C.
Heath and company, 1933).
40. Dias, W. P. S. Heidegger’s Resonance with Engineering: The Primacy of Practice. Knowl.-Based Syst. 20,
382–387 (2006).
41. Heidegger, M. Being and Time. (trans. Macquarrie, J. & Robinson, E.) (Harper & Brothers, 1962).
42. Schön, D. A. The Reflective Practitioner: How Professionals Think in Action. (Basic Books, 1983).
43. Schön, D. A. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the
Professions. (Wiley, 1987).
44. Schön, D. A. The Reflective Turn: Case Studies in and on Educational Practice. (Teachers College Press,
Teachers College, Columbia University, 1991).
45. Bleakley, A. From reflective practice to holistic reflexivity. Stud. High. Educ. 24, 315–330 (1999).
46. Ecclestone, K. The Reflective Practitioner: Mantra or a Model for Emancipation? Stud. Educ. Adults 28, 146–
61 (1996).
47. Raelin, J. A. Public Reflection as the Basis of Learning. Manag. Learn. 32, 11–30 (2001).
48. Yanow, D. & Tsoukas, H. What is Reflection-In-Action? A Phenomenological Account. J. Manag. Stud. 46,
1339–1364 (2009).
49. King, P. M. & Kitchener, K. S. Reflective Judgment: Theory and Research on the Development of Epistemic
Assumptions Through Adulthood. Educ. Psychol. 39, 5–18 (2004).
50. Loughran, J. J. Developing reflective practice: Learning about teaching and learning through modelling.
(Routledge, 2002).
51. Lipp, A. Developing the reflexive dimension of reflection- a framework for debate. Int. J. Mult. Res.
Approaches 1, 18–26 (2007).
52. Ratkic, A. Images of reflection: on the meanings of the word reflection in different learning contexts. Ai Soc.
(2012).
53. Bleakley, A. From reflective practice to holistic reflexivity. Stud. High. Educ. 24, 315–330 (1999).
54. Steier, F. E. Research and reflexivity. (Sage Publications, Inc, 1991).
55. Finlay, L. & Gough, B. Reflexivity: A practical guide for researchers in health and social sciences. (Wiley.
com, 2008).
56. Argyris, C. & Schon, D. A. Theory in practice: Increasing professional effectiveness. (Jossey-Bass, 1974).
57. Hatano, G. & Oura, Y. Commentary: Reconceptualizing school learning using insight from expertise research.
Educ. Res. 32, 26–29 (2003).
58. Hatano, G. & Inagaki, K. Two courses of expertise. 乳幼児発達臨床センター年報 Res. Clin. Cent. Child
Dev. Annu. Rep. 6, 27–36 (1984).
59. Kimball, D. R. & Holyoak, K. J. Transfer and expertise. Oxf. Handb. Mem. 109–122 (2000).
60. Hatano, G. Cognitive consequences of practice in culture specific procedural skills. Q. Newsl. Lab. Comp.
Hum. Cogn. 4, 15–18 (1982).
61. Lin, X., Schwartz, D. L. & Bransford, J. Intercultural adaptive expertise: Explicit and implicit lessons from Dr.
Hatano. Hum. Dev. 50, 65–72 (2007).
62. Schwartz, D. L., Bransford, J. D. & Sears, D. Efficiency and innovation in transfer. Transf. Learn. Mod.
Multidiscip. Perspect. 1–51 (2005).
63. Fisher, F. T. & Peterson, P. L. A tool to measure adaptive expertise in biomedical engineering students. in
Proc. 2001 Am. Soc. Eng. Educ. Annu. Conf. Albuq. Nm (2001).
64. Paul, R. & Elder, L. The Miniature Guide to Critical Thinking-Concepts and Tools. 2, (Foundation Critical
Thinking, 2001).
65. Paul, R., Niewoehner, R. & Elder, L. The thinker’s guide to engineering reasoning. (Foundation Critical
Thinking, 2006).
66. Van Gyn, G., Ford, C. & Society for Teaching and Learning in Higher Education. Teaching for critical
thinking. (Society for Teaching and Learning in Higher Education = Société pour l’avancement de la
pédagogie dans l’enseignement supérieur, 2006).
67. Powers, D. E. & Enright, M. K. Analytical reasoning skills in graduate study: Perceptions of faculty in six
fields. J. High. Educ. 658–682 (1987).
68. Ralston, P. & Bays, C. Refining a Critical Thinking Rubric for Engineering. in Proc. Asee Natl. Conf. Expo.
Louisville Ky Pap. Ac 1518, (2010).
69. Donawa, A., Martin, C. & White, C. Re-engineering engineering: Teaching Students How to Think Critically.
in Proc. 2007 Am. Soc. Eng. Educ. Conf. (2007).
70. Donawa, A. Impact of critical thinking instruction on minority engineering students at a public urban higher
education institution. in Proc. 2011 Am. Soc. Eng. Educ. Conf. (2011).
71. Douglas, E. P. Critical Thinking Skills of Engineering Students: Undergraduate vs. Graduate Students. in Proc.
2006 Am. Soc. Eng. Educ. Conf. (2006).
72. Fleming, J., Garcia, N. & Morning, C. The critical thinking skills of minority engineering students: An
exploratory study. J. Negro Educ. 437–453 (1995).
73. Polanco, R., Calderon, P. & Delgado, F. Effects of a Problem-Based Learning Program on Engineering
Students. (2001). at <http://eric.ed.gov/ERICWebPortal/recordDetail?accno=ED453234>
74. Rivale, S. R., Martin, T. & Diller, K. R. Gender Differences in Adaptive Expertise: Evaluation of a challenge
based HPL biomedical engineering curriculum. in 9th Int. Conf. Eng. Educ. San Juan Pr (2006).
75. Lewis, J. E. & Bays, C. Undergraduate Engineering Students and Critical Thinking: A Preliminary Analysis. in
Am. Soc. Eng. Educ. 2011 Annu. Conf. Expo. (2011).
76. Lewis, J. E., Hieb, J. & Wheatley, D. Introducing Critical Thinking to Freshman Engineering Students. in 2010
Annu. Conf. Expo. (2010).
77. Douglas, E. P. The Practice of Critical Thinking Among Engineering Students. in Proc. 2009 Am. Soc. Eng.
Educ. Conf. (2009).
78. Melles, G. Teaching critical appraisal skills to postgraduate, English as a second Language, engineering
students. Aust. J. (2008).
79. Ceylan, T. & Lee, W. L. ‘Critical Thinking and Engineering Education’. in Sect. Conf. Am. Soc. Eng. Educ.
(2003).
80. Jacobs, S. S. Technical characteristics and some correlates of the California Critical Thinking Skills Test,
Forms A and B. Res. High. Educ. 36, 89–108 (1995).
81. Watson, G. B. & Glaser, E. M. Watson-Glaser Critical Thinking Appraisal: Manual. (Psychological
Corporation, 1980).
82. Ennis, R. H., Millman, J. & Tomko, T. N. Cornell Critical Thinking Tests Level X & Level Z: Manual.
(Midwest Publications Pacific Grove, Ca., 1985).
83. Lambert, M. Review of the California Critical Thinking Skills Test. Eighteenth Ment. Meas. Yearb. (2007).
84. Jacobs, S. S. The Equivalence of Forms A and B of the California Critical Thinking Skills Test. Meas. Eval.
Couns. Dev. 31, 211–22 (1999).
85. Berger, A. Review of Watson-Glaser critical thinking appraisal. Ninth Ment. Meas. Yearb. 1692–1693 (1985).
86. Hughes, J. N. Review of the Cornell Critical Thinking Tests. Elev. Ment. Meas. Yearb. 241–243 (1992).
87. Malcolm, K. K. Review of the Cornell Critical Thinking Tests. Elev. Ment. Meas. Yearb. 243–244 (1992).
88. Martin, W. E., Jr. Review of the California Critical Thinking Skills Test. Eighteenth Ment. Meas. Yearb.
(2007).
89. Helmstadter, G. C. Review of Watson-Glaser critical thinking appraisal. Ninth Ment. Meas. Yearb. 1693–1694
(1985).
90. McMillan, J. H. Enhancing college students’ critical thinking: A review of studies. Res. High. Educ. 26, 3–29
(1987).
91. Johnson, A. An experimental study in the analysis and measurement of reflective thinking. Speech Monogr. 10,
83–96 (1943).
92. Kember, D. et al. Development of a questionnaire to measure the level of reflective thinking. Assess. Eval.
High. Educ. 25, 381–395 (2000).
93. Dunn, L. & Musolino, G. M. Assessing reflective thinking and approaches to learning. J. Allied Health 40,
128–136 (2011).
94. Woerkom, M. van & Croon, M. Operationalising critically reflective work behaviour. Pers. Rev. 37, 317–331
(2008).
95. Adams, R. S., Turns, J. & Atman, C. J. Educating effective engineering designers: the role of reflective
practice. Des. Stud. 24, 275–294 (2003).
96. Kember, D. et al. Determining the level of reflective thinking from students’ written journals using a coding
scheme based on the work of Mezirow. Int. J. Lifelong Educ. 18, 18–30 (1999).
97. Bransford, J. D., Brown, A. L., & Cocking, R. R. How People Learn: Brain, Mind, Experience, and School:
Expanded Edition. (National Academy Press, 2000).
98. McKenna, A. F. An investigation of adaptive expertise and transfer of design process knowledge. J. Mech.
Des. 129, 730–734 (2007).
99. Walker, J. M., Cordray, D. S., King, P. H. & Brophy, S. P. Design scenarios as an assessment of adaptive
expertise. situations 1, 2 (2006).
100. Martin, T., Petrosino, A. J., Rivale, S. & Diller, K. R. The development of adaptive expertise in biotransport.
New Dir. Teach. Learn. 2006, 35–47 (2006).
101. Martin, T., Rayne, K., Kemp, N. J., Hart, J. & Diller, K. R. Teaching for adaptive expertise in biomedical
engineering ethics. Sci. Eng. Ethics 11, 257–276 (2005).
102. McKenna, A. F., Colgate, J. E., Olson, G. B. & Carr, S. H. Exploring adaptive Expertise as a target for
engineering design education. in Proc. Idetccie 1–6 (2006).
103. Rayne, K. et al. The development of adaptive expertise in biomedical engineering ethics. J. Eng. Educ. 95,
165–173 (2006).
104. Pandy, M. G., Petrosino, A. J., Austin, B. A. & Barr, R. E. Assessing adaptive expertise in undergraduate
biomechanics. J. Eng. Educ. 93, 211–222 (2004).
105. Petrosino, A. J., Svihla, V. & Kapur, M. Calculating expertise in bioengineering education. in 9th Int. Conf.
Eng. Educ. (2006).
106. Fisher, F. T. & Peterson, P. L. A tool to measure adaptive expertise in biomedical engineering students. in
Proc. 2001 Am. Soc. Eng. Educ. Annu. Conf. Albuq. Nm (2001).
107. McKenna, A. F. An investigation of adaptive expertise and transfer of design process knowledge. J. Mech.
Des. 129, 730–734 (2007).
108. Mina, M., Omidvar, I. & Knott, K. Learning to think critically to solve engineering problems: revisiting John
Dewey’s ideas for evaluating the engineering education [CDROM]. Retrieved January 5, 2004 (2003).
109. Ahern, A., O’Connor, T., McRuairc, G., McNamara, M. & O’Donnell, D. Critical thinking in the university
curriculum – the impact on engineering education. Eur. J. Eng. Educ. 37, 125–132 (2012).
110. Andrews, T. & Patil, R. Information literacy for first-year students: an embedded curriculum approach. Eur. J.
Eng. Educ. 32, 253–259 (2007).
111. Prince, M. J. & Felder, R. M. Inductive teaching and learning methods: Definitions, comparisons, and research
bases. Journal of Engineering Education 95, 123-138 (2006).
112. Cloete, A. Solving problems or problem solving: What are we teaching our students? in Proc. 2001 Am. Soc.
Eng. Educ. Annu. Conf. (2001).
113. Chang, P.-F. & Wang, D.-C. Cultivating engineering ethics and critical thinking: a systematic and cross-
cultural education approach using problem-based learning. Eur. J. Eng. Educ. 36, 377–390 (2011).
114. Lunt, B. M. & Helps, C. R. G. Problem Solving in Engineering Technology: Creativity, estimation and critical
thinking are essential skills. in 108 Th Asee Annu. Conf. 8037–8044 (2001).
115. Ceylan, T. & Lee, W. L. ‘Critical Thinking and Engineering Education’. in Sect. Conf. Am. Soc. Eng. Educ.
(2003).
116. Morrison, F. A. Drawing the connections: Engineering Science and Engineering Practice. (2004). at
<http://www.chem.mtu.edu/~fmorriso/advising/CEE_submitted_May2004_FAM_long.pdf>
117. Yadav, A., Shaver, G. M. & Meckl, P. Lessons learned: Implementing the case teaching method in a
mechanical engineering course. J. Eng. Educ. 99, 55–69 (2010).
118. High, K. & Damron, R. Are freshman engineering students able to think and write critically. in Asee Annu.
Conf. Expo. Conf. Proc. 12p (2007).
119. Catalano, G. D. Developing an Environmentally Friendly Engineering Ethic: A Course for Undergraduate
Engineering Students. J. Eng. Educ. 82, 27–33 (1993).
120. Hager, P., Sleet, R. & Kaye, M. The relation between critical thinking abilities and student study strategies.
High. Educ. Res. Dev. 13, 179–188 (1994).
121. Fleming, J., Garcia, N. & Morning, C. The critical thinking skills of minority engineering students: An
exploratory study. J. Negro Educ. 437–453 (1995).
122. Vogt, C. M. Faculty as a Critical Juncture in Student Retention and Performance in Engineering Programs. J.
Eng. Educ. 97, 27–36 (2008).
123. Michael, K. You talkin’ to me? Ieee Technol. Soc. Mag. 31, 5–U4 (2012).
124. Socha, D., Razmov, V. & Davis, E. Teaching reflective skills in an engineering course. in Proc. 2003 Asee
Conf. (2003).
125. Blockley, D. I. Engineering from reflective practice. Res. Eng. Des. 4, 13–22 (1992).
126. Rittel, H. W. & Webber, M. M. Dilemmas in a general theory of planning. Policy Sci. 4, 155–169 (1973).
127. Pierce, C. et al. Assessment of Environments for Fostering Effective Critical Thinking (EFFECTS) on a First-
Year Civil Engineering course. in Asee Annu. Conf. Expo. Ac 1341, 1–10 (2009).
128. Claris, L. & Riley, D. Situation critical: critical theory and critical thinking in engineering education. Eng.
Stud. 4, 101–120 (2012).
129. Khisty, C. J. & Khisty, L. L. Reflection in problem solving and design. J. Prof. Issues Eng. Educ. Pr. 118,
234–239 (1992).
130. Tschimmel, K. in Des. Creat. 2010 223–230 (Springer, 2011).
131. Daly, S. R., Adams, R. S. & Bodner, G. M. What Does it Mean to Design? A Qualitative Investigation of
Design Professionals’ Experiences. J. Eng. Educ. 101, 187–219 (2012).
132. Choulier, D., Picard, F. & Weite, P.-A. Reflective practice in a pluri-disciplinary innovative design course.
Eur. J. Eng. Educ. 32, 115–124 (2007).
133. Berndt, A. & Paterson, C. Global Engineering, Humanitarian Case Studies, and Pedagogies of Transformation.
in Transform. Eng. Educ. Creat. Interdiscip. Ski. Complex Glob. Environ. 2010 Ieee 1–19 (2010).
134. Chadha, D. A curriculum model for transferable skills development. Eng. Educ. 1, 19–24 (2006).
135. Edwards, G. & Thomas, G. Can reflective practice be taught? Educ. Stud. 36, 403–414 (2010).
136. Williams, J. M. The engineering portfolio: Communication, reflection, and student learning outcomes
assessment. Int. J. Eng. Educ. 18, 199–207 (2002).
137. Fordyce, D. The Development of Systems Thinking in Engineering Education: an interdisciplinary model. Eur.
J. Eng. Educ. 13, 283–292 (1988).
138. Berland, L., McKenna, W. & Peacock, S. B. Understanding Students’ Perceptions on the Utility of Engineering
Notebooks. Adv. Eng. Educ. 3, 1–21 (2012).
139. Turns, J. Learning essays and the reflective learner: supporting assessment in engineering design education. in
Front. Educ. Conf. 1997 27th Annu. Conf. Learn. Era Change Proc. 2, 681–688 (1997).
140. Brodie, L. Reflective writing by distance education students in an engineering problem based learning course.
Australas. J. Eng. Educ. 13, 31–40 (2007).
141. Chan, C. K. Y. Exploring an experiential learning project through Kolb’s Learning Theory using a qualitative
research method. Eur. J. Eng. Educ. 37, 405–415 (2012).
142. Doel, S. Fostering student reflection during engineering internships. Asia-Pac. J. Coop. Educ. 10, 163–177
(2009).
143. Svarovsky, G. N. & Shaffer, D. W. Design meetings and design notebooks as tools for reflection in the
engineering design course. in Front. Educ. Conf. 36th Annu. 7–12 (2006).
144. Svarovsky, G. N. Exploring Complex Engineering Learning Over Time with Epistemic Network Analysis. J.
Pre-Coll. Eng. Educ. Res. J-Peer 1, 4 (2011).
145. Davis, R. E. Intentional, Integrated Learning for Engineering Students. in Transform. Eng. Educ. Creat.
Interdiscip. Ski. Complex Glob. Environ. 2010 Ieee 1–14 (2010).
146. Krogstie, B. R. & Divitini, M. Shared Timeline and Individual Experience: Supporting Retrospective
Reflection in Student Software Engineering Teams. in 85–92 (IEEE, 2009). doi:10.1109/CSEET.2009.20
147. Ratkić, A. Dialogue seminars as a tool in post graduate education. Ai Soc. 23, 99–109 (2007).
148. Garcia-Perez, A. & Ayres, R. Modelling research: a collaborative approach to helping PhD students develop
higher-level research skills. Eur. J. Eng. Educ. 37, 297–306 (2012).
149. Winkelmann, C. & Hacker, W. in Des. Comput. Cogn. 603–618 (Springer, 2006).
150. Nash, P. & Shaffer, D. W. Player-mentor interactions in an epistemic game: A preliminary analysis. in Proc.
8th Int. Conf. Int. Conf. Learn. Sci.-Vol. 3 245–252 (2008).
151. Nulty, A. & Shaffer, D. W. Digital Zoo: The effects of mentoring on young engineers. in Proc. 8th Int. Conf.
Int. Conf. Learn. Sci.-Vol. 3 245–252 (2008).
152. Prudhomme, G., Boujut, J. F. & Brissaud, D. Toward reflective practice in engineering design education. Int. J.
Eng. Educ. 19, 328–337 (2003).
153. Hatfield, D. & Shaffer, D. W. Reflection in professional play. in Proc. 8th Int. Conf. Int. Conf. Learn. Sci.-Vol.
3 245–252 (2008).
154. De Graaff, E. & Kolmos, A. Characteristics of problem-based learning. Int. J. Eng. Educ. 19, 657–662 (2003).
155. Vos, H. & de Graaff, E. Developing metacognition: a basis for active learning. Eur. J. Eng. Educ. 29, 543–548
(2004).
156. Gómez Puente, S. M., van Eijck, M. & Jochems, W. Towards characterising design-based learning in
engineering education: a review of the literature. Eur. J. Eng. Educ. 36, 137–149 (2011).
157. Ropers-Huilman, B., Carwile, L. & Lima, M. Service-learning in engineering: a valuable pedagogy for meeting
learning objectives. Eur. J. Eng. Educ. 30, 155–165 (2005).
158. Slivovsky, L. A., DeRego Jr, F. R., Jamieson, L. H. & Oakes, W. C. Developing the reflection component in
the EPICS model of engineering service learning. in Front. Educ. 2003 FIE 2003 33rd Annu. 3, S1B–14
(2003).
159. Edwards, S. H. Using software testing to move students from trial-and-error to reflection-in-action. in ACM
SIGCSE Bull. 36, 26–30 (2004).
160. Sonntag, M. Reflexive pedagogy in the apprenticeship in design. Eur. J. Eng. Educ. 31, 109–117 (2006).
161. Heap, N. W., Kear, K. L. & Bissell, C. C. An overview of ICT-based assessment for engineering education.
Eur. J. Eng. Educ. 29, 241–250 (2004).
162. Van Hattum-Janssen, N., Pacheco, J. A. & Vasconcelos, R. M. The accuracy of student grading in first-year
engineering courses. Eur. J. Eng. Educ. 29, 291–298 (2004).
163. Willey, K. & Gardner, A. Investigating the capacity of self and peer assessment activities to engage students
and promote learning. Eur. J. Eng. Educ. 35, 429–443 (2010).
164. Chiu, J. L. & Linn, M. C. Knowledge integration and WISE engineering. J. Pre-Coll. Eng. Educ. Res. (J-PEER)
1, 2 (2011).
165. Tomayko, J. E. & Hazzan, O. A. Human Aspects of Software Engineering. (Charles River Media, 2004).
166. Wiebe, E. N., Branoff, T. J. & Shreve, M. A. Online Resource Utilization in a Hybrid Course in Engineering
Graphics. Adv. Eng. Educ. 2 (2011).
167. Kavanagh, L. & Cokley, J. A learning collaboration between Engineering and Journalism undergraduate
students prompts interdisciplinary behavior. Adv. Eng. Educ. 2, 1–22 (2011).
168. Green, G. Redefining engineering education: the reflective practice of product design engineering. Int. J. Eng.
Educ. 17, 3–9 (2001).
169. Bransford, J. Preparing people for rapidly changing environments. J. Eng. Educ. 96, 1–3 (2007).
170. Brophy, S., Hodge, L. & Bransford, J. Work in progress - adaptive expertise: beyond apply academic
knowledge. in Front. Educ. 2004 FIE 2004 34th Annu. S1B/28 – S1B/30 Vol. 3 (2004).
171. Rayne, K. et al. The development of adaptive expertise in biomedical engineering ethics. J. Eng. Educ. 95,
165–173 (2006).
172. Mylopoulos, M. & Regehr, G. How student models of expertise and innovation impact the development of
adaptive expertise in medicine. Med. Educ. 43, 127–132 (2009).
173. Kozlowski, S. W. Training and developing adaptive teams: Theory, principles, and research. 115–154
(American Psychological Association, 1998).
174. Smith, E. M., Ford, J. K. & Kozlowski, S. W. Building adaptive expertise: Implications for training design
strategies. (American Psychological Association, 1997).
175. Svihla, V. How differences in interactions affect learning and development of design expertise in the context of
biomedical engineering design. (2009). at <http://repositories.lib.utexas.edu/handle/2152/17411>
176. Martin, T., Rivale, S. D. & Diller, K. R. Comparison of student learning in challenge-based and traditional
instruction in biomedical engineering. Ann. Biomed. Eng. 35, 1312–1323 (2007).
177. Neeley Jr, W. L. Adaptive design expertise: a theory of design thinking and innovation. (2007). at
<http://www-cdr.stanford.edu/CDR/Dissertations/Leifer/Neeley_Diss_20070619_SUBMITTED.pdf>.
178. Alexiou, K., Zamenopoulos, T. & Gilbert, S. in Des. Comput. Cogn. 489–504 (Springer, 2011).