TRANSCRIPT
Teaching Conceptual Modelling: How Can I Improve?
Daria Bogdanova, Estefanía Serral, Monique Snoeck
LIRIS, FEB@KULeuven
Tutorial plan
1. Conceptual modelling tasks and learning outcomes, by Daria Bogdanova (PhD researcher)
2. Feedback dimensions & practical experiences in CM, by Dr. Estefanía Serral (postdoc researcher)
3. Conclusion, by Prof. Dr. Monique Snoeck
CM Tasks and Learning Outcomes
Learning outcomes in CM
Each learning outcome can be categorized by the following parameters:
• Knowledge level
• Cognitive level
• Content area (which can be defined by means of related keywords)
Example:
LO1: The student should be able to interpret a given UML diagram (translate from a model to a text).
Knowledge level: Conceptual
Cognitive process: Understanding
Content area: Models
Content areas and scaffolding levels
Adapted from Bogdanova & Snoeck, 2017
[Figure: concepts per content area and scaffolding level]
• Class level: Object, Class, Attribute
• Relationships level (generalization): Inheritance
• Relationships level (association): Binary association, Recursive association, Aggregation, N-ary association, Association class
• Model level (simple): Simple model
• Model level (complex): Complex model, Pattern
• General knowledge
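The class-level and relationship-level concepts in the scaffolding above can be illustrated with plain code. A minimal Python sketch, where all class and attribute names (Vehicle, Car, Wheel, Driver) are invented for illustration:

```python
# Mapping scaffolding-level concepts onto plain Python classes.
# All names below are invented for illustration only.

class Wheel:
    """Class level: a class without attributes."""
    pass

class Driver:
    def __init__(self, name):
        self.name = name            # class level: an attribute

class Vehicle:
    def __init__(self, plate):
        self.plate = plate          # class level: an attribute

class Car(Vehicle):                 # generalization: Car inherits from Vehicle
    def __init__(self, plate):
        super().__init__(plate)
        self.wheels = []            # aggregation: a car is composed of wheels
        self.driver = None          # binary association: car-driver

car = Car("ABC-123")
car.wheels.append(Wheel())
car.driver = Driver("Ada")
```

The same scaffolding order applies: students first master single classes and attributes, then generalization, then the richer association kinds, before composing whole models.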
Revised Bloom’s Taxonomy: Cognitive dimension (adapted from Krathwohl, 2002)
• Remember: Define, Recall, Identify
• Understand: Discuss, Explain, Match
• Apply: Use, Practice, Execute
• Analyze: Examine, Analyze, Compare
• Evaluate: Check, Verify, Critique
• Create: Design, Build, Improve
Revised Bloom’s Taxonomy: Knowledge dimension (adapted from Krathwohl, 2002)
[Figure: the same pyramid from Remember to Create, now annotated with the knowledge dimension]
Knowledge levels: Factual (terminology, notation and basic elements)
• Sources of information on modelling (textbooks, standards and thematic websites)
• Modelling notation(s)
• General modelling conventions: how to name classes, attributes, associations, etc.
• Terms and their definitions according to the literature
Knowledge levels: Conceptual (interrelationships between elements, principles and theories)
• Conceptual differences between various terms
• Classification of information
• Fundamental principles of conceptual modelling
• Commonly used modelling patterns
Knowledge levels: Procedural (how to do it: subject-specific techniques and skills)
• Step-by-step guidelines or algorithms for performing a modelling task at any stage
• Criteria for the use of a method or technique when solving a particular kind of modelling task
Knowledge levels: Metacognitive (awareness of one's own knowledge and learning/thinking strategies)
• Knowledge of cognition as such
• Strategic knowledge for learning the subject
• Self-knowledge
Cognitive levels: Remember (Define, Recall, Identify)
Examples:
• Matching terms with definitions / recognizing terms
• Giving definitions of terms
• Drawing the graphical notation element corresponding to a given term
• Duplicating a model, pattern, or notation element
• Listing the types of certain concepts/terms (e.g. all types of associations)
Cognitive levels: Understand (Discuss, Explain, Match)
Examples:
• Exemplification: giving examples representing learned modelling terms
• Linking concepts with corresponding modelling notation elements
• Explaining a given modelling concept in one's own words
• Translation tasks: from notation to text
• Summarizing tasks: e.g. summarizing the requirements document or a case in one's own words
• Discussion tasks
Cognitive levels: Apply (Use, Practice, Execute)
Examples:
• Implementing a given pattern to solve a modelling task
• Solving a task using given precise guidelines
Cognitive levels: Analyse (Examine, Analyze, Compare)
Examples:
• Ordering tasks
• Determining the completeness or incompleteness of a statement/element/attribute
• Distinguishing relevant information from irrelevant
• Explaining a modelling choice (e.g. modelling something as a class or as an attribute)
• Determining which modelling pattern is used in a given model
Cognitive levels: Evaluate (Check, Verify, Critique)
Examples:
• Checking correctness / finding mistakes in the notation used
• Identifying violations of conventions or rules
• Critiquing the choice of an example or definition
Cognitive levels: Create (Design, Build, Improve)
Examples:
• Developing a solution for a given simple modelling task
• Generating a class/association/group of classes for given requirements
• Designing a model element
The taxonomy table crosses the knowledge dimension (Factual, Conceptual, Procedural, Metacognitive) with the cognitive process dimension (Remember, Understand, Apply, Analyse, Evaluate, Create).
Where would you position the following learning activity:
"Find a syntactic mistake in the use of UML notation in a given model"
Where would you position the following learning activity:
“Make a list of candidate classes from the given text according to the class elicitation guidelines”
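The class elicitation guidelines themselves are not reproduced in these slides, but a common first step is to seed candidate classes from the nouns in the requirements text. A deliberately naive Python sketch of that idea (the stopword list and function name are assumptions, not part of the guidelines):

```python
import re
from collections import Counter

# Words unlikely to be class candidates; this small list is an assumption.
STOPWORDS = {"a", "an", "the", "of", "to", "can", "has", "have", "each",
             "that", "by", "their", "is", "are", "and", "or", "in", "across"}

def candidate_classes(text):
    """Naively rank the remaining words by frequency as class candidates."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common()]

requirements = ("An oil company possesses a number of gas stations spread "
                "across the country. Each gas station has a number of pumps "
                "that each contains a particular type of gasoline.")
print(candidate_classes(requirements)[:5])
```

A real elicitation pass would of course also merge synonyms, drop non-persistent concepts, and keep multi-word terms such as "gas station" together; the point here is only that the task sits at the Apply level: executing given guidelines on a given text.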
Current state: what do we teach? (Bogdanova & Snoeck, 2017)
"Remember" and "Evaluate" are heavily underrepresented. The metacognitive knowledge level is not represented at all, and the factual level is almost absent.
Current state: what do we teach? (Bogdanova & Snoeck, 2017)
The majority of tasks address the model and relationships levels; MOOCs favour simple models, while books and exams favour complex ones.
Current state: what do we teach?
[Chart: tasks cluster in the Understand and Create columns of the Conceptual row; Remember, Evaluate, Analyze and Apply, and the Factual, Procedural and Metacognitive levels, are barely covered]
Result: uneven scaffolding and a lack of a constructive approach.
Feedback dimensions & Examples in CM
Agenda
• Feedback Dimensions
• Feedback Examples for CM
Feedback Dimensions
Feedback Dimensions: Overview
Design:
• Purpose: summative, formative, motivational, informative, corrective, suggestive, elaborative
• Level: task, process, self-regulation, self
• Nature: positive, negative
• Domain
Delivery:
• Timing: beginning of a task, immediate, delayed, end of a task
• Format
Context:
• Recipient information
• Environmental and timing conditions
• Learning task: simple, complex, …
• Educational setting: individual, group
• Sender: professor, peer, tool
Usage & Impact
Design: Purpose
• Formative: guidelines to improve while the task is still being performed
• Summative: evaluates the learner and presents up-to-date success or failure information
Design: Purpose
• Informative: descriptive, comparative, or evaluative; gives an elaborate picture of where a learner stands
• Motivational: positive or negative reinforcement
• Verification: an assessment of the answer (and the correct response)
• Suggestive: advice on how to proceed; invites the learner to explore an idea
• Elaborative: addresses the answer and errors, and provides explanations
Design: Level
• Task Level: how well tasks are understood or performed
• Process Level: which process needs to be executed to understand or perform tasks
• Self-regulation level: self-monitoring, directing, and regulating of actions
• Self level: personal evaluations and affect (usually positive) about the learner
(Hattie and Timperley, 2007)
Design: Nature
Positive feedback:
• reinforces existing behaviour
• useful when the student is uncertain but has nevertheless given a good solution
• can be a simple affirmative message, or a more elaborate one that provides additional benefits
• may enhance motivation, but in excess may cause students to feel bored
Negative feedback:
• provides information for improvement
• encourages learners to correct a specific behaviour or reconsider their strategies
• should also contain guidance or corrective feedback to reach a good solution
• may be accepted as a challenge, but in excess may cause anxiety and frustration
Design: Domain Knowledge
• Learning objects: tasks, assessment exercises, etc.
• Goals
• Evaluation criteria
• Competences/skills
• Methodologies and processes
• …
Delivery: Timing
• Immediate, just-in-time, or in-action: just when the problem arises
• Delayed: a while after a task has been performed or a mistake has been made
• Anytime: available at any time during task execution or a whole course, and provided on demand
• At the end of a learning task/process (also called about-action): enables learners to monitor their progress and adapt accordingly in subsequent tasks
• Related to the stage in the learning process: e.g., early stages (novice learners) vs. late stages (advanced learners)
Delivery: Format
• Written
• Video
• Audio
• Chat dialogue
• Blogs
• Shown in an external window, embedded in the learning environment interface, etc.
• …
Context: Recipient/s
• Essential to consider for personalization
• Directed to an individual or a group
Context: Recipient/s
• Previous and current interactions with the system (aka tasks, activities, or actions)
• Previous assessed material: scores and answers
• Affective state: e.g., engaged, confused, frustrated, distracted, bored…
• Learners’ characteristics: self-esteem/confidence, commitment to work, culture, prior knowledge, learning style, etc.
• Location
• Learners’ baseline characteristics
Usage & Impact
Usage
• Common feedback usages by learners:
• check if their learning is on track
• identify gaps in their knowledge or understanding
• address difficult concepts or problems
• adjust and focus the material to study
• stay motivated or increase their confidence
• reflect on what they have learned
Final goal: knowing how to improve their learning!
Impact/Effect
• Important to understand how learners use the received feedback to maximize its impact
• Retro-feedback
Feedback Examples for Conceptual Modelling
Examples of Artefacts for Feedback
• Online tests (automated!)
• small exercises per topic
• individual feedback
• Homework/assignments
• small cases per topic
• individual feedback + group feedback
• Self-study exercises / lab exercises
• exercises archived from previous homework + lab exercises
• model solutions + commented student solutions
• individual feedback + group feedback
• Group work
• larger, integrated case
• peer, group/individual feedback
• Automated feedback in a simulation environment
• individual, immediate/on-demand
Online tests: feedback dimensions
• Timing: immediate / on demand
• Presentation: online, textual with or without graphics
• Level: task level
• Purpose: summative / verification / elaborative
• Domain knowledge: skills
• Nature: positive / negative / neutral
• Recipient: individual
• Sender: automated
• Usage/effect: check if learning is on track
Online Test Feedback Example
[Screenshots of online test feedback, positioned on the cognitive levels of the Revised Bloom’s Taxonomy]
Homework or Assignments with Solutions: feedback dimensions
• Timing: delayed
• Presentation: on paper + in class
• Level: task level
• Purpose: verification / elaborative
• Domain knowledge: skills / goals / evaluation criteria
• Nature: positive / negative
• Recipient: individual / group
• Sender: professor
• Usage/effect: check if learning is on track; address difficult concepts or problems
Homework Individual Feedback Example
Homework Group Feedback Example: student solution
An oil company possesses a number of gas stations spread across the country. Each gas station has a number of pumps that each contains a particular type of gasoline. Customers can pay their refuel turn by using their fuel card…
If Refuel Turn is made existence-dependent on Gas Station, how do we know at which pump the refuel took place?
Homework Group Feedback Example: Model solution
[Model: CARD HOLDER refuels at PUMP (0..* on both ends); PUMP is part of STATION (each pump in exactly 1 station); CARD HOLDER receives INVOICE (each invoice sent to exactly 1 card holder)]
Annotations on the model:
• Optional, because when the customer is registered, (s)he exists for the company but has not yet taken a refuel turn.
• Obviously a pump can be in at most one station at a time.
• But does a pump have to belong to a gas station in order to exist? Can it be moved from one station to another? (open questions to ask the user)
• Refuel turns taken at different gas stations can be on the same invoice.
• Optional, because a pump can exist without a customer having taken a refuel turn at it yet. Hopefully the pump will get used some day; but if it is in a bad location, for example, it may never be used.
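The associations in the model solution can be mirrored in code. A minimal Python sketch, where class and attribute names are adapted for illustration and the open design questions above are left unresolved:

```python
# A plain-Python rendering of the model solution's associations.
# Multiplicities are only loosely enforced; attribute names are invented.

class Station:
    def __init__(self, name):
        self.name = name
        self.pumps = []            # a station has 0..* pumps

class Pump:
    def __init__(self, station):
        self.station = station     # each pump is part of exactly 1 station
        station.pumps.append(self)
        self.refuel_turns = []     # optional: a pump may never be used

class CardHolder:
    def __init__(self, name):
        self.name = name
        self.refuel_turns = []     # optional: a registered customer may not
                                   # have taken a refuel turn yet
        self.invoices = []         # a card holder receives 0..* invoices

class Invoice:
    def __init__(self, card_holder):
        self.card_holder = card_holder   # each invoice sent to 1 card holder
        card_holder.invoices.append(self)
        self.refuel_turns = []     # turns at different stations can share one

class RefuelTurn:
    def __init__(self, card_holder, pump, invoice=None):
        self.card_holder = card_holder   # who refuels at which pump
        self.pump = pump
        card_holder.refuel_turns.append(self)
        pump.refuel_turns.append(self)
        if invoice is not None:
            invoice.refuel_turns.append(self)

# Wire up a small instance of the model
station = Station("Station A")
pump = Pump(station)
holder = CardHolder("Ada")
invoice = Invoice(holder)
turn = RefuelTurn(holder, pump, invoice)
```

Note that giving RefuelTurn a direct reference to the pump also addresses the question raised earlier: when the refuel turn depends on the pump rather than on the gas station, the station remains reachable via pump.station.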
Homework Group Feedback Example: Model solution
[The same model, positioned on the cognitive levels of the Revised Bloom’s Taxonomy]
Group Work Peer Review Feedback: feedback dimensions
• Timing: delayed
• Presentation: on paper
• Level: task level
• Purpose: informative
• Domain knowledge: skills + the specific case
• Nature: positive / negative
• Recipient: individual / group
• Sender: peer
• Usage/effect: identify gaps in knowledge or understanding
Group Work Peer Review Feedback: Example
[Annotated excerpt: “Order” is not required to create an instance of an invoice; the example is positioned on the cognitive levels of the Revised Bloom’s Taxonomy]
Automated Feedback from LE: feedback dimensions
• Timing: immediate / on demand
• Presentation: online, textual with or without graphics
• Level: task level
• Purpose: verification / elaborative
• Domain knowledge: skills
• Nature: neutral
• Recipient: individual
• Sender: automated
• Usage/effect: check if learning is on track; identify gaps in understanding; keep motivated
Automated Feedback Example
Requirement text: “To buy wholesale products customers need to place an order. However, ordering is not required for buying a retail product.”
Feedback note: “Order” is not required to create an instance of an invoice.
[Screenshots with numbered callouts; the example is positioned on the cognitive levels of the Revised Bloom’s Taxonomy]
CONCLUSIONS
General conclusion
• Assessment and feedback are underdeveloped:
• assessment is not complete
• feedback is very limited
Recommendations
• Consider all dimensions of feedback & vary!
• Automate the simple forms of feedback
• Use a diversity of technology:
• PPT, blogs, ... for elaborative feedback
• Online tests
• Dedicated didactic software
• Try to log & analyse student behaviour:
• Switch tracking on in Toledo / dedicated software
• Objectives: understand behaviour & improve feedback (retro-feedback); collect real-time information for personalisation of feedback
• Consider creating didactic software
References
• Krathwohl, D. R. "A revision of Bloom's taxonomy: An overview." Theory into Practice 41.4 (2002), pp. 212-218.
• Bogdanova, D., Snoeck, M. "Domain Modelling in Bloom: Deciphering How We Teach It." The Practice of Enterprise Modelling (2017), pp. 3-17.
• Serral, E., Snoeck, M., Elen, J. "A Framework for Conceptualizing the Domain of Automated Feedback for Learners." Review of Educational Research. Under review.
• Serral, E., Snoeck, M. "Conceptual Framework for Feedback Automation in SLEs." 20th International Conference on Knowledge-Based and Intelligent Information & Engineering Systems (KES-SEEL2016). Tenerife, Spain, 2016.
• Serral, E., De Weerdt, J., Sedrakyan, G., Snoeck, M. "Automating Immediate and Personalized Feedback: Taking Conceptual Modelling Education to a Next Level." International Conference on Research Challenges in Information Science (RCIS'16). Grenoble, France, 2016.