
Increasing Engineering Knowledge Through a Photo-Journal Project

Article Review 5 – Lori Sowa

Purdue University is one of the few schools in the country with a PhD-granting School of Engineering Education, with faculty research focusing on graduate, undergraduate, and P-12 engineering education.   The Institute for P-12 Engineering Research and Learning (INSPIRE) was established in 2006 at Purdue, in part to promote engineering learning in the elementary classroom.   Each summer, INSPIRE hosts engineering academies for local elementary teachers.

Duncan et al. (2011) used Bloom’s revised taxonomy as a theoretical framework to evaluate teachers’ ability to recognize and understand engineering in the world around them through a photo-journal project. Cameras were sent to elementary teacher participants prior to the start of the summer INSPIRE academy with instructions to take ten photographs related to engineering. For each photograph, teachers were instructed to record the date, time, and location, and to explain how each scene related to engineering. Then, after the first day of the five-day workshop, the teachers were given further instructions to take ten additional photographs and to record the same information. The additional ten photographs and journal entries were to be completed by the end of the workshop.

The photographs and associated journal entries were collected and categorized as either pre- or post-workshop. Each journal entry was coded using Bloom’s revised taxonomy to determine the cognitive level of the entry. Since Bloom’s taxonomy is hierarchical, the levels were given a numeric indicator, with 1 being “Remember” and 6 being “Create”. Due to the nature of the exercise, none of the journal entries showed evidence of “Create”, so the highest level achieved was “Evaluate”. The authors used statistical methods to determine whether the teachers demonstrated an increase in cognitive level post-workshop, and they found an increase equivalent to one cognitive level.
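To make the ordinal coding concrete, here is a minimal Python sketch with entirely hypothetical pre/post Bloom’s-level codes (the teachers, codes, and summary below are invented for illustration; the study’s actual data and statistical tests are in Duncan et al., 2011):

```python
# Bloom's revised taxonomy levels, coded 1 (lowest) through 6 (highest).
BLOOM_LEVELS = {1: "Remember", 2: "Understand", 3: "Apply",
                4: "Analyze", 5: "Evaluate", 6: "Create"}

# Hypothetical (pre-workshop, post-workshop) codes, one pair per teacher.
paired_codes = [(2, 3), (1, 3), (3, 3), (2, 4), (3, 4), (2, 2), (1, 2)]

diffs = [post - pre for pre, post in paired_codes]
mean_gain = sum(diffs) / len(diffs)

# A crude sign-test-style summary: how many teachers moved up vs. down.
ups = sum(d > 0 for d in diffs)
downs = sum(d < 0 for d in diffs)

print(f"mean gain: {mean_gain:.2f} levels (up: {ups}, down: {downs})")
# → mean gain: 1.00 levels (up: 5, down: 0)
```

A paired, nonparametric test (such as a Wilcoxon signed-rank test) is the usual choice for ordinal data like this, which is part of why the authors’ Likert-style treatment of the codes was noted as a novel approach.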

This study was rigorous and well conducted. Extensive measures were taken to establish inter-rater reliability, including preliminary analysis of journal entries that would not be included in the study, refinement of methods, and then analysis of the study group (K-fourth grade teachers). Limitations noted by the authors included: a small sample size (n=40) from one geographic area; a timeframe for the photographs that was not actually pre- and post-workshop, but rather pre- and during-workshop; and the fact that assigning ordinal values to Bloom’s taxonomy for analysis typical of Likert-type scales was a novel, untested approach. In addition, the retention of this knowledge of engineering, and whether the understanding transfers to students in the teachers’ classrooms, has not been studied.

The underlying assumption in the hierarchical nature of Bloom’s taxonomy is that a higher level equals better learning, perhaps because learning through this model is considered cumulative. I find the assignment of numerical values to artifacts coded for each cognitive level to be a useful technique that could be applied to many situations to facilitate quantitative analysis of the data. I’m curious to dig into this technique and see if and how other researchers are using it, since the authors state it is a novel approach.

I like the integration of journaling as a learning activity because it brings the important writing aspect into a STEM activity. The learning objective targeted by this activity was to “convey a broad perspective of the nature and practice of engineering”. This type of activity could easily be facilitated (and perhaps enhanced) in an online environment through posting to a blog. Looking forward to my Engineering for Educators course, this activity could be adapted to help achieve my third learning objective (to understand the engineer’s role in society, and inspire a desire in students to use engineering to solve problems that matter to people).

This article was valuable to me for a number of reasons, not the least of which is a reminder that I need to brush up and expand on my readily-available statistics knowledge.   I’ve taken a number of statistics courses throughout the years, but none recently and it shows as I try to follow along with the statistical methods used in this and other recently reviewed papers.   Google was my link to definitions and explanations.

This article provides insight into the effectiveness of a summer academy at increasing teachers’ ability to recognize and understand engineering in the world around us – an important step in preparing teachers who will teach, or even just discuss, engineering in their classrooms. However, it is important to bear in mind that the teachers in this study were provided a substantial amount of professional development centered on engineering (Monday through Friday, 7:30 AM – 5:00 PM, with additional homework including reading and assignments). In Secondary Level Engineering Professional Development: Content, Pedagogy, and Challenges, Daugherty and Custer (2012) describe a number of barriers to successful implementation of engineering in K-12 classrooms, including: teachers’ lack of the mathematical skills needed to implement engineering activities, lack of background in engineering to maintain fidelity of the curriculum, the amount of time required for lesson planning, student mathematical background and motivation, resources, and institutional barriers. Certainly these barriers are not insurmountable in all cases, but they must be considered, and they point to the multifaceted and nontrivial nature of this issue.

Daugherty, J. L., & Custer, R. L. (2012). Secondary level engineering professional development: Content, pedagogy, and challenges. International Journal of Technology and Design Education, 22(1), 51-64.
Duncan, D., Diefes-Dux, H., & Gentry, M. (2011). Professional development through engineering academies: An examination of elementary teachers’ recognition and understanding of engineering. Journal of Engineering Education, 100(3), 520-539.

Learning Objectives for an Engineering for Educators Course

Weekly Writing 6 – Lori Sowa

I am finding the Fink text (2013) to be an excellent guide for developing the Engineering for Educators course.   Many times I have lamented the fact that I simply don’t have enough time to holistically plan my courses before I step into the classroom, and to include innovative and engaging activities during each class period.   In reality, I end up teaching a course much as I had learned it the first time, and then each year making adjustments, adding projects, changing assessments – improving the course incrementally over time.   As I gain years of experience and confidence with new teaching methods, the changes occur more quickly.   The opportunity (challenge?) of looking at this course in-depth prior to teaching it for the first time is refreshing.

The value of the taxonomies of learning, for me, is the validation of the importance of the “soft skills” (which really are the most important skills) and the move beyond traditional content. Learning how to learn, appreciating the human element, and the importance of that human element were not explicitly discussed in any of my courses throughout college. The engineering curriculum is extremely content-heavy, and not much time is spent on anything else. Successful students figured out how to “get through it” on their own (perhaps most often through role models and expectations at home); the others did not.

Reading through the three taxonomies, I most easily identified with Fink’s taxonomy for significant learning. I appreciate the fact that it is not hierarchical, but instead inter-related. While there is certainly an order to learning complex content material, and solid foundational knowledge is required to build up to sophisticated engineering design, the other aspects – the human dimension and caring, for example – can provide the motivation to put in the hard work required to understand and master the hard math and science.

I don’t expect my students to become expert engineers after completing this one course. I think that not identifying as an engineer may cause significant hesitation for educators without an academic background in engineering who are asked to teach engineering in their classrooms. I certainly would have a hard time if someone expected me to inspire my students to be anthropologists or political scientists – as I don’t see myself in these roles, nor do I have direct experience in them. What I hope to accomplish in my course, then, is to provide teachers with the tools to work towards becoming experts at recognizing engineering applications and implementing meaningful engineering projects in their classrooms. Perhaps the key is to allow them the flexibility, for their first assignment, to choose a topic they are familiar with, interested in, and confident in, and to adapt it into an engineering project for use in their classroom.

I used PowerPoint to create a concept map for my course based on Fink’s Taxonomy of Significant Learning (p. 34-37).

Concept Map for EfE course

The taxonomy was quite useful in outlining important concepts and learning goals for the course. One aspect of learning that I hope to achieve in my students (and eventually, their students), but did not find a place for, was developing characteristics such as grit, resilience, not being afraid of failure, focus, and a good work ethic. These would most closely align with learning how to learn.

Based upon this concept map, I have developed the following three learning objectives for my lesson plan for this course:

1. Identify and understand the components of the engineering design cycle

2. Use the engineering design cycle to create active learning opportunities in their classrooms that are age-appropriate, engaging, linked to content knowledge, and that address state and national standards

3. Understand the engineer’s role in society, and inspire a desire in students to use engineering to solve problems that matter to people.

These are lofty but worthwhile objectives that I would love to accomplish in my course.

Fink, L. D. (2013).  Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass.


Situational Factors for an Engineering for Educators Course

Weekly Writing 5 – Lori Sowa

My curriculum unit will be developed for a course titled “Engineering for Educators”, which will be delivered fully online. The audience will be in-service K-12 teachers pursuing a Master’s Degree in STEM Education, but may also include pre-service teachers. I would expect that many of the teachers in the course currently teach some level of math, science, or technology — but their backgrounds will likely be quite varied. The goal will be to provide an overview of engineering that will allow teachers to use authentic engineering problems in their classrooms, and to adapt the methods to their particular age group and setting.

The course will be delivered via Blackboard through the Department of Education at UAS. The M.Ed. degree will be offered fully online, thus the course itself must be delivered fully online. Herein lies my greatest challenge — engineering is by nature a hands-on activity, and so this mode of delivery is an experiment in itself. Class size will likely be 10-20 students (teachers) from across Alaska. The teachers’ classroom settings will likely be widely varied – from rural villages to “urban” centers, kindergarten through high school. Teachers will have varied backgrounds in science and mathematics, which will also pose a challenge for developing course materials. Most of the participants will be working full time and taking courses at night, which can make extended synchronous sessions difficult. Based upon my experience taking courses while working full time and raising a family, my approach will be to use an asynchronous format with regularly scheduled synchronous sessions as needed. My background is in engineering, and while I’ve studied pedagogy and am pursuing a PhD in Engineering Education, I do not have teaching experience at the K-12 level. Therefore, I expect the learning experience in the classroom will be a two-way street (as always). It is also possible that I will co-teach the course with an Education faculty member.

The subject matter for the course will start with a general discussion of engineering — perceptions and misconceptions by the general population and students. Motivation for including engineering in the curriculum will be discussed, as well as a review of current literature on the topic. Then, I’d like the teachers to experience the engineering design cycle themselves, and be exposed to the variety of resources that exist for engineering in K-12. The course will include content background and practice for areas critical to solving real world problems, including methods of estimation, assumptions, units and critical thinking skills. Teachers will then be supported in the development of a number of engineering projects relevant to their own classrooms.

To be honest, I’m not sure what society’s expectation would be for an Engineering for Educators course. The engineering community would likely expect the course to contain rigorous applications of math and science in a real-world context, with design applications included. The Department of Education expects the course to provide the necessary background for teachers to use engineering in the classroom to meet national standards (although Alaska has not yet adopted the Next Generation Science Standards).

I believe the most challenging aspect of the course will be the sheer variety of backgrounds, the inability to perform hands-on activities in a face-to-face format, and the lack of experience and comfort level of the teachers with engineering. Engineering is one of those fields that, unless you have a relative or close acquaintance in the field, you may not know much about. Unless you were an engineering major at one point, you’ve also not likely taken an engineering course before. However, seeing the results will (hopefully) be quite rewarding. I co-taught a STEM course for K-8 teachers last spring, and was really impressed with the engineering projects the teachers facilitated in their own classrooms. The subject matter can provide excitement — a new way to engage students not just in math and science, but in real world problems that face all of society.

Effective Engineering Pedagogy for Upper Elementary Teachers

Article Review 4 – Lori Sowa

We expect quite a bit from our K-12 teachers – and with the increased focus on STEM education, we are now expecting teachers to incorporate engineering into their classrooms.   While all teachers have taken science and math classes during their high school and college years, it is safe to say the vast majority of teachers have never formally studied engineering (unless they are recovering engineers who decided to make teaching their career).   Quality professional development opportunities must be provided if we expect meaningful engineering experiences to find their way into K-12.

In a recent paper, a team of engineering education researchers (Guzey et al., 2014) provides a qualitative analysis of the results of a year-long professional development workshop on integrating engineering content for third- through sixth-grade science teachers. The results were measured by analyzing posters documenting each teacher’s implementation of engineering activities in their own classroom, using a framework that identifies the required components of a successful engineering activity. The professional development served 198 teachers and included 30 hours of face-to-face workshops (spread out in one- or two-day increments over the course of an academic year) along with 16 hours spent within professional learning communities developed within their school districts to reinforce what they had learned and share ideas. On the final day of the workshop, teachers presented their own implementations of engineering design projects in their classrooms through posters.

The framework the researchers used defined the necessary components of an engineering curriculum unit.   According to the framework, the engineering unit should:

  1. have a meaningful purpose and engaging context;
  2. have learners participate in an engineering design challenge for a compelling purpose that involves problem solving skills and ties to context;
  3. allow learners to learn more from failure and then have the opportunity to redesign;
  4. include appropriate science and/or mathematical content;
  5. teach content with student-centered pedagogies; and
  6. promote communication skills and teamwork.   (p. 141)

The teachers’ posters were electronically captured and then coded for each of the elements above.   The authors note that all of the posters included evidence of items 5 and 6 above (student-centered pedagogy and teamwork), thus these elements weren’t included in the coding.   Although not highlighted in the paper, I feel this is a substantial accomplishment in itself.   Forty-seven percent of the posters were found to incorporate all of the criteria above, with the remaining projects categorized based upon the missing components.
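As a sketch of how such coding tallies work, each poster can be represented by the set of framework elements it shows evidence of, and the completion rate computed from those sets (the posters below are made up for illustration, not the study’s data):

```python
# Hypothetical coding results: each poster is the set of framework
# elements (1-4) it showed evidence of. Elements 5 and 6 were present
# in every poster in the study, so they are omitted from coding here.
posters = [
    {1, 2, 3, 4},   # all coded elements present
    {1, 2, 4},      # missing element 3 (redesign after failure)
    {1, 2, 3, 4},
    {2, 3, 4},      # missing element 1 (meaningful purpose/context)
]

complete = sum(p == {1, 2, 3, 4} for p in posters)
rate = 100 * complete / len(posters)
print(f"{complete}/{len(posters)} posters met all coded criteria ({rate:.0f}%)")
# → 2/4 posters met all coded criteria (50%)
```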

The use of this specific framework is somewhat limiting (I feel that successful classroom experiences can be produced that do not meet each goal above), but I think the framework does a reasonable job of outlining the elements of a robust engineering experience. It also gives teachers a framework against which to measure their own lesson plans, and provides a pathway for improvement. I will use the framework myself to re-evaluate the project assignments I give in my own classroom.

Once again, we see the focus on “learning from failure”. Failure is a somewhat peculiar way to characterize this phenomenon, as many of the projects likely met the goals the first time (so not really a failure), but students were then provided an opportunity to improve their designs. But I have seen true “failure” in my classroom with design projects. I use a scaffolded approach in my freshman engineering course, starting with small design projects and leading up to a large, final design project. In the second of three projects, I actually assign points for how well their design functions. The project is to create a device that will extinguish a candle in exactly 20 seconds, and I take off points for each second outside of this range. I always feel a little uncomfortable with this grading scheme, but there are other aspects to their project grade (final report, etc.). However, the level of effort usually goes up when their grade depends on the outcome. I felt particularly bad for a recent group of students who, despite their good effort, failed to produce a design that met the criteria. I allowed them an extra week (with a minor points penalty) to come back with an improved design, and was pleasantly surprised when they came back with an excellent design that met the criteria exactly. I learned quite a bit from this exercise myself.

The focus on integrating engineering through science education, rather than as yet another, separate topic, is a valuable approach in a number of ways.   In recent conversations I’ve had about STEM education in K-12, a local school principal indicated that she thought that STEM activities were great – but were most likely to be implemented as after-school activities rather than full-class activities.   I have to disagree with this method of incorporation.   After-school activities can provide quality education experiences to those who choose to (or have the means to) participate, but full class activities reach all students.     Engineering can be a vehicle to apply math and science skills, but can also be used in a non-quantitative way, incorporating social and political aspects of technology and problem-solving.   Helping teachers incorporate it in this way is a challenge but is certainly not impossible.

Overall, I believe this article highlights a successful model for professional development aimed at providing guidance for upper elementary teachers to include engineering in the curriculum, and it uses a well-defined qualitative approach to measure the success of implementation. However, the authors do not address what I believe is one of the most important aspects of this project. Were the teachers given this feedback on their own projects? The teachers are “designing” engineering opportunities for their students – where is their opportunity to learn from “failure”, improve their lesson design, and try it again? I’m sure much of this will happen on its own, as teachers are constantly improving their teaching through experience, making adjustments, and trying again next year. But the teachers themselves could benefit from specific feedback using this framework. By publishing the research, it is shared with the research community, but I would like to see this aspect built in as a major component of the professional development. It would be quite interesting to follow up on these particular teachers through a longitudinal study, looking at improvements in the implementation of engineering design projects over time. This study also calls for follow-up with both students and teachers in terms of measuring student learning gains, student attitude and motivational factors, teachers’ satisfaction in implementing the projects, and their perceptions of the successes or “failures” in the classroom.

Guzey, S., Tank, K., Wang, H.-H., Roehrig, G., & Moore, T. (2014). A high-quality professional development for teachers of grades 3-6 for implementing engineering into classrooms. School Science & Mathematics, 114(3), 139-149.

Increased Conceptual Understanding Through Peer Instruction in Engineering Dynamics

Article Review 3 – Lori Sowa

Contemplating 21st Century skills and gaming theory through the videos and writings of John Seely Brown, I jotted down a number of “big ideas” to research: embracing change, learning to join, power of play, not having a defined endpoint, tinkering, demand-based learning.   I searched through a number of articles on the benefits of video games, passion, obsession, and even addiction – but kept returning to the idea of peer instruction, which I think is central in Brown’s theories. My dissertation research is forming around a number of faculty who are using a flipped classroom approach to STEM courses at the freshman and sophomore undergraduate level.   The main question becomes – how can we best structure the in-class activities to promote deep learning?   The goal is to have students learning from each other, but how can we structure the class to promote this?

In Teaching Engineering Dynamics by Use of Peer Instruction Supported by an Audience Response System, Schmidt (2011) describes a study in which he implemented Mazur’s peer instruction (PI) method in two engineering dynamics courses at a university in Denmark. A third course, taught using the same methods but without the PI discussion questions, served as a control. The author (who was also the instructor for all three courses) used a number of exams (a pre-test of engineering knowledge, final exams, and the cohort’s mathematics exam scores) and a class survey at the end of the course to look at learning gains and students’ dispositions related to the teaching style. The questions on the final examination were broken into two categories: traditional problem solving and conceptual understanding. The author found that scores on the traditional problem solving portion did not vary significantly among the groups, but that the two classes that used PI scored better on the conceptual questions.

Overall, the study was well conducted: the author had a reasonable control group and made an effort to tease out the level of preparedness of the students in the study. However, one variable that was not controlled for was the language in which the course was taught. The program of study was highly international, so two of the sections (the control and one of the experimental groups) were taught in English, while the third course was taught in Danish. The latter group scored better all around, which could potentially be due to the course being taught in the students’ native tongue. In addition, I would have liked to see some more detailed questions on the student survey at the end of the course. For example, one of the questions read “Give an assessment of your own preparation for classes”. Since the author encouraged students to read ahead in the text, and their doing so would likely have influenced their performance in the PI activities, a more specific question such as “I prepared for class by reading the assigned sections”, with a Likert-scale rating system to indicate always, sometimes, rarely, etc., may have provided better data. One of the common issues with flipped classrooms is students’ lack of preparation before class, so it would be nice to quantify this (to the extent you can actually rely on such data).

In my mind the most impressive result of the study was the increase in the number of correct responses after PI discussions and before instructor intervention.   Figure 1 from Schmidt’s article (p. 418) shows the percentage of correct answers increased in almost all cases from the students’ initial response (the x-axis below, before PI) to their response after discussion with peers (y-axis, after PI), many times quite substantially.

[Figure 1 from Schmidt (2011), p. 418: percentage of correct answers before vs. after peer instruction]

(apologies – the figure is much clearer in the original text)
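With made-up numbers (not Schmidt’s data), the kind of before/after summary behind a figure like this can be sketched as:

```python
# Hypothetical (before-PI, after-PI) percent-correct pairs for a few
# clicker questions; pairs where "after" exceeds "before" would plot
# above the diagonal in a figure like Schmidt's.
pairs = [(35, 60), (50, 75), (62, 80), (48, 45), (70, 88)]

improved = sum(after > before for before, after in pairs)
avg_gain = sum(after - before for before, after in pairs) / len(pairs)

print(f"{improved}/{len(pairs)} questions improved after PI; "
      f"average gain: {avg_gain:.1f} percentage points")
# → 4/5 questions improved after PI; average gain: 16.6 percentage points
```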

Another important aspect of student learning using PI is students’ awareness that they do, indeed, make mistakes and hold conceptual misunderstandings. Regarding the fact that the students who engaged in PI rated their own understanding of the material lower relative to the control group (who had a higher level of confidence in their grasp of the subject matter), the author provides this explanation:

It is believed that the discrepancy between the students’ assessment of their own outcome and the examination score is related to the quality of the clicker method to expose misunderstandings among students.   By taking part in PI-teaching, the student faces the fact that he/she makes quite a lot of mistakes when interpreting new methods and ideas.   Thus, the student gets the impression that the knowledge gained is not as profound as the student receiving traditional lectures feels regarding his or her outcome: at a traditional lecture it is tempting for the student to be fully satisfied with all the lecturer’s nice explanations! (p. 421)

There is a growing body of research showing that making mistakes and experiencing failure are truly important in the learning process. Many times, students are afraid of failure, and this can inhibit their ability to learn. Schmidt (2011) also states that “…the goal was a safe study environment where the student had no reason to fear giving a wrong answer… [in] this way, it is believed that the most ‘honest’ answers and the best measure of the students’ conceptual understanding as possible were obtained” (p. 416). In searching for an article to reference the importance of not being afraid of failure, I came across a powerful TED talk video with a focus on peer instruction and learning from mistakes.

Once again, the idea of students (novices?) learning from other students proves beneficial to the overall learning process.   While this instructional method was used in a face-to-face scenario, the method could be adapted to an online medium.

Schmidt, B. (2011). Teaching engineering dynamics by use of peer instruction supported by an audience response system. European Journal of Engineering Education, 36(5), 413-423.

Peer Instruction: In Person and in Virtual Worlds

Weekly Writing 4 – Lori Sowa

I enjoyed watching a snippet of Eric Mazur’s lecture using Peer Instruction in his physics course on YouTube. The topic he is teaching is a complicated one involving the “right-hand rule”, where students must think about various vector quantities – electrical force, magnetic field, electric field – to try to answer the question posed. This is a particularly difficult concept to learn from a textbook, as you are dealing with quantities in three dimensions. Even though the students in the class were not able to come to consensus on a right answer (if there was one), what I liked about this approach is that each student had substantial time to explore the issue independently and in small groups, so when the instructor finally discussed the question with the class I believe they were more likely to understand the answer and see where their own thinking was correct and where it was flawed. I would venture to guess that most physics students leave traditional lectures on this particular material without a good understanding of the topic.

I learned about a very similar technique, called “think-pair-share”, in a College Teaching course I took years ago, and like to use it often.   I like the structure of this approach for a number of reasons:

1.   It allows (forces?) students to think through the problem on their own first.

2. Discussing the answer with a partner provides an opportunity to further explore the topic in a low-risk environment, and to see others’ perspectives and thought processes on the issue.

3. Having adequate time to reflect on a topic, and then practicing explaining, defending, and possibly refining or changing their theories, can help a student find the confidence to share aloud with the entire class. There is substantial scaffolding built into the approach, from both a comfort and a theory-development perspective.

This technique can take a passive lecture and turn it into an active learning opportunity where students interact with the material, forming associations and constructing knowledge.

Back in the threaded discussion regarding online laboratories, I described the benefits of a particular online simulation lab I used where students could actually change the value of the acceleration due to gravity, effectively moving themselves from the Earth to the moon to Jupiter. I found this to be an effective method to help students differentiate between mass and weight. In the comments, Bob posed a great question about peer instruction – how would this be accomplished in online labs? My experience in this case was in a blended setting, where students used the online simulations in class with other students and an instructor present – so collaboration with peers was possible. But how would we accomplish this completely online? I’ve used Blackboard Collaborate to facilitate small group work with students – but would it be possible for both students to access the same simulation in real time? Ideally, students would be able to talk via microphone and/or chat, while each being able to access and manipulate the simulation together from separate locations. I know that you can “share your desktop”, but I’m not sure if this has the needed functionality. Here is a situation where a technological tool is needed to facilitate the desired pedagogy. If anyone has insight on how this can be accomplished, I would love to hear about it.

One way that students can collaborate in the same online space in real time is in Minecraft – in either the standard or EDU version, which includes extra functionality that is useful for an educational setting. Last spring, I co-taught a course on STEM education where we used Minecraft as a modeling platform. Looking back at the experience now, I realize that we actually did facilitate synchronous, online “laboratory” work during class. A colleague set up a server that all of our teachers could access, so even though we were spread throughout the state, we could all be present in the same “world” in Minecraft, performing physics experiments and admiring student creations. This platform surely pushes the boundaries of what we traditional, skeptical instructors would consider an “academic laboratory”. Like any pedagogical tool implemented for the first time, there were successes and frustrating failures. But many of our teachers were able to foster collaborative student work in Minecraft that was truly educational and engaging. Perhaps with the more formal method of peer instruction in mind, we could create learning experiences that would foster peer instruction in a virtual environment similar to those created by Eric Mazur in a face-to-face environment.

I lament the fact that I do not have a screenshot to share of all of our teacher-students and instructors flying around in Minecraft, timing pig races and measuring lengths to explore the concepts of distance, rate, and time, among other fun and (mostly) successful tasks. But I will share a few screenshots that I do have: one of a scale-model hydroelectric facility that a couple of 4th graders built, one of the pig race arena I mentioned earlier, and one of the four instructors for the course in Minecraft.

[Screenshot: Minecraft scale-model hydroelectric facility]





Those interested in learning more about MinecraftEDU can start here:




Getting the Most out of Asynchronous Discussion Groups: A Focus on Critical Thinking

Article Review 2 – Lori Sowa

When adequately facilitated, asynchronous discourse can be an effective learning tool that is unique to the blended or online classroom. In my brief experience both taking and co-teaching online courses, I've seen a few variations on how the discourse can be structured. Writing prompts have been a helpful means of focusing the discussion, and quantifying the number of expected responses gives participants direction as to how much collaboration is expected. Intrigued by a comment in the recently reviewed meta-analysis regarding the inherent advantage of this type of discourse, I decided to dig further into the topic of how best to structure and facilitate online discussion groups.

In Tagging Thinking Types in Asynchronous Discussion Groups: Effects on Critical Thinking, Schellens et al. (2009) studied the effect of requiring students to use specific scripts, describing their underlying thought process, when posting to a discussion group. The study included a very small sample size: 35 students from a junior-level undergraduate class on instructional strategies. The class was randomly divided into six groups (four experimental and two control) and required to participate in a discussion group debating different perspectives, possibilities, and limitations of e-learning. Participation in the discussion group was a formal part of the students' grade, and they were required to post at least five messages over a two-week period. The assignment was identical for each group, except that the experimental groups were required to tag each post using "thinking hats" adapted from those developed by De Bono (1991). As an example, the description of one of the six hats in Schellens's (2009) article reads:

The blue hat is the color of the sky high above us. This hat stands for a reflective perspective to see whether the right topic is addressed. What is relevant? Defining what to think about and deciding what is to be reached. (p. 81)

At the conclusion of the two-week period, the messages were coded based upon Newman et al.'s (1995) scheme, which identifies ten critical thinking categories. The authors found evidence of critical thinking in both conditions, but significantly more positive indicators (and fewer negative indicators) of critical thinking in the experimental groups using the thinking hats.

Overall, the experimental design was rigorous and grounded in a sound theoretical framework. Coding of the individual posts was performed by two individuals, with interrater reliability tested and found to be reasonable. The sample size was quite small, however, and repeated experiments involving more students over longer discussion time frames would provide more representative results. It would also be interesting to survey the participants about their perceptions of the value of the discussion group, and to perform longitudinal studies to see whether this method improves the critical thinking of these students in future discussions.
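As an aside for readers unfamiliar with how interrater reliability is typically checked in content-analysis studies like this one: agreement between two coders is often summarized with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch is below; the coding labels are hypothetical stand-ins, not data from the Schellens et al. study.

```python
# Sketch of Cohen's kappa for two raters coding the same set of posts.
# Labels here are invented placeholders, not the Newman et al. categories.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Proportion of posts where the two raters assigned the same code
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two raters' (hypothetical) codes for ten discussion posts
a = ["R+", "R+", "I-", "J+", "R+", "I-", "J+", "R+", "I-", "J+"]
b = ["R+", "R+", "I-", "J+", "I-", "I-", "J+", "R+", "R+", "J+"]
print(round(cohens_kappa(a, b), 2))  # prints 0.7
```

Values above roughly 0.6 to 0.7 are conventionally read as substantial agreement, which is presumably the ballpark the authors mean by "reasonable" reliability.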

It is difficult to infer from the text of the article the exact context and wording of the assignment, and what specific guidance the class was given regarding expectations for the content of the posts. But just by providing the description of the thinking hats, along with instructions to use the full range of hats, the experimental groups received additional instruction and guidance compared to the control groups, steering them toward aspects of critical thinking that would then be counted by the coding scheme. Requiring that a tag be used each time likely reduced or eliminated irrelevant posts in the experimental groups. This is a good thing overall for meeting the goal of the assignment, but it also likely skewed the outcome in favor of the experimental condition. Perhaps a better measure of the effect of this specific tagging scheme would have been to discuss the idea of critical thinking in general with the entire class, but then to require only the experimental groups to use the thinking hat tags in their posts.

Any pedagogical tool used in any classroom must take the audience and the intended learning outcomes into account during the course design phase. The level of direction the instructor gives students is one of these considerations, and finding the balance between being overly specific and being vague in assignments can be tricky. The results from this study are promising in terms of providing a system that effectively scaffolds students to be intentionally critical in their thinking when posting to online discussion groups. I can see using this or a similar method, particularly for students new to discussion groups, but even for more advanced students.

De Bono, E. (1991). Six thinking hats for schools: Resource book for adult educators. Logan, IA: Perfection Learning.

Newman, D.R., Webb, B., and C. Cochrane. (1995). A content analysis method to measure critical thinking in face-to-face and computer-supported group learning. Interpersonal Computing and Technology, 3, 56-77.

Schellens, T., Van Keer, H., De Wever, B., and M. Valcke. (2009). Tagging thinking types in asynchronous discussion groups: Effects on critical thinking. Interactive Learning Environments, 17(1), 77-94.

Lack of Data on the Effectiveness of Online Education in K-12

After reviewing the report “Evaluation of Evidence-Based Practices in Online Learning” (U.S. Department of Education, 2010), I was struck by two major themes:

1. Good pedagogy is good pedagogy, regardless of the medium.

2. There is not a magic formula that can be followed to create an effective online classroom.

The good news: online instruction has been shown to be an effective educational medium. The bad news for the U.S. Department of Education is that this report provides little in the way of guidance on specific practices for online education in the K-12 arena, mostly due to the lack of available data for this population.

The meta-analysis continued on despite the lack of K-12 data, including studies that were predominantly based upon undergraduate and graduate-level college students, as well as adult learners receiving professional training. The authors divided the studies into three "learner type" groups: K-12, undergraduate, and other, and concluded that the age of the learner did not have a significant impact on the effects found. This is one area where I found the study lacking: I do not think that findings specific to adult learners receiving professional training translate to K-12. I would have liked to see the undergraduate category further stratified to look at studies of college freshmen, who would provide a better comparison, at least for high school students. Even so, the population of students attending college and graduate school is not necessarily representative of the population of students in K-12.

As I read through the report, I found myself more interested in the summaries of the individual studies than in the overall compilation of data. Some of the outcomes were quite expected: equivalence of curriculum and instruction played a major role in the level of effect seen (it's hard to compare apples to oranges); online elements added to the curriculum mostly improved the learning experience; and adding prompts for students to reflect on their learning improved learning outcomes. But when weighing these effects, it is important to remember that studies were only included if they used objective measurements of student learning, which means content knowledge measured through some type of testing rather than student perceptions of learning or deep understanding of concepts. Narrowing the scope of the studies included is a necessary step in conducting a meta-analysis such as this, but it is important to keep in mind that learning was measured through "testable" outcomes and not other important aspects of learning. As an example, while the addition of an online discussion forum was not shown to improve measurable learning outcomes, that experience may have provided students with valuable insight and better prepared them for future discussion forums, whether online or face-to-face.

I was intrigued by the comment on page 2 regarding the motivation to improve students' learning experiences through online education, which reads, "Another conjecture is that asynchronous discourse is inherently self-reflective and therefore more conducive to deep learning than is synchronous discourse." I certainly appreciate having the time to reflect on what I would like to say, and to express those thoughts in writing. This is an area I'll explore in the literature.

Overall, I believe the authors did a reasonable job with the very limited data they had available. A couple of general best practices gleaned from the report are to include prompts for student reflection and to allow students some control over the manipulation of the online material. More broadly, there are no definite strategies an instructional designer should "always do" or even "always avoid" based upon this report. The authors themselves do not assert that the study provides substantial guidance for online education in K-12, but rather call for additional studies at the K-12 level. This is vitally important, as the number of K-12 students taking online courses will continue to increase.

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.





Providing Personalized Emotional and Motivational Support in Online Remedial Math Courses

Article Review 1 – Lori Sowa

While researching articles related to active learning strategies for online courses, I came across an article on personalized learning environments for students taking online remedial math courses. The article captured my interest because I am teaching a remedial math course for the first time this semester. I know the content of the course well, but helping students surmount significant barriers to learning the material, work through math anxiety, and stay engaged when many don't see a direct connection to their educational goals is a challenge.

In The Role of Affective and Motivational Factors in Designing Personalized Learning Environments, Kim (2012) outlines guidelines for developing Virtual Change Agents (VCAs) to accommodate students' emotional and motivational needs specific to online remedial math courses. VCAs are described as human-like animations designed to facilitate positive changes in learners' attitudes and personalized to meet the needs of each individual student. A VCA can provide personalized motivation by allowing the student to choose the subject matter of example problems (e.g., motorcycle repair, dieting calculations), thus improving student perceptions of task value and controllability, which can ultimately lead to students' positive reappraisal of their situation. The author also suggests that student interactions with VCAs can be designed to promote emotional regulation skills, and provides a framework for how to accomplish this based upon prior research on affective and motivational factors.

This is the first I've heard of VCAs, and to be honest my first reaction is doubt about whether learners will relate to an animated human providing pre-programmed advice. The effectiveness of this specific strategy remains to be seen: the author acknowledges that the theories and guidance described in the paper have yet to be validated, and is actively seeking researchers to perform these studies. But the author does point to a number of studies related to "Computers Are Social Actors" (CASA) theory that support the idea that users relate to computers as they do to people. The study of computer-person interactions is fascinating to me, and it is certainly an area that is quickly evolving and expanding. It is unclear, however, whether any of the studies referenced support a sustained, purely computerized interaction that leads to a substantial change in emotional and motivational state for populations relevant to this study. I can certainly envision the possibility, though, and in fact it is a bit mind-boggling to think about the potential applications.

The underlying strategy in creating the personalized learning environment is to provide support for overcoming both emotional and motivational barriers to learning math content. The author quotes a relevant analogy from Buck (1985), who states that "just as energy is a potential that manifests itself in matter, motivation is a potential that manifests itself in emotion. Thus motivation and emotion are seen to be two sides of the same coin, two aspects of the same process." The strategies outlined to work through these barriers would apply to any classroom, whether online or face-to-face. However, the online classroom may allow for greater personalization when instructor time is limited. For example, I try to choose relevant example problems in my lectures based upon what I know about my students' educational goals and interests, but I will never be able to do this for each student. Allowing students control over the context of the problems aligns with the constructivist theory of learning, while fostering motivation is an essential tenet of cognitive psychology.

The research addresses an important need: as the author reports, one in three students entering the university is placed into a remedial math course. These students bring a wealth of issues, and addressing those issues to promote student success requires personalized scaffolding. To intervene and provide that personalized support in an online environment, instructors and instructional designers have to find ways of predicting when and why students will disengage, and provide built-in solutions to try to prevent this. It will be interesting to see how effective VCAs can be in providing this support system.

Buck, R. (1985). An integrated view of motivation and emotion. Psychological Review, 92(3), 389-413.

Kim, C. (2012). The role of affective and motivational factors in designing personalized learning environments. Educational Technology Research and Development, 60(4), 563-584.