Monthly Archives: September 2014

Impact of Online Learning on Performance Gaps-AR2

I reviewed the article by Xu and Jaggars (2014) titled "Performance Gaps Between Online and Face-to-Face Courses: Differences Across Types of Students and Academic Subject Areas." I was interested in this article after reading the USDOE (2010) meta-analysis of online learning, which presented the idea that differences in content could significantly affect learner outcomes in an online environment. This article looks specifically at the effect of differences in content, and also addresses the differences between learner types and their success in this environment. As an educator working in a district with a pronounced achievement gap between the majority and minority populations, I was particularly interested in finding out whether online education can help reduce this gap or whether it might in fact be increasing it.

Xu and Jaggars (2014) conducted a large-scale analysis (roughly 500,000 courses taken by over 40,000 students) of content and learner-type comparisons from community and technical colleges in Washington State. Students were tracked from the fall of 2004 to the spring of 2009 for course persistence and grade performance. The authors chose course persistence and grade as fundamental measures of success for community college students, reasoning that students who "withdraw from a course mid-semester run the very real risk of never returning to successfully complete the course, thereby prohibiting progression to the next course in the sequence" (Xu & Jaggars, 2014). Results from this study showed an "online performance gap" across the board, indicating that all types of students performed more poorly in online courses than they did in face-to-face courses and implying that online instruction is more difficult for the average student. This gap was especially pronounced for "males, younger students, Black students, and students with a lower prior GPA" (Xu & Jaggars, 2014). Even more worrisome was the finding that in courses where student subgroups differed in terms of their face-to-face course outcomes, those same differences tended to be exacerbated in online courses. The study also found noticeable gaps between subject areas taught. Online courses in the following subject areas demonstrated significant online performance gaps: the social sciences (e.g., anthropology, philosophy, and psychology) and the applied professions (business, law, and nursing). The authors proposed that these subject areas may require a high degree of hands-on demonstration and practice, or intensive student-instructor interactions and student-student discussions, which may be more difficult to implement effectively in the online context (Xu & Jaggars, 2014). I wonder if some of these gaps would have narrowed had a blended approach been utilized.

I was very impressed by the way the authors conducted their analysis. I did not see many flaws in the design of this study, other than that it could be expanded to other states, other types of educational institutions, and perhaps younger age groups. With the multitude of possible confounding factors that could influence a dataset such as this one, the authors made a concerted effort to control for many of them. The authors controlled for differences among courses within a particular subject and for variation in instructional quality and support. They also built in robustness checks to address effects that might result from whether the student had previously taken an online course, whether they were employed while taking the course, and how many hours they may have been working at the time. This was especially important given that many of the students in this study were considered "non-traditional" students (e.g., over 25 and balancing work, family, and their education).

This study has some serious implications for the way we use online education in the future. In my own case, as an educator working in a district that is actively trying to address a pronounced achievement gap between the largely Caucasian majority and its minority Latino student population, I am concerned about what the results of this study indicate. If the pattern observed applies to K-12 students, it implies, as the authors suggest, "that the continued expansion of online learning could strengthen, rather than ameliorate, educational inequity." Working in a district with a one-to-one initiative, can we point to this new use of technology as a means of widening the gap or of closing it? I would like to think that there are steps we can take to improve these programs, rather than assuming they will broaden the gap. The authors suggest at least three approaches to improving online performance: screening, early warning, and scaffolding. For screening, schools could limit or eliminate the supply of online sections in course subjects where students do poorly. Scaffolding could also be increased by incorporating the teaching of self-directed learning skills into courses. This idea has the greatest potential within secondary schools, and I would propose that it become part of the curriculum at this level, so that students would find more success at the college level. As the authors point out, "these skills may not only help close the online performance gap, but may also improve students' overall performance and long-term persistence in college" (Xu & Jaggars, 2014).

Works Cited

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education, 85(5), 633–659.


Implications for K-12 Online and Blended Classroom Environments-ww3

The USDOE (2010) meta-analysis was well conducted as a means of comparing online and blended learning conditions with face-to-face instruction for undergraduate and graduate students. I felt that the authors were rigorous in selecting studies that included control groups, though this limited the analysis to a relatively low number of studies (n=45) for a meta-analysis. As the authors suggest, the meta-analysis is less useful for K-12 students because of the lack of usable studies for this age group. In light of the lack of evidence, we might treat the findings of this study more as possible suggestions than as conclusive findings. That being said, there are certainly some valuable suggestions to be found in the report that I will implement when designing my own courses. The article itself is a good reminder that we need more research into this topic, and that such research should be well designed and rigorous if it is to be used for making policy decisions.

I was surprised by many of the findings in this paper, and wonder whether these findings would differ for K-12 populations. One of these findings was that "students in online conditions performed modestly better, on average, than those learning the same material through traditional face-to-face instruction" (USDOE, 2010). I expected that there might be no difference found between traditional and online classrooms, but not better performance in an online setting. This finding appears contrary to other research suggesting that it is the instructional design, not the mode of instruction, that increases learning (Bonk & Reynolds, 1997). The paper was unclear on why this difference may have been found. The authors examined specific activities that may enhance online learning environments, but the content and presentation varied between experiments, which probably influenced the results. I was also surprised that "the addition of images, graphics, audio, video or some combination" did not affect learning outcomes in a significant way. Presenting material in different modes is often suggested as a way to improve the processing of information (Ally, 2008).

While the effectiveness of the practices studied in this report was inconclusive, I think the report made several important suggestions for best practices that I will try to incorporate in my own classroom. Overall, it appears that the activities that produced positive results increased student control, interactivity, or metacognition. These include increased learner control of media, the incorporation of metacognitive activities, individualized instruction, and the inclusion of elaborated questioning. These findings seem to support the constructivist theory that the learner should be active rather than passive in the learning process. According to the report, passive media such as videos and static graphics had no significant impact on learning, while learner-controlled media did have a positive effect. It is perhaps the interactivity, rather than the media, that produced positive results. Though the report found only two studies of the effects of individualizing instruction, both found a positive effect (USDOE, 2010). Adaptive instruction caters to each student's needs and encourages greater activity. Not unexpectedly, the results of three studies exploring the effects of including different types of online simulations were also modestly positive, once again supporting constructivist theory and the role of the active learner. One of the best-supported findings in this report was that metacognitive "tools or features prompting students to reflect on their learning" were effective in improving outcomes, suggesting that promoting "self-reflection, self-regulation and self-monitoring leads to more positive online learning outcomes" (USDOE, 2010). Equally interesting is the finding that graphic organizers and concept maps, as well as embedded quizzes, did not have any positive effect on learning. These techniques are often suggested as ways to improve metacognition in online courses.

Many of the online courses I have taken rely heavily on these activities, and I have also used them in my own classes. The findings of this report have caused me to rethink my use of these tools.

The conclusion I reached from reading this article was that more studies need to be conducted, particularly in the K-12 environment. There is a large push in secondary education to produce online and blended learning environments, and while these may offer students an academic advantage, there is virtually no research to suggest so. With the amount of attention being placed on online and blended learning, I am surprised at how little statistically significant evidence there is to recommend it.

Works Cited

Ally, M. (2008). Foundations of educational theory for online learning. In The theory and practice of online learning (2nd ed., pp. 15–44). Athabasca, AB, Canada: Athabasca University.

Bonk, C. J., & Reynolds, T. H. (1997). Learner-centered web instruction for higher-order thinking, teamwork, and apprenticeship. In B. H. Khan (Ed.), Web-based instruction (pp. 167–178). Englewood Cliffs, NJ: Educational Technology Publications.

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development (USDOE). (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.


Weekly Reading #3, Bob Heath

In my article review this week I ended up at a place calling for a blended approach to learning. In my comment celebrating augmented reality I ended up at a place calling for blended learning. Accordingly, I gravitated to the "Blended Compared With Pure Online Learning" section in:

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development,  Evaluation of evidence-based practices in online learning: a meta-analysis and review of online learning studies, Washington, D.C., 2010.

Sadly, and I will save you many words: "no significant differences" were found. That is, the research conducted thus far is inconclusive when comparing blended learning to purely online learning. Therefore, I suggest we trust our guts and go with what we think is sexy and cool in a geeky sort of way.

My two reasons for going to blended presentation for some types of instruction: first, some knowledge and some interactions are best face-to-face; second, as in the EcoMobile videos, going to the pond as a group is incredibly important and not possible in an exclusively online format. But wait: these are claims, not reasons, and so I need to make arguments and provide evidence.

Returning to the Passing On video I shared in my second article review: one of our narrators describes appropriate ways to approach elders seeking instruction, either for her students or for herself. Alas, I cannot interrogate her. The video is a gem, but it is one-way transmission. If she were online, I could interrogate her, certainly, but in text I cannot capture inflection of tone, timing, and so on, nor can I capture facial expression. Likewise, in text I cannot capture the spiral logic implicit in instruction given by elders. Certainly, FaceTime gets us closer and closer, but it struggles with buffering and is limited by the quality of the camera attached to the device. This harkens back to my friend John Schumacher's criticism of e-mail and phone calls as representations of an altogether different event: the face-to-face co-making of inquiry. If online technology were 40,000 years old, I would have to consider that our physical evolution might have adapted to information and communication technology. But most of these changes have happened in the past 20 years; we are for the most part the same human beings who hunted mastodons with stone-tipped lances. Those human beings learned from each other face-to-face. So, let us imagine that exclusively online learning is a human-made environment rather like the inside of the Apollo space capsules: sufficient, but barely. Certainly amazing and cool, but when we look back on our LMSs in 20 years they will seem as harsh and spare as the interiors of the spaceships we flew to the moon. Moreover, as persons facilitating learning, we have obligations beyond content: obligations to civil society, to appropriate public discourse, to fostering leaders. While some of this work can be done online, not all of it can be.

I think this is so because these obligations are not just about content knowledge but about making eye contact, nodding, and gesturing; they are about situating the knowledge in a cultural moment, which is perhaps why TED Talks are taped in front of an audience. I have already touched on some reasons for my second claim implicitly: learning is not just about making individuals, it is about making cultures, creating psycho-social facilities, and ensuring the survival of the individual and the group, which is easy to forget in post-industrial society.

Therefore, I will hazard a claim that classes that toggle between content and metacognition would be better in a blended environment. I suggest leadership as one example; perhaps cycles of seasonal subsistence might be another topic that would be better served in a blended class. Second, I theorize that classes aimed at younger learners, K-12 and perhaps even grade 13, are better blended. I suggest blended in part because of the force multiplier that various online tools offer, EcoMobile/EcoMuve as one example and flipped instruction as another. The next question is how we formulate the research question to show results more conclusive than those we see in the required reading. I am a philosopher, not a social scientist, so forgive me the speculation: I suspect one would have to create three courses in three formats with comparable outcomes and teach them adjacently for an extended time. This is probably possible at a larger university that offers classroom, purely online, and blended presentation. However, while waiting on those results, what kind of decision model can we create for the rest of us in the meantime? I suspect that, as with cell phones, laptops, Google documents, and Twitter, we have to remember that classroom, online, and blended are tools in our toolkit, and in our professional roles part of our excellence is our facility and artistry in using the right tool at the right time. There is an element of trusting our guts.

Lack of Data on the Effectiveness of Online Education in K-12

After reviewing the report “Evaluation of Evidence-Based Practices in Online Learning” (U.S. Department of Education, 2010), I was struck by two major themes:

1. Good pedagogy is good pedagogy, regardless of the medium.

2. There is not a magic formula that can be followed to create an effective online classroom.

The good news is that online instruction has been shown to be an effective educational medium. The bad news for the U.S. Department of Education is that this report provides little in the way of guidance on specific practices for online education in the K-12 arena, mostly due to the lack of available data for this population.

The meta-analysis continued on despite the lack of K-12 data, including studies that were predominantly based on undergraduate and graduate-level college students, as well as adult learners receiving professional training. The authors divided the studies into three "learner type" groups: K-12, undergraduate, and other, and concluded that the age of the learner did not have a significant impact on the effects found. This is one area I found lacking in the study: I do not think that findings specific to adult learners receiving professional training are translatable to K-12. I would have liked to have seen the undergraduate student category further stratified to look at studies of college freshmen, who would make a better comparison, at least for high school students. Even so, the population of students attending college and graduate school is not necessarily representative of the population of students in K-12.

As I read through the report, I found myself more interested in reading the summaries of the individual studies than the overall compilation of data. Some of the outcomes of the report were quite expected: equivalence of curriculum and instruction played a major role in the level of effect seen (it's hard to compare apples to oranges); online elements added to the curriculum mostly improved the learning experience; and adding prompts for students to reflect on their learning improved learning outcomes. But when measuring effects, it is important to remember that studies were only included if there were objective measurements of student learning, which means content knowledge measured through some type of testing rather than student perceptions of learning or deep understanding of concepts. Narrowing the scope of the studies included is a necessary step in conducting a meta-analysis such as this, but it is important to keep in mind that learning was measured through "testable" outcomes and not other important aspects of learning. As an example, while the addition of an online discussion forum was not shown to improve measurable learning outcomes, that experience may have provided students with valuable insight and better prepared them for future discussion forums, both online and face-to-face.

I was intrigued by the comment on page 2 regarding the motivation to improve students' learning experiences through online education, which reads, "Another conjecture is that asynchronous discourse is inherently self-reflective and therefore more conducive to deep learning than is synchronous discourse." I certainly appreciate having the time to reflect on what I would like to say, and to express those thoughts in writing. This is an area I'll explore in the literature.

Overall, I believe the authors did a reasonable job with the very limited data they had available. A couple of general best practices gleaned from the report are to include prompts for student reflection and to allow students some control over the manipulation of the online material. More broadly, there are no definite strategies an instructional designer should "always do" or even "always avoid" based on this report. The authors themselves do not assert that this study provides substantial guidance for online education in K-12, but rather solicit additional studies at the K-12 level. This is vitally important, as the number of K-12 students taking online courses will continue to increase.

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.





Online Labs?

While we are on the subject of the inherent strengths and weaknesses of various mediums (face-to-face, blended, online), what do you all think of lab courses? I had an interesting conversation with a veteran faculty member that left me with the following question:

Can substantial lab-based learning experiences be designed for online courses?


Synchronous Meeting #2 – Monday Sept. 29

Greetings All,

Looks like our best bet for our next synchronous meeting is Monday, September 29th, from 5-6 PM AST.

I'll set up the Collaborate session; it should be a good discussion and I'm looking forward to it. Tulsi and Alda, if you can drop in for a portion, that would be great. If not, I'll post the recording and try to schedule our next meeting at a time convenient for you.

See you then!



Second Synchronous Session

Howdy folks,

If you haven’t already responded to the scheduling poll, please do so. Here’s the link.

Looks like we have several great topics, including online education in the K-12 arena, whether or not there are unique barriers within online education, VCAs, and CASA…lots of great stuff.

Looking forward to our discussion, and keep up the great work reading and giving feedback on each other's posts.


Article 2 Review, Bob Heath

Several co-learners in this course have raised the matter of cultural diversity in online learning "environments," particularly in Alaska. I am intrigued by this issue, and it inspired me to investigate further. This review will focus on a single article:

Liu, X., Liu, S., Lee, S., & Magjuka, R. J. (2010). Cultural differences in online learning: International student perceptions. Journal of Educational Technology & Society, 13(3), 177–188.

but this is only a starting point.

The authors examine seven themes that arose out of their research: assessment, instruction/interaction, asynchronous/synchronous communication, collaboration, case learning, academic conduct, and language. Students were from the U.S., China, India, and Russia. Rather than an extensive review of the article itself, its methods, and so on, I would like to focus on the findings, since these are practical and immediately useful. The authors offer a table that quickly summarizes them:

[Table from the article summarizing the seven themes]

One of the themes that came out of last semester's ED 631 (Culture, Curriculum and Community) class was "both-and": Native Alaskan youth need to be able to navigate both Western ways of education and Native cultural practices. In the recommendations for assessment practices we see "Multiple assessment strategies: Structured and flexible assignment schedule." This strikes me as a way to accomplish the Alaskan goal of "both-and," assuming we can actually strike a balance between, for example, process-oriented and exam-oriented assessment. Turning to instruction/interaction, we are encouraged to "Incorporate features that accommodate different cultural pedagogy." To my mind this is the rub of exclusively online instruction; however, it might also be a place for young learners to gain esteem in the eyes of elders. As an example, working with spruce roots: I can imagine a young learner setting up their iPhone and recording a video of their work with the roots, from harvesting and preparing through finally making a basket. They then use video editing software to polish their product and submit it asynchronously through the LMS for peers and elders to watch and comment on. The Dragonfly Project out of Haines has shown how this has opened doors between youth and elders, with the roles reversed as the youth taught elders computer use. It is a small stretch to imagine another youth creating a video comment refining a technique, and that inspiring an elder to seek out a youth (or at least the instructional technologist at the hosting institution) to help them add a video comment with an additional improvement.

Turning to the balanced use of asynchronous/synchronous communication, I am forced to wonder whether blended courses are perhaps most appropriate for cultural content. As I think about online instruction and Alaskan communities and schools, the role of elders is the most perplexing. The video Passing On, worth watching in its entirety but particularly at minute 7:31, poses a question that has stumped me, both when I first encountered it in the 1980s as a student at Sheldon Jackson College and again this past spring: why can't a learner simply ask for what they need? "Yo, I'm a dufus. I forgot the words to the jump rope song. Can ya drop me a clue?" I suspect that as a white guy from away I may never understand the answer I hear. Perhaps the best I can achieve is sensitivity to my own ignorance. However, there is something going on here that is subtle and culturally unique, and I am not at all certain that it can be captured in online learning. I suspect, then, that online instruction must necessarily be paired with face-to-face interaction, particularly in Native Alaska and particularly when focused on cultural preservation.

Addressing the fourth and fifth themes together, collaboration and case learning, I am reminded of a leadership training I attended this summer for managers in libraries and IT in higher education. The national statistics for CIOs show 97% are white males. We were fortunate to have a woman and two African American men in the room as instructors, all at that level in their organizations. One of the most telling comments made by one of them was, "If you want to recruit for diversity, then you need to create your interview committees so that when I walk into the room I see people like me." I think this is at the heart of these themes. If I as a learner cannot find myself in the course content, I can barely begin to connect with or construct from the material. In thinking about Alaska Natives, we often focus on the diversity, the differences, the factors of cultural uniqueness. I suspect that in areas of politics, law, economics, and health care, tribal groups across the nation share a great deal of similarity in the problems they seek to correct. This article offers a place for both cultural diversity and shared issues through case studies in online instruction. The student interviews conducted by the authors highlight how cultural diversity enriched their thinking about both local and global issues.

Owen, in a comment on this blog, pointed out that in teaching remedial math he felt he had to write a guide for the guide. For different reasons, I suspect that is also the case in a culturally diverse online course. The authors say: "Several international students have expressed frustration at being severely punished for their inappropriate citation of others' work according to the academic rules of the U.S. universities. They felt that the instructors lacked an understanding of the cultural differences in regard to educational practices" (Liu et al., 2010). Here we see the need for both cultural sensitivity and emotional intelligence; as instructors learn from their mistakes, hopefully that increasingly means front-loading the instruction, offering guidelines for decoding educational practices rather than reactive punishment of students who did not even understand that a coded message was in use.

Finally, dealing with language differences, or in the case of Alaska Natives building occasions for language practice into the curriculum: the project is one of preserving languages. This, I think, is the real value of online instruction. We have an opportunity to combine the talents of content experts and instructional designers in ways that are far richer and more productive than the solitary "sage on the stage." I joke that English is my only language, and for a non-native speaker I do OK. Therefore, I would need the help of a native speaker, and probably an instructional designer, to build language into any courses I wanted to create. However, the value is in both preserving the language and showing respect for the cultural diversity in the class. Course creation is necessarily an iterative and collaborative project.


Unit 2: Week 2 Weekly Writing

AnneMarie Mattacchione

September 22, 2014

Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies (2010) provides 50 independent effects for consideration. The meta-analysis describes each facet of the study and provides detailed metrics. I applaud the attempt to make sense of online studies measuring effects on student learning; however, I find myself questioning the validity. For example, the meta-analysis outlines in Exhibit 1, Conceptual Framework for Online Learning (p. 5), the comparison of like and enhanced assignments between the differing learning modalities. It indicates that some opportunities for learning were provided in one context but not in the comparison assignment. For example, "Live, one-way webcast of online lecture course with limited learner control (e.g., students proceed through materials in set sequence)" is compared to "Viewing webcasts to supplement in-class learning." It appears one set of students was instructed to proceed through the learning materials while another was given a choice about whether to use them. In my opinion, if we do not measure the same course assignments, then how is it possible to have a valid understanding of the student learning outcomes?

In the Early Childhood Education department at UAF, faculty have developed face-to-face, eCampus, and audio courses to meet the needs of various learning styles and to improve course accessibility. We measure each course's learning objectives with a mix of common assignments and teacher-selected assignments. Each course has at least one common assignment, which includes the same assignment instructions, rubric, and student feedback format. We can measure student learning across all modalities on this one learning outcome using the common assignment. However, I do not see how we can do the same for the teacher-selected assignments, since each teacher determines what assignment is used and how it is measured. I could not compare the effectiveness of another teacher's student learning outcomes with mine, since we do not have a common way to measure them. To me it would be like comparing apples to oranges. However, individual teachers can compare their teacher-determined assignment from year to year to understand the effectiveness of the assignment, instruction, and student supports. As a whole, I am not convinced this particular meta-analysis can make a valid determination of the effectiveness of online compared to face-to-face courses. I suppose I need to see that we are comparing apples to apples.

However, I find the ongoing conversation concerning the differing approaches to online learning interesting. What exactly do we mean by online learning? According to this study, blended learning seems to edge out both purely online and purely face-to-face learning.

"In fact, the learning outcomes for students in purely online conditions and those for students in purely face-to-face conditions were statistically equivalent. An important issue to keep in mind in reviewing these findings is that many studies did not attempt to equate (a) all the curriculum materials, (b) aspects of pedagogy and (c) learning time in the treatment and control conditions. Indeed, some authors asserted that it would be impossible to have done so. Hence, the observed advantage for blended learning conditions is not necessarily rooted in the media used per se and may reflect differences in content, pedagogy and learning time" (p. 17).

This particular statement aligns with what I believe takes the best of both the online and face-to-face worlds: combining the approaches that are best for students' learning styles. Some students confess during advising appointments that they do not like the online format of learning and benefit more from face-to-face contact with peers and teachers. They often cite a learning style that necessitates face-to-face interaction. Other students argue that flexibility of schedule is more important than meeting the needs of their particular learning style; they do the work online because it is most convenient for them, rather than taking the modality they prefer. Then there are students who feel most comfortable taking online courses, yet still complain that they feel isolated. When I added blended elements to my eCampus courses, student feedback was more positive about not feeling so isolated.

To me, meeting the student learning outcomes is not all there is to the learning experience. Another question I had about the meta-analysis concerns the comparison of student accomplishment. Did students learn more because they had higher grades? Did students learn more because the modality fit their learning style better? Will students retain the learning? I have earned high grades in all my courses since starting college many years ago, but can I remember the course content? Which content do I remember? For me, the most memorable learning opportunities involved the interdependent relationships between me, the teacher, and my peers, when I was able to construct my own knowledge in a social context. Sometimes that included online learning communities, but most often it happened in face-to-face settings. The point of learning is not to learn something once; it is to use the information in an application for future learning or performance. I would argue that students who are able to use their preferred learning modalities and styles are more capable of application and retention.

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.