Lack of Data on the Effectiveness of Online Education in K-12

After reviewing the report “Evaluation of Evidence-Based Practices in Online Learning” (U.S. Department of Education, 2010), I was struck by two major themes:

1. Good pedagogy is good pedagogy, regardless of the medium.

2. There is not a magic formula that can be followed to create an effective online classroom.

The good news is that online instruction has been shown to be an effective educational medium. The bad news for the U.S. Department of Education is that the report offers little guidance on specific practices for online education in the K-12 arena, mostly because of the lack of available data for this population.

The meta-analysis proceeded despite the lack of K-12 data, drawing on studies that were predominantly based on undergraduate and graduate-level college students, as well as adult learners receiving professional training. The authors divided the studies into three “learner type” groups: K-12, undergraduate, and other, and concluded that the age of the learner did not have a significant impact on the effects found. This is one area where I found the study lacking: I do not think that findings specific to adult learners receiving professional training translate to K-12. I would have liked to see the undergraduate category further stratified to examine studies of college freshmen, who would make a better comparison group, at least for high school students. Even so, the population of students attending college and graduate school is not necessarily representative of the population of students in K-12.

As I read through the report, I found myself more interested in the summaries of the individual studies than in the overall compilation of data. Some of the outcomes were quite expected: equivalence of curriculum and instruction played a major role in the level of effect seen (it’s hard to compare apples to oranges); online elements added to the curriculum mostly improved the learning experience; and adding prompts for students to reflect on their learning improved learning outcomes. When interpreting these effects, though, it is important to remember that studies were included only if they used objective measurements of student learning, which means content knowledge measured through some type of testing rather than student perceptions of learning or deep understanding of concepts. Narrowing the scope of the included studies is a necessary step in conducting a meta-analysis such as this, but it is worth keeping in mind that learning was measured through “testable” outcomes rather than other important aspects of learning. As an example, while the addition of an online discussion forum was not shown to improve measurable learning outcomes, that experience may have given students valuable insight and better prepared them for future discussion forums, whether online or face-to-face.

I was intrigued by the comment on page 2 regarding the motivation to improve students’ learning experiences through online education: “Another conjecture is that asynchronous discourse is inherently self-reflective and therefore more conducive to deep learning than is synchronous discourse.” I certainly appreciate having the time to reflect on what I would like to say and to express those thoughts in writing. This is an area I’ll explore further in the literature.

Overall, I believe the authors did a reasonable job with the very limited data available to them. A couple of general best practices gleaned from the report are to include prompts for student reflection and to give students some control over how they manipulate the online material. More broadly, there are no definite strategies an instructional designer should “always do” or even “always avoid” based on this report. The authors themselves do not claim that the study provides substantial guidance for online education in K-12; instead, they call for additional studies at the K-12 level. This is vitally important, as the number of K-12 students taking online courses will continue to increase.

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.

3 thoughts on “Lack of Data on the Effectiveness of Online Education in K-12”

  1. Owen

    Hi Lori,

Thanks for sharing your thoughts. As I read through your piece, this line jumped out at me: “Another conjecture is that asynchronous discourse is inherently self-reflective and therefore more conducive to deep learning than is synchronous discourse,” and I couldn’t help but think of the comments on the Newsminer, CNN, YouTube, etc. I’m not sure asynchronous discourse is inherently self-reflective. In a best-case scenario, or when implemented well, we can hope for that result.

  2. Bob

    “Another conjecture is that asynchronous discourse is inherently self-reflective and therefore more conducive to deep learning than is synchronous discourse”.

As you all saw last night, giving me a microphone is not a pretty thing. I do better when writing, so asynchronous discourse suits me. I mentioned in a previous post a course with Thom Easton. In that course we had weekly synchronous meetings, and all class discussion was done through the chat interface. Some goofy stuff got said, but, in truth, a lot of thoughtful stuff as well. I liked it because you had to think quickly and type aphorisms, though it took several classes for me to figure out how to do it. In that case the filter of chat forced some greater quality into our synchronous discourse. So I want us to be careful not to make too simplistic a thesis, too artificial a divide. Some of the success in the synchronous moments of that class had to do with the risks the teacher was willing to take. Like any good teacher, Thom had plan d, plan e, and so on, but that chat thing could have gone south. I remember him keeping multiple conversations going as he herded cats. Once I saw what he was doing, I started facilitating too, helping to pull threads back together and to tease out connections between different contributors. Surprisingly successful, but risky too. I suspect it would be even easier now, with people texting nearly all the time; the mechanical skill is rote, so one could focus on deepening the conversation. Nah, microphones are easier.

  3. Jenny

    Lori,
I agree with your finding from the reading that “There is not a magic formula that can be followed to create an effective online classroom.” I was hoping to have more “best practices” outlined to help me create more effective online courses, but it appears that there is too little research at the K-12 level to draw any meaningful inferences. I like your suggestion that studies of college freshmen be selected to get a better comparison, at least for secondary students. I am shocked by the lack of research on the effectiveness of online learning in the K-12 environment, given the current push for and presence of online classes there.

