Are open online education and quality online education mutually exclusive?

In the past, I’ve touched on a distinction I see in the landscape of higher education. It is this distinction that leads me to say that programs like Coursera and edX on the one hand, and Udacity and the Georgia Tech OMS on the other, are not competitors, but rather represent two largely different goals of education: openness and quality.

Of course, I hate using the word ‘quality’ because it implies that open education cannot be high-quality, which is not what I mean to suggest. Rather, what I mean to suggest is that openness and quality often get in the way of one another. Developing open courses for a platform like Coursera almost inherently dictates that costs must be extremely limited. Offering a course through Coursera does not bring in a tremendous amount of money; even the Verified Signature track, I would speculate, barely pays for the human effort required to grade assignments and verify identities. Developing open courses can be an act of either marketing or altruism, but in either case, there is a natural impetus to keep costs low. The outcome is nonetheless fantastic: the world’s knowledge presented by the world’s experts on that knowledge in a venue that everyone can access. Even if the cost pressure demands that this information can only be presented in the traditional lecture model, the result is still incredibly desirable.

That openness is largely driven by the internet’s ability to deliver content to massive audiences at low cost. However, that’s not the only thing the internet can do in service of education. The internet also has features and frameworks that can create educational experiences that go beyond what we can do in traditional classrooms. Many traditional college classes are delivered in the same lecture model as the aforementioned Coursera courses, but pedagogically we know that this model is largely ineffective. It is not chosen because it is effective, however; it is chosen because professors’ time is valuable, professors are very often experts in the subject matter rather than in teaching itself, and the lecture model is arguably the easiest way to present material. There are exceptions, of course, but I don’t think I’m being controversial in suggesting these ideas as generally true.

What the internet gives us, however, is a mechanism by which content can be produced once to be consumed by millions. This is part of the reason the openness initiatives work: professors can film the course once and make it available to the masses rather than having to reteach it semester after semester. But while in some places that is an impetus for openness, we may also use it as an impetus for quality. Let’s invent some numbers to make this clearer. Imagine a class of 50 students, each paying $100 to take it; the class brings in $5,000 per semester, so it can cost no more than $5,000 to deliver each semester. However, if the class could be developed once and reused ten semesters in a row, then that same class can now cost up to $50,000 to develop, allowing for much more investment in the quality of the class.
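To make that invented arithmetic concrete, here is a minimal sketch of the budget model; the figures (50 students, $100 apiece, ten semesters of reuse) are the made-up numbers from above, not real costs.

```python
# Toy budget model for the invented numbers above: per-semester revenue
# caps per-semester cost, but reuse across semesters raises the budget
# that can be spent on developing the course up front.

def development_budget(students_per_semester: int,
                       tuition_per_student: float,
                       semesters_of_reuse: int) -> float:
    """Total amount that can be invested in building the course once."""
    revenue_per_semester = students_per_semester * tuition_per_student
    return revenue_per_semester * semesters_of_reuse

# One semester, no reuse: the class can cost no more than $5,000.
print(development_budget(50, 100, 1))   # 5000.0

# Developed once, reused for ten semesters: up to $50,000 of development.
print(development_budget(50, 100, 10))  # 50000.0
```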

This, of course, is a gross simplification, but it is intended to portray an elegant truth: when we use the internet to deliver content to a much larger population with the same amount of work, we can either pass on the savings to the students (the openness route), or we can reinvest the money into the development of the courses themselves (the quality route). We can ask less investment of the students, or we can give the students more for the same price.

Coursera, edX, and the traditional MOOC community take the former approach, providing content for a fraction of the cost because it can be delivered to so many people. Udacity, the Georgia Tech OMS, and other more expensive programs take the latter approach, reinvesting that money into creating higher-quality programs in the first place. Both of these sides are critical. I don’t like living in a world where education is gated by such a massive monetary investment, and MOOC services are doing a world of good to reduce the barriers to education. At the same time, I love education itself, and I recognize that there are phenomenal things the internet can do to improve education, but they come with a significant development cost.

Of course, this hasn’t actually answered the question: I’ve shown how openness and quality are distinct and often conflicting goals in online education, but can we accomplish both? Is it possible to create high-quality education that is also openly available for little to no monetary cost? It may be. At present, this is in some ways what the Georgia Tech OMS is doing: nine Georgia Tech courses are available for free to the world, and they are infused with a larger initial investment that pays real dividends in the quality of the instruction. This is accomplished because, in some ways, the free offering is “subsidized” by the students taking the actual Masters. This model is incomplete, however, as there is still valuable education locked within the for-cost program. OMS students are not paying for the videos; they are paying for access to professors and TAs, access to projects and assignments, and the ultimate “verified certificate”: the Masters diploma at the end of the program. However, this direction at least illustrates that it may be possible to use one offering in service of the other and improve both openness and quality at the same time. For now, however, I regard the two as distinct, exclusive, and desirable goals.

Should there exist accreditation for independent online courses?

One of the earliest takeaways from the handful of online courses I’m taking at the moment (Emerging Trends & Technologies in the Virtual K-12 Classroom, Learning How to Learn, and Astronomy: Exploring Space & Time, as well as a few days in Planet Earth and You) is that the scope of these courses varies radically. Emerging Trends can be completed in a day if desired. Astronomy: Exploring Space & Time requires a much more significant time investment for the videos, but the interactive elements are relatively light, restricted to short quizzes and writing assignments. Planet Earth and You was much more significant, and decently approximated the amount of time I recall dedicating to traditional on-campus courses.

On the one hand, this is fantastic. Online learning has long had the strength of not having to fit lessons into arbitrary, pre-determined time slots: if a topic takes more than a class period to teach, it’s not necessary to break it up halfway through, and if it takes less than a class period, it’s not necessary to pad it out or awkwardly combine it with another topic. These courses reflect how that same idea can be expanded to an entire course. Not every course needs to be a three-hours-per-week, 16-weeks-per-semester course. If a topic can be learned in 10 to 20 hours total (as Emerging Trends’ “2-4 hours per week, 5 weeks of study” guideline indicates), then let it be learned in that time frame.

On the other hand, what does that say about how the world interprets these courses? A Bachelors is a Bachelors, a Masters is a Masters, and there exists some general understanding of what those degrees mean. The school from which a degree came has some influence, but universities have spent decades building up reputations to differentiate a Stanford Bachelors from a Samford Bachelors. Moreover, there are around 2500 four-year universities in the United States, and while that’s a large number, it isn’t an intractable one as far as developing an understanding of the different equivalence classes of universities.

There are currently around 1500 Udacity, Coursera, and edX courses combined, just to take three of the biggest organizations as examples. In the three years since these three organizations launched, they’ve developed almost as many courses as there are four-year institutions in the country. If the scope, depth, and rigor of individual open courses on these platforms are going to vary this much, how then will the world learn to interpret what it means to have a credential or certificate from these courses?

To put myself in the position of an employer, if a prospective employee had a verified certificate on their resume from a Coursera course I had not heard of, that would presently be somewhat meaningless to me; this is not because the certificate has no value, but because I have no hope of knowing the certificate’s value at a glance, nor is it feasible to maintain a comprehensive knowledge of all the open courses I might see. This is a tremendous challenge to the value of these programs. Paying for a verified certificate is, for many, based on the belief that the ability to prove you completed a course is powerful. But if the people to whom you would offer that proof have no sense of what kind of knowledge and achievement the certificate represents, it remains somewhat meaningless.

This discussion comes dangerously close to the general discussion of accreditation. How does an employer know a certain college degree is valuable? Because it has been accredited by an independent organization. That knowledge of the program’s value and rigor has been offloaded to an external group for assessment. Just as a bank checks with a credit agency before deciding whether to give a person a loan, so a business implicitly checks with an accreditation group before offering a graduate a job.

I wouldn’t argue that we need accreditation in the traditional sense for online courses; after all, many courses that wouldn’t pass a pass/fail accreditation process are nonetheless very valuable, even if they don’t necessarily demonstrate anything reliable about the students themselves. Based on my first impressions, I can’t say that hearing that a teacher has taken Emerging Trends & Technologies would change my impression of them much, but there is still a lot they may have learned in the course. Rather, I feel what might be necessary is just a somewhat standardized classification system. Just as on-campus classes are assigned a number of credit hours based on the amount of work they require, online classes could be assigned a number or classification of virtual credits based on the rigor, reliability, and scope of the course itself. That, in turn, might lead to even greater programs: instead of a single university building up a Coursera specialization, a specialization could instead be assembled from multiple universities’ courses on the basis of their virtual credits.
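Purely to illustrate what such a classification might enable, here is a minimal sketch of assembling a specialization from multiple providers’ courses on the basis of their virtual credits; the credit values, university names, and the simple largest-first selection rule are all hypothetical, not part of any existing system.

```python
from dataclasses import dataclass

# Hypothetical illustration of the "virtual credits" idea: each open course
# carries a standardized rating of its rigor and scope, and a specialization
# is assembled from multiple providers' courses until a target number of
# credits is reached. All universities and credit values are invented.

@dataclass
class OpenCourse:
    title: str
    university: str
    virtual_credits: float  # standardized rating of rigor, reliability, scope

catalog = [
    OpenCourse("Astronomy: Exploring Space & Time", "University A", 3.0),
    OpenCourse("Emerging Trends in the Virtual K-12 Classroom", "University B", 0.5),
    OpenCourse("Planet Earth and You", "University C", 3.0),
    OpenCourse("Learning How to Learn", "University D", 1.0),
]

def assemble_specialization(courses, target_credits):
    """Pick courses (largest credit value first) until the target is met."""
    chosen, total = [], 0.0
    for course in sorted(courses, key=lambda c: c.virtual_credits, reverse=True):
        if total >= target_credits:
            break
        chosen.append(course)
        total += course.virtual_credits
    return chosen, total

selection, credits = assemble_specialization(catalog, target_credits=6.0)
for course in selection:
    print(f"{course.title} ({course.university}): {course.virtual_credits} credits")
print("Total virtual credits:", credits)
```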

But whatever the solution, I feel the problem nonetheless exists and is waiting to be addressed: accreditation fulfills a function in traditional education; should online education have something to fulfill the same function?