A blueprint for creating a successful MOOC

Brick-to-click education is no longer a question of if; it is happening now. In a constantly growing global community, education modalities are evolving to meet the demands of a knowledge-thirsty and driven population.

One approach to meet these demands is the development of Massive Open Online Courses (MOOCs). MOOCs are intended to:

  • Host large numbers of students.
  • Accommodate an internationally diverse community.
  • Increase access for students from non-traditional regions.
  • Support international marketing and recruitment strategies for universities.

However, nearly a decade of experience with MOOCs has yielded mixed outcomes. The average retention rate for MOOCs is four percent of the enrolled class, which defeats the purpose of providing accessible, available, and equitable education.

Further, MOOCs have traditionally been offered as a one-time event. That raises questions about the sustainability of the approach.

4 keys to successful MOOCs
At St. George’s University, an international center for excellence in medical, veterinary, and environmental sciences, multi-course and multi-delivery MOOCs have yielded more than five times the average retention rates over a five-year period from 2013 to 2018. Here are the lessons we learned.

1. Develop the team
The old adage “no man is an island” holds true for MOOCs. From the very beginning of the course concept through post-course evaluations, MOOCs require a team with distinct skill sets: faculty, an instructional designer, and a content editor.

The faculty will serve as the academic resource for the course. The instructional designer will apply online technology and tools to course delivery. The content editor will finalize and produce the course materials in text, audio, and video formats.

2. Know your audience
MOOCs are free and open for enrollment to nearly anyone in the world, which makes it difficult to know your audience. To address this issue, survey the enrolled students prior to the course and/or gather demographic details of enrolled students so you can tailor the course to their interests.

Another approach is to focus the theme, content, and context of the course on the type and cohort of students you expect. Meeting the needs, interests, and expectations of your audience is a critical step in the design and delivery of MOOCs.

3. Choose the right learning management system
MOOCs are routinely delivered using established online platforms that host the courses and handle marketing and recruitment. Most online platforms include a template for content, as well as a list of tools offered to deliver the course. Identify a platform that has a successful track record, will be user-friendly to students, and provides a simple delivery approach.


Going beyond the hype: How AI can be used to make a difference

References to artificial intelligence (AI) have become a fixture of strategic higher-ed discourse, joining the terms “big data” and “predictive modeling.” When I was introduced to AI in 2013 by a member of our design team, it captivated my imagination. Since then, as our data grew to proportions that were ripe for AI, I’ve become enthralled by its potential to enrich the accuracy and personalization that leads to better outcomes. That does not make me an expert.

If anything, it could make me equal to all out there who have wondered what these terms actually mean, how they matter to education, and where to draw the line between hype and results.

Defining the terms

Artificial intelligence is the broader concept of machines being able to carry out tasks in a way that we would consider “smart.”

Machine learning (ML) is a current application of AI based around the idea that we should really just be able to give machines access to data and let them learn for themselves (Bernard Marr, author, speaker).

Note: Neither definition implies the machine outsmarts or replaces its human team.

AI in healthcare

Like higher ed, healthcare systems are complex, tethered to random human behavior, constantly evolving, and looking to technology to help improve outcomes for all. I found the following healthcare example helpful for visualizing how AI makes it possible to take human expertise and replicate it at scale.

In a healthcare experiment, AI scientists worked with experts at diagnosing a certain type of lung cancer. Collectively, these teams converted expert knowledge into a set of rules and decision trees for reading a lung cancer X-ray and determining diagnosis. In the end, the machine “student” outperformed the very experts who designed its rules.
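
The article does not describe the team’s actual tooling, but the general technique it points to (converting expert judgments into features and fitting a decision tree to expert-labeled cases) can be sketched in a few lines of Python with scikit-learn. The features, data, and labels below are invented purely for illustration:

    # Illustrative sketch only: a decision tree fit to expert-labeled
    # examples, mirroring the rule-based approach described above.
    # Feature names and data are hypothetical, not the study's.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Each row encodes features an expert might read off an X-ray:
    # nodule size (mm), opacity score (0-1), irregular border (1/0).
    X = [
        [22, 0.9, 1],
        [5, 0.2, 0],
        [17, 0.7, 1],
        [8, 0.3, 0],
        [30, 0.8, 1],
        [4, 0.1, 0],
    ]
    y = [1, 0, 1, 0, 1, 0]  # 1 = suspicious, 0 = not suspicious (expert labels)

    model = DecisionTreeClassifier(max_depth=2, random_state=0)
    model.fit(X, y)

    # The learned tree is itself a set of readable rules, much like the
    # decision trees the experts wrote down by hand.
    print(export_text(model, feature_names=["size_mm", "opacity", "irregular_border"]))
    print(model.predict([[19, 0.6, 1]]))  # classify a new, unseen case

The point is not the particular library but the workflow: expert knowledge defines the inputs, labeled cases train the model, and the result scales that expertise to far more cases than the experts could review themselves.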

For some people this outcome seems obvious and acceptable. For others, the thought of relying on a machine diagnosis over a trusted doctor is not acceptable. The reason the machine outperforms the human, while following the same rules, is that the machine is free from bias, second-guessing, fatigue, or distraction. The AI machine becomes a critical member of the team—not a replacement.


6 big-impact technologies on the higher-ed horizon

Analytics technologies, makerspaces, and redesigned learning spaces are just a few of the numerous technology developments and trends outlined in a preview of the forthcoming annual Horizon Report.

The Horizon Report was on shaky ground after the New Media Consortium unexpectedly shut its doors in early 2018, but EDUCAUSE acquired the rights to the report and continued the research.

The annual report outlines issues, technologies, and trends that higher-ed leaders should follow and keep in mind as they outline institutional priorities.

It includes three parts: key trends accelerating technology adoption in higher education, significant challenges impeding technology adoption in higher education, and important developments in educational technology.

Ed-tech trends

Higher-ed trends are organized by their expected time to impact. Advancing cultures of innovation and cross-institution and cross-sector collaboration are long-term trends expected to drive ed-tech adoption for five or more years.


Students get smart about college ROI

Financial concerns are consistently identified as a top roadblock to higher education, and for good reason—securing scholarships and financial aid, along with carrying burdensome student loans, can overwhelm students before they even earn a degree.

Research shows that almost 3 million students drop out of school each year due to financial constraints. It also reveals that more than half of institutions either lack automated scholarship management or don’t know what system they use for it.

Most K-12 district leaders say it’s important to create a college-going culture, but they also say concerns about paying for college prevent many of their students from applying in the first place. Students say another top barrier is difficulty matching potential careers to their interests, which is something reports about overall ROI, including career outlook, can address.

Students are increasingly concerned about their ROI, or the return on their investment in education, once they earn a degree. A handful of new tools can help them get a better idea of the financial reality attached to higher education.


How to maintain the balance between security and privacy

We’re in a unique moment in history, where the negative consequences of organizations tracking our digital traffic are painfully clear. It’s certainly understandable that “security measures” can seem to many people more like intrusive surveillance than personal protection. But a lack of defenses will also have negative consequences for our safety and feeling of trust.

What can security professionals in higher ed do to maintain the balance between safety and privacy? Is it possible to maintain trust in the institution and yet enable users to explore safely?

The importance of context

Consider security and safety analogies in the physical realm such as security guards or checkpoints. Everyone has his or her own sense of what seems obtrusive and what is welcome. There are questions that can help predict where security measures will fall on the acceptable-to-intrusive continuum:

  • Is the area being secured… a personal area? a public space? a sensitive administrative department?
  • If the secured area is public, are you inspecting everyone and everything and removing whatever or whomever could be considered suspicious? Or are you checking a list for specifically dangerous people or items?
  • Are the criteria decided fairly and applied equally? Are there effective methods to correct and augment the list quickly if there are errors or omissions?
  • Are records kept of everyone and everything that entered and exited this area?
  • Are security measures applied by an outside authority, or can people apply them to protect themselves?

Generally speaking, public or personal areas are expected to operate with little to no proactive controls. As long as people have access to effective and timely reactive measures, a sense of safety can be maintained. Sensitive areas are expected to be under a certain amount of scrutiny, as long as that scrutiny is applied fairly and transparently.
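
One way to make that continuum concrete is to imagine how a policy team might encode it. The sketch below is hypothetical and illustrative only (the area categories and posture levels are not from the article or any standard); it simply shows how the answers to the questions above could map to a default control posture:

    # Hypothetical sketch: mapping area types to a default control posture
    # along the continuum described above. Categories and levels are
    # illustrative only, not a standard or a product configuration.
    from dataclasses import dataclass

    @dataclass
    class Posture:
        proactive_controls: str  # e.g., "none", "block-list checks", "default-deny"
        activity_logging: str    # e.g., "minimal", "selective", "full audit trail"
        reactive_support: str    # what users can invoke for themselves

    POSTURES = {
        "personal":  Posture("none", "minimal", "self-service reporting and opt-in tools"),
        "public":    Posture("block-list checks", "selective", "timely incident response"),
        "sensitive": Posture("default-deny", "full audit trail", "audited change requests"),
    }

    def posture_for(area_type: str) -> Posture:
        # Look up the expected posture for a given type of area.
        return POSTURES[area_type]

    print(posture_for("sensitive"))

The value of writing it down this way is that the criteria become explicit and reviewable, which speaks directly to the fairness and transparency questions above.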

Context in action

In an educational environment, there are areas that must be publicly accessible and relatively unrestricted and areas that should remain private to the individuals or groups who use that space. There are also areas that should be tightly controlled, such as financial, healthcare, and administrative information.

In areas that should be tightly controlled, few people would take issue with closely monitoring activity and restricting users to the actions strictly required for those necessary, sensitive tasks. The opposite extreme would be personal repositories or computers within housing areas of your network, which should have minimal monitoring or restriction. Most other systems, machines, and users fall somewhere in between.


Suite of 8 digital badges highlights 21st-century skills

A new set of digital badges aims to help educators and industry leaders better evaluate students’ 21st-century skills.

The 21st Century Skills Badges initiative, from Education Design Lab, is based on three years of research, design, and pilots and offers a suite of eight digital badges, often called microcredentials, along with a facilitator’s toolkit, to help educators and employers understand the skills students have cultivated.

Education Design Lab partnered with 12 universities and 50 employers to develop the badges. Students can display them on LinkedIn accounts or resumes.

Employers say they have an increasingly difficult time finding highly qualified applicants for job openings, particularly in STEM fields such as computer science. The digital badges are designed to be “machine readable” by the search algorithms recruiters use to identify potential job candidates.
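
The article does not publish the badge schema, but a hypothetical, machine-readable badge record, with the sort of fields a recruiter’s search tooling could filter on, might look something like the sketch below. The field names are invented for illustration and do not reflect Education Design Lab’s format or any badging standard:

    # Hypothetical badge record; field names are illustrative only and do
    # not reflect Education Design Lab's schema or any badging standard.
    badge = {
        "badge_name": "Critical Thinking",
        "issuer": "Example University",   # placeholder issuer
        "issued_on": "2019-05-01",
        "skill_tags": ["critical thinking", "problem solving"],
        "evidence_summary": "Capstone project assessed against a published rubric",
    }

    def matches(record: dict, required_skills: set) -> bool:
        # The kind of simple filter a recruiter's search tooling might apply.
        return required_skills.issubset(set(record["skill_tags"]))

    print(matches(badge, {"critical thinking"}))  # True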


Unbundling the 4-year degree: How to design education for the future

Employers are in desperate need of skilled workers to address current employee shortages and prepare for projected disruption in the workplace. For example, artificial intelligence will create 2.3 million jobs while eliminating 1.8 million by 2020, according to a 2017 Gartner report.

To fill jobs now while preparing for the future, countless organizations are rethinking how students learn and earn skills in postsecondary education. Such a change requires new mindsets for institutions and businesses.

The rise of micro-credentials

Perhaps the biggest trend that has the attention of colleges and universities is “micro-credentialing,” as enrollment continues to decline in traditional college degree and master’s programs.

“As we move toward an ecosystem of skills- or competency-based hiring, employers will care less about the degree itself,” says Kathleen deLaski, founder and president of the nonprofit Education Design Lab (The Lab), an organization that works with more than 70 institutions and employers to prepare students to fill jobs, and offers opportunities to earn employer-desired skills. “For liberal arts degrees particularly, institutions have to think about how to compete at the competency level, not the degree level, because that’s what consumers will expect in many disciplines.”

deLaski believes that departments offering majors tied to regulated industries that require degree-level certification, such as pre-med and K-12 teaching, may find it easier to keep their full degree requirements in place.

“Many proponents of student success argue that employers will continue to require a degree for most roles,” she says, “but once employers start accepting ‘shortcuts’ or ‘alternatives,’ and once competency-based hiring gains steam, the pace of disruption will quicken.”

The tight labor market already has employers like IBM, Walmart, and Amazon experimenting with alternatives to the four-year degree. In response, higher education is rethinking the value of the degree, accelerated by pressures like the Internet of Things, automation, student debt, and wage stagnation. In fact, a recent Wall Street Journal poll found that less than half of Americans believe that a four-year degree is “worth it.”

The drivers of change

To create real change in the higher-ed landscape, deLaski outlines key areas that have potential to transform higher education toward the future of work.

Microcredentials for 21st century skills

Microcredentials in higher education have exploded in the past couple of years; one in five colleges now offers digital badges. To support institutions and employers in defining standards for badges, Education Design Lab designs and tests rigorous courses that enable students to hone the skills employers say are in demand.


9 trends shared by innovative community colleges

Supporting mobile devices is a top priority among a majority of community colleges surveyed in the Center for Digital Education’s annual Digital Community Colleges Survey, which offers an inside look at schools’ technology and innovation priorities.

Other priorities include cybersecurity tools and testing, redesigning or upgrading websites, upgrading classroom technologies, digital content and curriculum, and disaster recovery/business continuity.

According to the survey, 34 percent of community colleges have a strategy in place for the use of mobile devices; 35 percent have a full-time chief information security officer or a similar full-time role; 71 percent of surveyed community colleges’ websites have responsive web design; and 88 percent have off-site data storage redundancies in place.

The survey names three winners, grouped by enrollment size, based on how they use a range of technologies to improve services for students, faculty, staff, and the community. The three winners are J. Sargeant Reynolds Community College, Va. (Enrollment 10,000 students or more); Walters State Community College, Tenn. (Enrollment between 5,000 and 10,000 students); and Manchester Community College, N.H. (Enrollment of 5,000 or fewer students).


A quick look at cloud terminology

The cloud may be easier and more affordable than advertised, but it isn’t free. Still, computing horsepower is finally a virtual (or, perhaps more appropriately, a virtualization) bargain. It’s entirely possible for your college or university to spend $10K a month, roughly $120K a year, and tap enough power to drive a 1,000-user organization. That’s less than the cost of hiring a single engineer, even if the figure may sound like overkill given today’s budget realities.

It’s essential to place your applications and data in a maximum-security environment. Hosting plans should be designed expressly to deliver both data integrity and data protection, deploying technologies such as clustered firewalls and intrusion detection and prevention software, which is capable of detecting threats to sensitive client data that even the best firewall won’t catch. And as cyber threats become ever more insidious, those in higher education are looking to implement systems that go well beyond basic malware and antivirus “solutions.”

In IT, as in higher education, language is everything. Teaching undergraduates is tough enough; most university administrators would prefer not to wade into the fog of IT, especially given just how opaque the tech world has become. Familiarizing yourself with some basic terminology won’t turn you into an expert, but it can provide a grounding in the fundamentals. With that in mind, let’s look under the hood and decipher some of the more pervasive and vexing terms.

Public cloud? Private cloud? Hybrid cloud?

As the cloud has expanded, it has more or less subdivided into three models:

  • Private: proprietary infrastructure that is internal to one organization.
  • Public: service providers make applications and storage available to any business over the Internet, typically for a monthly usage fee.
  • Hybrid: a blend of public and private. Some of your workload is under your control, some is outside it, and some situations mix the two. These days, the hybrid cloud is ubiquitous.
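
As a rough illustration only (not a framework from the article), the basic trade-off can be sketched as a rule of thumb in code: keep sensitive, steady workloads in-house, push non-sensitive ones to rented capacity, and mix the two when a sensitive core needs elastic overflow.

    # Illustrative rule of thumb only; real placement decisions also depend
    # on the strategic, compliance, and budget questions discussed below.
    def suggest_cloud_model(handles_sensitive_data: bool, highly_variable_load: bool) -> str:
        # Suggest a deployment model for a single workload.
        if handles_sensitive_data and not highly_variable_load:
            return "private"   # keep it internal, under your control
        if not handles_sensitive_data:
            return "public"    # rent capacity for a monthly usage fee
        return "hybrid"        # sensitive core in-house, burst capacity outside

    print(suggest_cloud_model(handles_sensitive_data=True, highly_variable_load=True))  # hybrid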

The right question isn’t, “Should I opt for the hybrid cloud, the public cloud, or a private cloud?”

The smart question is, “What’s strategically best for my institution?”


Why basing college accreditation on outcomes, not inputs, makes sense

To understand their children’s progress in school, reasonable parents would likely look at their report cards, ask them how things are going, or talk to their teachers. But, were they to evaluate their children’s education the way colleges and universities are evaluated, parents would instead be metaphorically digging through their children’s backpacks and assessing the quality of their pens and paper.

Accreditation—the critical system for assessing whether institutions qualify for federal financial aid—is too focused on inputs, such as faculty degrees and shared governance, and pays far too little attention to outcomes, such as graduation rates, job placement rates, and wage growth. This input-driven approach results in higher costs, as schools add complexity and features in order to meet accreditation standards. But it doesn’t protect students from enrolling in schools with low completion rates and weak return on investment.

As Congress looks to reauthorize the Higher Education Act, it has the power to change accreditation for the better.

How did we get here?

Our current system of accreditation was initially designed by colleges themselves, well before federal involvement in higher education. In the late 19th century, higher-education enrollment was expanding and the number of colleges was proliferating. This created problems for schools accepting transfer students from other institutions. As new colleges sprang up, there was considerable variability across even basic elements of their programs, such as whether high school completion was a requisite for admission. Accreditation was developed as a voluntary peer-review system to ensure that colleges within a given association had similar standards of preparation and academic rigor, thus facilitating transfers and articulating credits.

In the mid-20th century, the federal government began investing in higher education, first with the GI Bill, and then even more significantly with the Higher Education Act of 1965 (HEA). The government leaned heavily on the existing system of accreditation in order to determine which programs would be eligible for federal funds. Accreditation, which began as a private, voluntary system to solve specific problems of interoperability among institutions, now became the mechanism to guide federal investments in higher education.
