With an abundance of tech solutions and tools, IT must ensure systems support their institutions’ goals effectively and affordably. Here’s how.
[*Editor’s Note: This article appears in our March/April digital publication. See all the articles, as well as this month’s Symposium on “Whither Liberal Arts” here: http://ecampusnews.eschoolmedia.com/current-issue/.]
Just two years ago, higher ed CIOs were scrapping for a seat at the table. Now they’re firmly in the hot seat.
In that short span, IT has become so central to campus operations that its performance has a direct impact on the quality of teaching and learning. Furthermore, with the consumerization of IT, faculty and student expectations have risen dramatically, leaving IT little room for error, whether in the wireless network or the LMS interface. To thrive within this cauldron, IT must develop policies and best practices to help it evaluate, implement, and then re-evaluate the systems on campus. eCampus News looks at 10 keys for success:
1) Maintain tech flexibility. Like it or not, students and faculty expect tech services that rival what they can find off-campus. Meeting those expectations is almost impossible if your school is tied down by unwieldy legacy systems. “It used to be that our environment was one massive ERP system like Banner or PeopleSoft, and we would use only the tools that were inherent to Banner or PeopleSoft,” said Paige Francis, CIO of Fairfield University in Connecticut. “As we rely on more and more technology solutions, we’re now looking for smaller components that might integrate with larger components. We are at a point where we have decided not to count anything out.”
As long as additional components are compatible with Banner (the release of web-based Banner XE is expected to make interoperability far simpler), Francis treasures the flexibility of being able to scout other solutions. The school is currently looking at Salesforce.com, for example, as a possible new CRM.
In the fizzing tech sector, it’s also important to remember that today’s hot product may be yesterday’s news. Schools should be wary of entering into long contracts with companies that could become has-beens in a few years. “The last thing that you want is to get roped into a 10-year contract,” explained Francis. “I would take a one- to three-year contract, but I would also want language that gives us an out in certain circumstances, because technology is moving so quickly right now.”
2) Establish requirements. It’s no good evaluating potential tech solutions if IT is unclear what problem it’s trying to solve in the first place. “It’s really important that due diligence is done at the very beginning before you even look at a system and that you understand what you’re trying to accomplish,” explained Toni Raftery, associate program director for the MBA and Master of Science in Leadership (MSL) programs at Norwich University in Vermont.
As an example, Raftery pointed to Norwich’s contract-renewal process for adjunct faculty, which was beset by problems. “Before I even looked at a technology, I wanted to map out the process,” she recalled. “I got all the major stakeholders into a room, and we spent a half day talking about how we could make the process better.” In the course of this research, she unearthed myriad workflow issues and unnecessary routing of documents.
She also discovered that processing the contracts each term took 14 days, cost $2,670, and consumed 61 minutes of staff time per contract. Only after Raftery had a firm grasp of the problem at hand did she begin her search for a tech solution. Ultimately, the school selected OnBase, which was already in use in the admissions office.
“Before implementing any technology, the most important thing is the process,” advised Raftery. “If it’s a bad process and you try to throw it into a technology system without doing all the pre-work, it’s going to be a disaster. Technology isn’t always the answer.”
It’s also important not to develop a requirements document in a vacuum. Colleagues at other institutions and organizations can often share insights (see Talk to Other Institutions, below), as can vendors. At the University of Wisconsin-Madison, IT sends a Request for Information to vendors to help the school flesh out its requirements document.
“We put an RFI out so that vendors can tell us the state of the most contemporary e-commerce system, for example, along with the latest features and benefits,” explained Brian Rust, communications director for the Division of Information Technology at UW-Madison. “We may say, ‘Gosh, we hadn’t thought about this other aspect of e-commerce, but it would be great to include it in our Request for Bid.’”
3) Secure buy-in. Nothing, it seems, ticks faculty off more than the feeling that IT is shoving tech solutions down their throats. Indeed, failure to secure buy-in (along with lack of faculty training) may be the leading cause of tech death on campuses. The process of winning faculty and staff support needs to be an integral part of how IT operates, and success begins with ensuring that feedback loops are in place (see Solicit Feedback, below). If faculty and staff feel they’ve been heard, they are far more likely to sign up for the ride.
Even with established feedback loops, additional strategies can increase the chances that a system will be successfully adopted on campus.
As viral marketers have discovered with online media, it’s worth targeting those taste-makers on campus whose opinions are highly regarded. It’s equally important to pull in those naysayers who are most likely to torpedo an initiative, as Raftery learned during her faculty-contract project. “When I put my team together, I purposely chose people who would probably resist change the most,” she recalled. “Even though it definitely made some of the meetings more difficult, it was key to implementing the new system. I made sure they were involved from the start and I let them get all their concerns out.”
4) Analyze the vendor, not just the product. It’s important to remember that, in most cases, a school is not just buying a product: It’s also hiring the company that will likely provide service for the product. It’s worth researching the service reputation of vendors as part of the due diligence, and looking for clues that the school’s business is actually important to the vendor. When Norwich University was shopping for a new LMS, for example, vendors were invited to fly up to the Vermont campus to demonstrate their products. “Flying to Vermont is quite a thing,” explained Raftery. “It’s like, ‘How badly do you want our business?’” Four vendors accepted the invitation; one, whose product was one of the initial front-runners, declined. Needless to say, it didn’t get the contract.
5) Solicit feedback and more feedback. The need to obtain feedback from stakeholders and constituents is hammered home at almost every gathering of higher ed IT leaders—for good reason. Feedback is far and away the most important element in ensuring that IT’s systems provide value and support the school’s mission. Yet faculty and staff at institutions nationwide continue to cite IT’s failure to consult them as one of its biggest weaknesses.
How IT shops solicit feedback from their constituents varies from institution to institution, often depending on their size. At smaller institutions, the process is often more casual and organic, but establishing formal feedback loops ensures both transparency and inclusiveness.
Regularly scheduled meetings, either in groups or one-on-one, have proven their worth. At Fairfield University, for instance, Francis meets once a semester with the deans and the associate deans from each school and college to gather feedback. It’s a similar story at UW-Madison. “We talk to customers quite a bit about how their work habits are evolving, and how their requirements are changing,” said Rust. “We ask them whether the tools, services, and resources are in need of change.”
Most schools also have a committee that serves as a liaison between faculty and IT. At Fairfield, for instance, the Educational Technologies Committee compiles faculty feedback on new systems and IT issues. “It’s faculty-led, faculty-driven,” noted Francis. “A few of us from administration attend but we don’t have voting privileges. We’ve got a good mix of individuals around the table.”
As the customers ultimately footing the bill, students need a voice, too. UW-Madison operates a Student Advisory Group of about 25 students who meet every month. “We ask them for feedback on a variety of specific topics,” said Rust. “That feedback is channeled back into adjustments and improvements to the services we provide.”
The other tried-and-true feedback mechanism is surveys. UW-Madison does an annual survey of a sample of faculty and staff to get their input on core IT services, and also conducts an annual survey of students. For its part, Fairfield participates in the annual EDUCAUSE Center for Applied Research (ECAR) survey, which goes out to a sampling of faculty and students on campus. “It allows us to get feedback not only on what we’re doing right, but also to gather information about what tools individuals are using,” said Francis.
But Francis cautions against surveys that don’t include a text box. “You need to give people that open forum to give you feedback on how things are working,” she noted. “It can either identify, ‘Hey, we need to locate a new solution,’ or ‘Here’s a training opportunity.’”
6) Talk to other institutions. No school on its own can hope to stay on top of the rapid developments in the tech world, so it’s important to tap the knowledge and experience of colleagues at other organizations. UW-Madison, for example, looks to a variety of outside groups including EDUCAUSE; the Committee on Institutional Cooperation, which comprises the Big Ten schools plus the University of Chicago; as well as various professional organizations that cover everything from networking to IT security.
“We keep in regular contact with them regarding best practices and issues,” said Rust. “We discuss what issues we’re seeing, plus we keep an ear toward what’s going on in the commercial sector.”
It’s a similar story at Fairfield University, which turns to sister schools in the Northeast Regional Computing Program (an affiliate of EDUCAUSE), as well as the 28 institutions in the Association of Jesuit Colleges and Universities. “When I have questions about any service or software, I get immediate feedback from 27 of my peers, who may be a step ahead of us or a step behind,” said Francis. “We really rely on each other for these decisions.”
7) Conduct pilots. In this era of heightened service expectations and cost containment, no school can afford to put in a system that simply doesn’t cut it. “We don’t just plug-and-play and then stop using a system if people don’t like it—we don’t have that luxury,” said Rust. “Our audiences have extremely high expectations for reliability, security, integrity of data, and processes.”
Enter the pilot project. Pilots not only give schools a chance to test a system before they invest in it—but they also give IT a chance to work out any kinks. Fairfield University, for example, has just completed a cutting-edge classroom featuring mobile furniture, multiple whiteboards, and a projection system that can be controlled from any device. For now, teachers can only sign up to use the room for one-off trial classes, and must agree to provide feedback on the room’s performance.
“It’s something we can do upfront before we mirror this classroom across campus,” explained Francis. “We’re doing it in baby steps to find out what we’re doing right before we invest that money.”
A pilot project also helps introduce the product to the broader campus community in a non-threatening way. “Whenever I launch a project, I always run it as a pilot first,” said Raftery. “People are a little more forgiving because nothing works exactly the way you think it’s going to.”
8) Set goals and measure performance. The task of measuring IT performance is best divided into two spheres: technical infrastructure and public-facing systems. For the most part, the performance of technical infrastructure, such as networks or virtual servers, can be minutely monitored. Monitoring tools exist for virtually every layer of the infrastructure, and they often provide real-time visibility into the performance of IT systems. Given the availability of all this performance data, the decision to keep, upgrade, or replace a system can be calculated with a certain degree of precision.
Establishing metrics for public-facing systems, on the other hand, is far muddier. “It’s easier to monitor the network than it is to monitor a researcher’s satisfaction with the suite of software and other tools that we make available,” said Rust, who emphasized the importance of feedback to help gauge the performance of these systems. “When you start benchmarking an LMS, say, we may set a combination of more technical metrics—uptime and response time—and softer benchmarks including satisfaction with a particular system or set of tools.”
Given these “softer benchmarks,” calculating a precise return on investment is largely a pipe dream. “Calculating ROI is almost impossible,” said Francis. “At the administration level, so much of it is intangible. Although we may identify a solution that saves a student time and possibly money, we never see those savings because it’s the student’s time and the student’s money.”
9) Train faculty and staff. Campus IT could install the best technology solution on the planet and it wouldn’t be worth a hill of beans if faculty and staff were unable or unwilling to use it. It’s also no secret that faculty training is one of the toughest tasks on campus—technology is simply not a faculty focus. While IT’s first responsibility is to implement solutions that are as intuitive as possible, that’s only the beginning.
“Many people in higher ed don’t focus enough on the training of faculty and staff on these new tools,” said Francis, who noted that recent ECAR surveys showed that Fairfield students felt faculty don’t have enough knowledge about the technology they’re using in class. “That was one of the biggest indicators that we need to focus more on training. We’re not going to roll something out unless we also have a bag of money to train our campus.”
10) Live with compromise. No matter how many surveys, product assessments, and requirement-gathering sessions are conducted, it’s impossible to meet every constituent need. A case in point is what happened at UW-Madison when it attempted to whittle down the number of e-mail/calendaring systems on campus from 30 to just one. Among many faculty and students, Google was the hot favorite, according to Rust. But one of the key requirements was that all data be stored on domestic servers because of its often-sensitive nature. Some faculty and staff, said Rust, “didn’t really think about that, because they’re only thinking about what they might need in particular.”
Several other vendors were disqualified on the same grounds. Ultimately, UW-Madison decided to go with Microsoft Office 365. With IT caught between compliance regulations and myriad user preferences, it was the system that met the most campus needs.
“Regardless of how thorough you are in collecting requirements and getting input, there will be people who will not be satisfied with your solution,” said Rust. “You have to understand that, otherwise you’ll be constantly conflicted about whether your services are satisfactory or meeting the need.”
Andrew Barbour is an editorial freelancer with eCampus News.