A new tool might help fight malicious editing that introduces incorrect or misleading information on sites such as Wikipedia, UPI reports.

University of Iowa researchers are developing a software tool that can detect potential vandalism and improve the accuracy of Wikipedia entries, a university release says. The tool is an algorithm that compares new edits to a page against the existing words in the rest of the entry, then alerts an editor or page manager if it senses a problem.

Existing tools can spot obscenities or vulgarities, or major changes such as the deletion of entire sections or significant edits throughout a document. But those tools are built manually, with prohibited words and phrases entered by hand, so they are time-consuming to maintain and easy to evade, the UI researchers say.

Their automatic statistical language model algorithm instead looks for words or vocabulary patterns that appear nowhere else in the entry at any point since it was first written. For instance, when someone wrote "Pete loves PANCAKES" into the Wikipedia entry for Abraham Lincoln, the algorithm recognized the graffiti as potential vandalism after scanning the rest of the entry and finding no mention of pancakes…
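The idea described above can be illustrated with a minimal sketch. Note this is an illustrative simplification, not the Iowa researchers' actual model: their approach is a statistical language model over the entry's full edit history, while the function below (all names and the threshold are hypothetical) reduces the idea to a simple vocabulary-overlap check against the current entry text.

```python
import re

def tokenize(text):
    """Lowercase word tokens, ignoring punctuation and digits."""
    return re.findall(r"[a-z']+", text.lower())

def is_suspicious(edit, article_text, threshold=0.5):
    """Flag an edit whose content words are mostly absent from the article.

    Returns True when more than `threshold` of the edit's distinct words
    never appear anywhere in the existing entry text.
    """
    vocab = set(tokenize(article_text))
    edit_words = set(tokenize(edit))
    if not edit_words:
        return False
    unseen = edit_words - vocab
    return len(unseen) / len(edit_words) > threshold

article = "Abraham Lincoln was the 16th president of the United States."
print(is_suspicious("Pete loves PANCAKES", article))    # True: flagged
print(is_suspicious("Lincoln was president", article))  # False: fits vocabulary
```

In this toy version, "Pete loves PANCAKES" is flagged because none of its words occur in the entry, while an edit reusing the entry's existing vocabulary passes, mirroring the pancake example from the story.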


About the Author:

Denny Carter

Dennis has covered higher education technology since April 2008, having interviewed some of the most recognized IT pros in U.S. colleges and universities. He is always updating eCampus News with the latest in pressing ed-tech issues.
