Jimmy Sarakatsannis, engagement manager at McKinsey Global Institute, said that for data to be considered truly “open,” it must have four characteristics.
“It must be accessible to all,” Sarakatsannis said. “It must be machine-readable. It must be low-cost. And it must have unrestricted rights to be reused and used for innovation.”
Very often, the speakers said, data released to the public comes in a file format, such as a PDF, that can only be read by humans. When that's the case, the data can't be catalogued, organized, or analyzed by a computer.
Similarly, some data, while deemed publicly available, is only made accessible after a person or organization files a Freedom of Information Act (FOIA) request.
Recently, one think tank FOIAed an agency for data and released the information on a website, said John Bailey, executive director of Digital Learning Now!, who moderated the event.
The systems the federal government uses to organize that data are so archaic and complicated, Bailey said, that the same agency now uses the think tank's website for research rather than its own collection.