Nogueras-Iso, J., Lacasta, J., Ureña-Cámara, M.A. and Ariza-López, F.J. (2017) ‘Quality of Metadata in Open Data Portals’, IEEE Access. doi: 10.1109/ACCESS.2017.DOI.

Nogueras-Iso, Lacasta, Ureña-Cámara and Ariza-López argue that the proliferation of Open Data portals has made metadata quality a decisive condition for discoverability, interoperability and reuse, because datasets cannot be effectively found, understood or accessed when their descriptions are incomplete, inconsistent or semantically imprecise. Their central contribution is to adapt an ISO 19157-based quality evaluation method, originally developed for geographic metadata, to the broader field of Open Data metadata structured through the W3C's DCAT vocabulary. The paper shows that Open Data initiatives often prioritise rapid publication through platforms such as CKAN or DKAN, yet this technical ease can conceal weak metadata practices that undermine transparency and public value. The authors therefore propose a rigorous evaluative framework combining automated and manual controls across dimensions such as completeness, logical consistency, temporal quality, thematic accuracy, positional correctness and free-text quality. A significant case study is the Spanish Government's Open Data catalogue, datos.gob.es, whose metadata are assessed through measures including SPARQL-based checks, acceptance quality limits, manual sampling and representation of results through the Data Quality Vocabulary. The figures on page 10 clarify the workflow for automated quality reporting and the temporal logic used to verify whether publication, modification and validity dates are coherent. Ultimately, the article demonstrates that metadata are not secondary administrative supplements but epistemic infrastructures: they determine whether open data can genuinely function as public knowledge, reproducible evidence and a reusable civic resource.
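Two of the automated controls summarised above, field completeness and temporal coherence among publication and modification dates, can be sketched in miniature. This is an illustrative assumption-laden sketch, not the authors' implementation: field names (`title`, `description`, `issued`, `modified`) follow DCAT terms, while the record data, the required-field set and the pass/fail logic are hypothetical.

```python
from datetime import date

# Hypothetical set of DCAT fields treated as mandatory for completeness.
REQUIRED = ("title", "description", "issued")

def completeness(record):
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED if record.get(f))
    return filled / len(REQUIRED)

def temporally_coherent(record):
    """Check that issued <= modified and that neither date lies in the future."""
    issued, modified = record.get("issued"), record.get("modified")
    today = date.today()
    if issued and issued > today:
        return False
    if modified and (modified > today or (issued and modified < issued)):
        return False
    return True

# Illustrative record with two deliberate defects.
record = {
    "title": "Air quality measurements",
    "description": "",                 # empty: lowers the completeness score
    "issued": date(2016, 5, 1),
    "modified": date(2015, 1, 1),      # precedes the issued date: incoherent
}
print(round(completeness(record), 2))  # 0.67
print(temporally_coherent(record))     # False
```

In the paper's workflow such per-record results would be aggregated catalogue-wide, compared against acceptance quality limits, and published using the Data Quality Vocabulary; those stages are omitted here.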