Christine L. Borgman’s Big Data, Little Data, No Data advances a rigorous corrective to technological triumphalism by arguing that data are neither self-evident objects nor neutral by-products of scholarship, but representations of observations, artefacts and phenomena that acquire evidential force only within interpretative, institutional and infrastructural contexts. The presentation’s early contrast between open access policies and diverse disciplinary datasets establishes the central tension: while governments, funders and universities increasingly demand openness, the rights, responsibilities and risks attached to data remain uneven across scientific, social-scientific and humanistic domains. This complexity is crystallised in the repeated proposition that data are neither publications nor natural objects, and that they depend upon knowledge infrastructures.

The visual sequence reinforces this argument: page 23 juxtaposes mice, notebooks, maps, climate models and qualitative field notes to demonstrate data’s heterogeneity, while pages 26–28 contrast sensor-network measurements with survey and Twitter-based social science materials. As a case synthesis, Borgman’s sensor-network example shows that reuse requires far more than access: nitrate-distribution readings become scholarly evidence only through metadata, provenance, calibration, classification, repositories and labour.

The later discussion of economics, sustainability and libraries extends the claim by showing that research data oscillate among public goods, private goods and common-pool resources. The overarching implication is that open data must be governed as a socio-technical commons: curated, credited, preserved and interpreted through institutions capable of sustaining scholarly memory beyond immediate publication cycles.