Abstract

New data acquisition techniques are generating data at much finer temporal and spatial resolution than traditional seismic experiments. This poses a challenge for both data centers and users: as the volume of data potentially flowing into data centers increases by one to two orders of magnitude, data management challenges arise at every stage of the data flow.

The Incorporated Research Institutions for Seismology, Réseau sismologique et géodésique français, and GEOForschungsNetz data centers carried out a survey and conducted interviews of users working with very large datasets to understand their needs and expectations. One conclusion is that existing data formats and services are not well suited to users of large datasets. Data centers are exploring storage solutions, data formats, and data delivery options to meet the needs of these users. New approaches will need to be discussed within the community to establish standards and best practices for large datasets, perhaps through the participation of stakeholders and users in discussion groups and forums.