Science Operations 2015 - Programme

Below you will find the Science Operations 2015 Programme, last updated on 23 November 2015.

REGISTRATION: Tuesday 24 November, 13:00-14:00

Program Overview

 
TUESDAY, 24 November 2015
Chair: Bruno Leibundgut, Venue: ESO Auditorium (Eridanus)
Time (Garching) Speaker Topic
    The Context 14:00 - 17:25
14:00 - 14:15 Andreas Kaufer
ESO, Chile - Track: Context

DOI 10.5281/zenodo.34578
14:15 - 14:30 Martin Kessler
ESAC, Spain - Track: Context

DOI 10.5281/zenodo.34579
14:30 - 15:05 Christophe Arviset
ESAC, Spain - Track: Context

The ESAC Science Data Centre (ESDC) provides services and tools to access and retrieve science data from all ESA space science missions (astronomy, planetary, and solar-heliospheric). The ESDC consists of a team of scientists and engineers working together and in very close collaboration with the Science Ground Segment teams. The large set of science archives located at ESAC represents a major research asset for the community, as well as a unique opportunity to provide multi-mission and multi-wavelength science exploitation services. The ESAC Science Archives' long-term strategy is set along three main axes:

(1) enable maximum scientific exploitation of data sets;

(2) enable efficient long-term preservation of data, software and knowledge, using modern technology and,

(3) enable cost-effective archive production by integration in, and across, projects

 

The author wishes to thank all the people from the ESAC Science Data Centre and the mission archive scientists who have participated in the development of the archives and services presented in this paper.

DOI 10.5281/zenodo.34493

15:05 - 15:40 Martino Romaniello
ESO, Garching - Track: Context

Providing the best science data is at the core of ESO’s mission to enable major science discoveries from our science community. I will briefly describe the steps that ESO undertakes to fulfill this, namely ensuring that instruments are working properly, that the science content can be extracted from the data and, finally, delivering the science data to our users, PIs and archive researchers alike.

Metrics and statistics that gauge the results and impact of these efforts will be discussed.

DOI 10.5281/zenodo.34607

  Coffee break
 
16:10 - 16:45 Alberto Accomazzi
CfA, Harvard - Track: Datalinks

The NASA Astrophysics Data System (ADS) has long been used as a discovery platform for the scientific literature in Astronomy and Physics. With the addition of records describing datasets linked to publications, observing proposals and software used in refereed astronomy papers, the ADS is now increasingly used to find, access and cite a wider range of scientific resources. In this talk, I will discuss the recent efforts involving the indexing of software metadata, and our ongoing discussions with publishers in support of software and data citation. I will demonstrate the use of ADS's new services in support of discovery and evaluation of individual researchers as well as archival data products.
16:45 - 17:05 Pascal Ballester
ESO, Garching - Track: Strategy

Scientific software development at ESO involves defined processes for the main phases of project inception, monitoring of development performed by instrument consortia, application maintenance, and application support. We discuss the lessons learnt and evolution of the process for the next generation of tools and observing facilities.
17:05 - 17:25 Bruno Merin
ESAC, Spain - Track: Context

The ESAC Science Data Centre, ESDC, is working on a science-driven discovery portal for all its astronomy missions with the provisional name Multi-Mission Interface. The first public release of this service will be demonstrated, featuring an interface for sky exploration and for single and multiple target searches. It requires no prior knowledge of any of the missions involved. From a technical point of view, the system offers all-sky projections of full mission datasets using a new-generation HEALPix projection called HiPS; detailed geometrical footprints to access individual observations at the mission archives using VO-TAP queries; and direct access to the underlying mission-specific science archives.

A first public release is scheduled before the end of 2015 and will give users worldwide simplified access to high-level science-ready data products from all ESA Astronomy missions plus a number of ESA-produced source catalogues. A demo will accompany the presentation.

DOI 10.5281/zenodo.34581
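As an editor's illustration of the HiPS scheme mentioned in the abstract (not ESDC code): the IVOA HiPS convention stores each HEALPix tile of order `k` and pixel index `npix` under a fixed directory layout, with tiles grouped in directories of 10000. A minimal sketch:

```python
def hips_tile_path(order: int, npix: int, ext: str = "fits") -> str:
    """Relative path of a HiPS tile, following the IVOA HiPS convention.

    Tiles are grouped in directories of 10000 (Dir index = npix rounded
    down to a multiple of 10000) to keep directories from growing huge.
    """
    if order < 0 or npix < 0 or npix >= 12 * 4 ** order:
        raise ValueError("npix out of range for this HEALPix order")
    dir_index = (npix // 10000) * 10000
    return f"Norder{order}/Dir{dir_index}/Npix{npix}.{ext}"

# Example: tile 12345 at order 6 (order 6 has 12 * 4**6 = 49152 pixels)
print(hips_tile_path(6, 12345))  # Norder6/Dir10000/Npix12345.fits
```

A client can therefore fetch any sky region at any resolution with plain HTTP requests for predictable tile paths, which is what makes HiPS suitable for all-sky projections of full mission datasets.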

18:00 Reception
 
WEDNESDAY, 25 November 2015
Chairs: Danny Lennon, Martino Romaniello, Venue: ESO Auditorium (Eridanus)
Time (Garching) Speaker Topic
    Strategy 9:00 - 10:30
9:00 - 9:35 Frederic Hemmer
CERN, Geneva - Track: Strategy

TBD
9:35 - 10:10 Denis Mourard
10:10 - 10:30 Marina Rejkuba
ESO, Garching - Track: Context

The User Support Department at ESO provides support to users of the ESO facilities (in particular Paranal Service Mode users) in proposal and observation preparation, and participates in instrument operations team activities. For observation preparation in particular, the observing strategy for ground-based observations is optimized to best fulfil the scientific goals and to enable efficient use of the instruments on Paranal.

Service Mode support has developed over the past 15 years, and in this talk I will highlight some of the tools, projects and examples where the support provided to users and instrument operations teams was beneficial. The high trust of the ESO community in the service and data quality that the observatory delivers is attested by the continued increase in requests for Service Mode observing.

DOI 10.5281/zenodo.34604

  Coffee break
 
    Data Processing 11:00 - 17:55
11:00 - 11:20 Steffen Mieske/Burkhard Wolff
ESO, Chile and ESO, Garching - Track: Processing

We will present a summary of the procedures in place at the VLT to ensure in real-time the quality of scientific and calibration data taken during the night, to monitor completeness and validity of calibrations in general, and to close the feedback loop with all stakeholders involved.
11:20 - 11:40 Vicente Navarro
ESAC, Spain - Track: Processing

One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to further process data to higher levels.

During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a Mission goes into the “legacy” phase, there are very limited funds and long-term preservation becomes more and more difficult.

Building on Virtual Machine (VM), Cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions:

  • PIA for ISO (1995)
  • SAS for XMM-Newton (1999)
  • HIPE for Herschel (2009)
  • EXIA for EXOSAT (1983)

The following goals have guided the architecture:

  • Support for all operations, post-operations and archive/legacy phases.
  • Support for local (user’s computer) and cloud environments (ESAC Cloud, Amazon AWS).
  • Support for expert users, requiring full capabilities.
  • Provision of a simple web-based interface.

This talk describes the architecture, challenges, results and lessons learnt gathered in this project.

DOI 10.5281/zenodo.34582

11:40 - 12:00 Roland Walter
University of Geneva - Track: Mission/Project

High-energy astrophysics space missions have pioneered and demonstrated the power of legacy data sets for generating new discoveries, especially when analysed in ways original researchers could not have anticipated.

The only way to ensure that the data of present observatories can be used effectively in the future is to allow users to perform on-the-fly data analysis, producing scientific results straightforwardly for any sky position and any time and energy interval, without requiring mission-specific software or detailed instrumental knowledge.

Providing a straightforward interface to complex data and data analysis makes the data, and the process of generating science results, available to the public and to higher education, and promotes the visibility of the investment in science to society. This is a fundamental step in transmitting the values of science and in evolving towards a knowledge society.

DOI 10.5281/zenodo.34657

12:00 - 12:20 Steven Crawford
South African Astronomical Observatory - Track: Processing

PySALT is the python/PyRAF-based data reduction and analysis pipeline for the Southern African Large Telescope (SALT), a modern 10m-class telescope with a large user community consisting of 13 partner institutions. The suite of tools developed as part of PySALT includes: (1) science-quality reductions for the major operational modes of SALT, including high-time-resolution and spectroscopic reductions, (2) quick-look capabilities for the observers and real-time data delivery for the investigators, and (3) management of the data archive and regular processing of SALT observations. In addition to presenting the overall framework, we also highlight some of the lessons learned since the start of SALT scientific observations in 2011.
12:20 - 12:40 Reinhard Hanuschik
ESO, Garching - Track: Processing

In the past two years ESO has installed a production line for level 2 science data products. Focussing on spectroscopic observations, these in-house generated data products are complementary to the externally provided data products from the imaging surveys. The production line combines mass production (more than one million spectra have been generated so far), previews, and quality control.
  Lunch break
 
14:00 - 14:35 Antonella Vallenari
INAF, Padova, Italy - Track: Processing

Gaia data management will be reviewed, including data analysis and quality assurance.
14:35 - 14:55 Fred Jansen
ESA/ESTEC, Noordwijk - Track: Processing/SciOps

Uwe Lammers, Rocio Guerra, Neil Cheek, Hassan Siddiqui, Fred Jansen

The European Space Agency's astrometry satellite Gaia was launched in December 2013 and started its scientific operations in July 2014 after an extended payload commissioning period. During the first year of the nominal mission the astrometric instrument alone has made around 250 billion individual measurements, which already constitutes one of the largest astronomical datasets in existence. Operations will continue for at least the next 4 years and, after an extensive data processing effort, an astronomical catalogue containing some 1.5 billion celestial objects will be produced.

We describe the chosen key concepts for handling the massive amounts of daily data at the Science Operations Centre at ESAC, Madrid, their initial processing and dissemination to the other five partner processing centres. We will also illustrate some of the great challenges that the mission data poses in terms of storage, processing, monitoring, and analysis.

DOI 10.5281/zenodo.34646

14:55 - 15:15 Jose Luis Hernandez-Munoz
ESAC, Spain - Track: Processing

The Astrometric Global Iterative Solution (AGIS) scheme is the key process in the astrometric reduction of the Gaia data. Its main purpose is to generate the astrometric part of the Gaia catalogue in a way that optimally combines all 10^12 available measurements in a global, self-consistent manner.

We will outline the technical design and chosen approaches for the distributed processing infrastructure of AGIS. An important aspect in this is the efficient reading and passing of observation data to the mathematical core algorithms.

DOI 10.5281/zenodo.34577

15:15 - 15:35 Marco Riello
University of Cambridge - Track: Processing

The DPAC Cambridge Data Processing Centre (DPCI) is responsible for the photometric calibration of the Gaia data, including the low-resolution spectra. The large data volume produced by Gaia (~26 billion transits/year), the complexity of its data stream and the self-calibrating approach pose unique challenges for the scalability, reliability and robustness of both the software pipelines and the operations infrastructure. DPCI was the first in DPAC to realise the potential of Hadoop and Map/Reduce and to adopt them as the core technologies for its infrastructure. This has proven a winning choice, giving DPCI unmatched processing throughput and reliability within DPAC, to the point that other DPCs have started to follow in our footsteps.

In this talk we will present the software infrastructure developed to build the distributed and scalable batch data processing system that is currently used in production at DPCI and the excellent results in terms of performance of the system.

DOI 10.5281/zenodo.34606
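To make the Map/Reduce pattern adopted by DPCI concrete, here is a toy sketch by the editor (not DPAC code, and the record layout is invented for illustration): map emits (source, flux) pairs per transit, a shuffle groups them by source, and reduce aggregates each group independently — the same shape Hadoop applies at scale.

```python
from collections import defaultdict

# Toy transit records: (source_id, flux). In Gaia terms a "transit"
# is one crossing of a source over the focal plane.
transits = [("s1", 10.0), ("s2", 4.0), ("s1", 12.0), ("s2", 6.0), ("s1", 11.0)]

# Map: emit key/value pairs (here the records already are pairs).
mapped = [(src, flux) for src, flux in transits]

# Shuffle: group values by key, as the framework does between phases.
groups = defaultdict(list)
for src, flux in mapped:
    groups[src].append(flux)

# Reduce: aggregate each group independently (mean flux per source).
mean_flux = {src: sum(fluxes) / len(fluxes) for src, fluxes in groups.items()}
print(mean_flux)  # {'s1': 11.0, 's2': 5.0}
```

Because each reduce group is independent, the work parallelises trivially across a cluster, which is what makes the paradigm attractive for ~26 billion transits per year.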

  Coffee break
 
16:00 - 16:35 Felix Stoehr
ESO, Garching - Track: Mission/Project

ALMA has now transitioned from the construction to the operations phase. We review the science data management of ALMA, including the concepts of data reduction and quality assurance as well as the Science Archive. We also place the science data management of ALMA in its larger context.
16:35 - 16:55 Peter Weilbacher
AIP, Potsdam - Track: Processing

I will present the MUSE pipeline in terms of the calibration data required, the reduction steps performed, and the computing requirements, and then highlight some unusual algorithms. I will describe changes that became necessary during commissioning and the first year of operations, and how the pipeline is now integrated into operations on Paranal. I will conclude by showing some example results enabled by the MUSE instrument and the pipeline in the first year.
16:55 - 17:15 Isabelle Percheron
ESO, Garching - Track: Process/SciOps

PIONIER (Precision Integrated-Optics Near-infrared Imaging ExpeRiment) at the VLT Interferometer was originally a visitor instrument from IPAG (Institut de Planétologie et d’Astrophysique de Grenoble). It is now offered to the ESO community as a facility instrument. As a visitor mode instrument it was operated on selected nights by the instrument team/consortium; the goal is now for the Paranal staff to run and monitor the instrument like any other VLT/VLTI instrument. This is done by fully integrating PIONIER into the ESO scheme. I will present here how this was done for the data reduction and the quality assurance of the science data and their related calibrations.
17:15 - 17:35 Michele Armano
ESAC, Spain - Track: Processing

We shall outline the LPF mission in all the details needed to understand its distinctive data processing, a precursor of that of gravitational-wave observatories in space. Based on the analysis of time series, and closer to a seismometer than to a telescope, the science of ESA's foremost geodesy and free-fall mission of our days shall qualify the technology to enable astronomy and cosmology in space. We will detail the mission design and planning efforts, explain how to characterize the hardware from the data analysis of the mission telemetry, and open up the archive products and tools that will be the mission's legacy after its end.
17:35 - 17:55 Santa Martinez/Inaki Ortiz
ESAC, Spain - Track: Processing

The approach selected for BepiColombo for the processing, analysis and archiving of the science data represents a significant change with respect to previous ESA planetary missions, and the Science Ground Segment (SGS), located at ESAC, will play a key role in these activities.

This contribution will summarise the key features of the selected approach, and will describe its implementation, with focus on the following aspects:

  • The use of state-of-the-art virtualisation technology for automatic build, deployment and execution of the pipelines as independent application containers. This will allow specific software environments, and underlying hardware resources, to be isolated, scaled and accessed in a homogeneous fashion.
  • A set of core libraries under development at the SGS (e.g. telemetry decoding, PDS product generation/validation, conversion to engineering units, Java-to-SPICE binding, geometry computations) intended for reuse in certain processing steps across different pipelines.

The implementation follows a quite generic and modular architecture providing a high level of flexibility and adaptability, which will allow its re-usability by future ESA planetary missions.

DOI 10.5281/zenodo.34608

THURSDAY, 26 November 2015
Chairs: Christophe Arviset, Magda Arnaboldi, Venue: ESO Auditorium (Eridanus)
Time (Garching) Speaker Topic
    Archives 9:00 - 12:50
9:00 - 9:20 Deborah Baines
ESAC, Spain - Track: Archive

ESA's European Space Astronomy Centre (ESAC) has recently launched a new version of the European Hubble Space Telescope science archive. The new and enhanced archive offers several new features, some of which are not available anywhere else.

The new web-based archive has been completely re-engineered and is now faster, more accurate and more robust than ever. Several of its unique features will be presented: the possibility of seeing the exact footprint of each observation on top of an optical all-sky image, the online visualization and inspection of FITS headers, imaging and spectral observation previews without downloading files, and the possibility to search for data that have not yet been published in refereed journals.

This state-of-the-art science data archive will be the new main access point to HST data for the European astronomical community and will be enhanced in the near future to include the Hubble Source Catalogue and other high-level data products as required.

DOI 10.5281/zenodo.34566

9:20 - 9:40 Jose Manuel Alacid
Centro de Astrobiologia, Spain - Track: Archive

The Gran Telescopio Canarias (GTC) archive has been operational since November 2011. The archive, maintained by the Data Archive Unit at CAB in the framework of the Spanish Virtual Observatory project, provides access to both raw and science-ready data and has been designed in compliance with the standards defined by the International Virtual Observatory Alliance (IVOA) to guarantee a high level of data accessibility and handling.

In this presentation I will describe the main capabilities the GTC archive offers to the community, in terms of functionalities and data collections, to carry out an efficient scientific exploitation of GTC data.

DOI 10.5281/zenodo.34495

9:40 - 10:00 Eva Verdugo
ESAC, Spain - Track: Archive

The Herschel mission required a Science Archive able to serve data to very different users: its own Data Analysis Software (both pipeline and interactive analysis), the consortia of the different instruments, and the scientific community. At the same time, the KP consortia were committed to deliver to the Herschel Science Centre the processed products corresponding to the data obtained as part of their Science Demonstration Phase, and the Herschel Archive had to include the capability to store and deliver them. I will explain how the current Herschel Science Archive is designed to cover all these requirements.
10:00 - 10:20 Ivan Zolotukhin
IRAP, Toulouse - Track: Archive

As is the case for many large projects, XMM-Newton data have been used by the community to produce many valuable higher-level data products. However, even after 15 years of successful mission operation, the potential of these data is not yet fully uncovered, mostly due to logistical and data management issues. We present a web application, http://xmm-catalog.irap.omp.eu, to highlight the idea that existing public high-level data collections generate significant added research value when organized and exposed properly. Several application features, such as access to the all-time XMM-Newton photon database and online fitting of extracted source spectra, were never available before. In this talk we share best practices we worked out during the development of this website and discuss their potential use for other large projects generating astrophysical data.
10:20 - 10:40 Xavier Dupac
ESAC, Spain - Track: Archive/Project

The Planck Collaboration released its second major dataset in 2015 through the Planck Legacy Archive (PLA).

It includes cosmological, Extragalactic and Galactic science data in temperature (intensity) and polarization. Full-sky maps are provided with unprecedented angular resolution and sensitivity, together with a large number of ancillary maps, catalogues (generic, SZ clusters and Galactic cold clumps), time-ordered data and other information. The extensive cosmological likelihood package allows cosmologists to fully explore the plausible parameters of the Universe.

A new web-based PLA user interface has been public since December 2014, allowing easier and faster access to all Planck data and replacing the previous Java-based software. Numerous additional improvements to the PLA are also being developed through the so-called PLA Added-Value Interface, making use of an external contract with the Planetek Hellas and Expert Analytics software companies. This will allow users to process time-ordered data into sky maps, separate astrophysical components in existing maps, simulate the microwave and infrared sky through the Planck Sky Model, and use a number of other functionalities.

DOI 10.5281/zenodo.34639

  Coffee break
 
11:15 - 11:50 Marc Sauvage
11:50 - 12:10 Gijs Verdoes Kleijn
University of Groningen - Track: Mission/Project

The E-ELT first-light instrument MICADO will explore new parameter space in terms of precision astrometry, photometry and spectroscopy. This presents challenges for the data handling and reduction, to ensure that MICADO takes the observational capabilities of the AO-assisted E-ELT to their limits. Our plan is to achieve this via iterative improvement of data quality. We present the implications for the science data management system and pipelines. The iterative feedback loop between improving science data quality and improving the instrument, telescope and atmospheric calibration leads to an ever better observational model. For this reason the MICADO instrument data simulator plays a vital role in the design of the pipelines and the science data management system. It should provide realistic data for all science cases to guide the trade-off between calibration via hardware and via software. It can then also guide how to embed the observational model in the data management system. We discuss our current models of the instrument, the E-ELT and the sky background, and the type of science data management system that supports their continuous monitoring and iterative improvement.
12:10 - 12:30 Tanya Lim
ESAC, Spain - Track: Archive

ExoMars 2016 will be the first operational ESA mission to use PDS4, the new version of NASA’s Planetary Data System (PDS) standards. The data produced will be housed in the new Planetary Science Archive (PSA), which is currently under development at ESAC. This talk will introduce the ExoMars 2016 mission and its payload. The adaptation of the PDS4 standard for ExoMars 2016 and other future missions in the PSA will be discussed, along with a progress report on the new PSA development.
12:30 - 12:50 Juan Gonzalez-Nunez
ESAC, Spain - Track: Archive/VO

The ESDC (ESAC Science Data Centre) is one of the active members of the IVOA (International Virtual Observatory Alliance), which has defined a set of standards, libraries and concepts that allow flexible, scalable and interoperable architectures to be built for data archive development.

For astronomy involving large catalogues, as in Gaia or Euclid, the TAP, UWS and VOSpace standards can be used to create an architecture that allows the exploitation of this valuable data by the community. New challenges also arise, such as the implementation of the new paradigm "move the code close to the data", which can be partially achieved by extending the protocols (TAP+, UWS+, etc.) or the languages (ADQL).

We explain how we have used VO standards and libraries for the Gaia Archive, which has not only produced an open and interoperable archive but also minimized the development effort in certain areas. We will also explain how we have extended these protocols, and our future plans.

DOI 10.5281/zenodo.34569
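As a hedged sketch of how a TAP service like the one described above is queried (the endpoint URL and table name below are illustrative assumptions, not taken from this abstract): a synchronous TAP request is an HTTP POST of an ADQL query with a few mandatory protocol parameters.

```python
from urllib.parse import urlencode

# Illustrative endpoint; check the archive documentation for the real URL.
TAP_SYNC = "https://gea.esac.esa.int/tap-server/tap/sync"

def tap_sync_request(adql: str, fmt: str = "votable") -> str:
    """Build the form-encoded body of a synchronous IVOA TAP query."""
    return urlencode({
        "REQUEST": "doQuery",  # mandatory TAP parameter
        "LANG": "ADQL",        # query language
        "FORMAT": fmt,         # requested output format
        "QUERY": adql,
    })

# An ADQL cone search (table and column names assumed for illustration):
query = ("SELECT TOP 10 source_id, ra, dec FROM gaiadr1.gaia_source "
         "WHERE 1=CONTAINS(POINT('ICRS', ra, dec), "
         "CIRCLE('ICRS', 266.4, -29.0, 0.1))")
body = tap_sync_request(query)
print("POST", TAP_SYNC)
```

Because TAP and ADQL are IVOA standards, the same request shape works against any compliant archive, which is the interoperability point the abstract makes.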

  Lunch break
 
    Data Centres 14:00 - 17:55
14:00 - 14:35 Mike Irwin
University of Cambridge - Track: Datacenter

In this talk I will review the data management facilities at CASU for handling large-scale ground-based imaging and spectroscopic surveys. The overarching principle for all science data processing at CASU is to provide an end-to-end system that attempts to deliver fully calibrated, optimally extracted data products ready for science use. The talk will outline our progress in achieving this, and how end users visualize the state of play of the data processing and interact with the final products via our internal data repository.
14:35 - 14:55 Enrique Solano
CAB/INTA, Villanueva, Spain - Track: Datacenter

The Centro de Astrobiología (CAB) Data Centre is the most important astronomical data centre managed by a Spanish institution. Among others, it contains the Gran Telescopio Canarias (GTC) and the Calar Alto (CAHA) scientific archives. Nevertheless, our activities go well beyond data curation. Generation of high level data products (reduced datasets, catalogues,...), knowledge transfer to other Spanish data centres, development of tools to publish astronomical data in VO-compliant archives and services, development of data mining and analysis tools for an optimum scientific exploitation of our data collections and collaboration with scientific groups with research lines using CAB archive data are some of the topics that will be described in this presentation.
14:55 - 15:15 Wolfram Freudling
ESO, Garching - Track: Processing

Producing science data products that can be used to extract science is the ultimate objective of astronomical observation. The complexity of modern instruments requires highly specialized algorithms for data organization and data reduction. Data visualization and user interaction, both to fine-tune individual algorithms and to modify the data flow itself, are essential for the production of science-grade products that fully exploit the potential of the raw data.

ESO has a long history of providing specialized algorithms, called recipes, for each of its instruments. ESOREFLEX is an environment that delivers complete data reduction workflows, including these recipes, to the users. These workflows encapsulate the best-practice data reduction for the data from a particular instrument, and at the same time can easily be modified by the user. ESOREFLEX includes systems for automatic data organization and visualization, interaction with recipes, and the exploration of the provenance tree of intermediate and final data products. ESOREFLEX allows ESO to deliver recipes that are used in its unsupervised operational pipelines to [...]

DOI 10.5281/zenodo.34640

15:15 - 15:50 Giovanni Lamanna
CNRS/LAPP, Annecy - Track: Mission/Project

Astronomy and Astroparticle Physics are experiencing a deluge of data with the next generation of facilities prioritised by the European Strategy Forum on Research Infrastructures (ESFRI), such as SKA, CTA and KM3NeT, and with other world-class projects, namely LSST, Euclid, EGO, etc. The new ASTERICS H2020 project brings together the scientific communities concerned in Europe to work together to find common solutions to their Big Data challenges, their interoperability, and their data access. The presentation will highlight these new challenges and the work being undertaken, also in cooperation with e-infrastructures in Europe.
  Coffee break
 
16:20 - 16:55 Mark Allen
CDS, Strasbourg - Track: Datacenter

The Centre de Données de Strasbourg (CDS) is a reference data centre for astronomy. The CDS services (SIMBAD, VizieR, Aladin and X-Match) provide added value to scientific content in order to support the astronomy research community. Data and information are curated from refereed journals, major surveys, observatories and missions, with a strong emphasis on maintaining a high level of quality. The current status and plans of the CDS will be presented, highlighting how the recent innovations of the HiPS (Hierarchical Progressive Surveys) and MOC (Multi-Order Coverage map) systems enable the visualisation of hundreds of surveys and data sets, and bring new levels of interoperability between catalogues, survey images and data cubes.
16:55 - 17:15 Nicholas Cross
Institute of Astronomy, Edinburgh - Track: Datacenter

The Wide-Field Astronomy Unit in Edinburgh (WFAU) specializes in building and operating survey science archives. Imaging surveys carried out on UKIRT/WFCAM, VISTA/VIRCAM and the VST/OmegaCAM account for most of our current work, although we also operate the archive for the Gaia-ESO Spectroscopic survey.
17:15 - 17:35 Joerg Retzlaff
ESO, Garching - Track: Archive

Phase 3 denotes the process of preparation, submission, validation and ingestion of science data products for storage in the ESO Science Archive Facility and subsequent publication to the scientific community. We will review more than four years of Phase 3 operations at ESO and we will discuss the future evolution of the Phase 3 system.
17:35 - 17:55 Nausicaa Delmotte
ESO, Garching - Track: Archive

Data validation is an essential step of the Phase 3 process at ESO. It ensures a homogeneous and consistent archive with well traceable data products, to the benefits of archive users. The many aspects of the Phase 3 validation will be described in the presentation.
FRIDAY, 27 November 2015
Chairs: Michael Sterzik, Venue: ESO Auditorium (Eridanus)
Time (Garching) Speaker Topic
    Linking Data 9:00 - 12:15
9:00 - 9:35 David Schade
9:35 - 10:10 Edwin Valentijn
University of Groningen - Track: Datacenter

The Astro-WISE information system is operational for the production of the results of a number of astronomical survey programmes with OmegaCAM@VST and MUSE@VLT. In different forms it has also been applied to the LOFAR radio telescope, life-science projects and business applications. I will discuss the common "data federation" aspects of these projects, and the data federation aspects of the Euclid Archive System.
10:10 - 10:30 Severin Gaudet
CADC, Canada - Track: Datalinks/VO

Over the past six years, the CADC has moved beyond the astronomy archive data centre to a multi-service system for the community. This evolution is based on two major initiatives. The first is the adoption of International Virtual Observatory Alliance (IVOA) standards in both the system and data architecture of the CADC, including a common characterization data model. The second is the Canadian Advanced Network for Astronomical Research (CANFAR), a digital infrastructure combining the Canadian national research network (CANARIE), cloud processing and storage resources (Compute Canada) and a data centre (Canadian Astronomy Data Centre) into a unified ecosystem for storage and processing for the astronomy community. This talk will describe the architecture and integration of IVOA and CANFAR services into CADC operations, the operational experiences, the lessons learned and future directions.
  Coffee break
 
11:00 - 11:35 Francoise Genova
CDS, Strasbourg - Track: Datalinks

European Virtual Observatory (VO) activities have been coordinated by a series of projects funded by the European Commission. Three pillars were identified: support to data providers for the implementation of their data in the VO framework; support to the astronomical community in their usage of VO-enabled data and tools; and technological work for updating the VO framework of interoperability standards and tools. A new phase is beginning with the ASTERICS cluster project. The ASTERICS Work Package "Data Access, Discovery and Interoperability" aims at making the data from the ESFRI projects and their pathfinders available for discovery and usage, interoperable in the VO framework, and accessible with VO-enabled common tools. VO teams and representatives of the ESFRI and pathfinder projects and of EGO/VIRGO are engaged together in the Work Package. ESO is associated with the project, which is also working closely with ESA. All three pillars identified for coordinating European VO activities are tackled.
11:35 - 11:55 Johannes Reetz
11:55 - 12:15 Uta Grothkopf
ESO, Garching - Track: Datalinks

The ESO Telescope Bibliography (telbib) is a database of refereed papers published by the ESO users community. It links data in the ESO Science Archive with the published literature, and vice versa. Developed and maintained by the ESO library, telbib also provides insights into the organization's research output and impact as measured through bibliometric studies.

Curating telbib is a multi-step process that involves extensive tagging of the database records. Based on selected use cases, this talk will explain how the rich metadata provide parameters for reports and statistics in order to investigate the performance of ESO’s facilities and to understand trends and developments in the publishing behaviour of the user community.

DOI 10.5281/zenodo.34616

12:15 Final Discussion and Concluding Remarks
13:00  