Time Series Cube DM - IVOA Note
Pierre Fernique
Pierre.Fernique at astro.unistra.fr
Mon Apr 3 08:12:43 CEST 2017
Hi Petr,
Thanks for reminding us of all these basic points.
In the framework of the next release, Aladin 10, we recently
implemented (or tried to implement) a lot of VO protocols (TAP,
VOSpace, SIAv2, DataLink, SODA, MOC, HiPS). The result is both
exciting and disappointing. I am enthusiastic because we can now
access a lot of services and data, and a special "thanks" goes to
RegTAP (+ MOC), which has populated the VO registry in a really usable
way. But I am also disappointed by the complexity required on the
client side to implement most of these protocols, a complexity that is
probably not always fully justified.
It is undeniably true that writing a VO client has become more and
more complex. The "S" in "Simple Cone Search", "Simple Image Access",
"Simple Application Messaging Protocol"... has been forgotten in
recent years. Consequently, the use of dedicated libraries is often a
prerequisite. In my view, that is not such a good sign for a standard.
It just means that it is (too?) difficult to start from scratch, from
the specification document alone, without six months of FTE for the work.
Additionally, good dedicated libraries are not so easy to find.
Consider just VOTable: this standard has become really complex -
probably too complex for most clients (multiple RESOURCEs, GROUP/ref,
BINARY2, FITS streams, base64-UTF16, remote data with expiration
dates...), and it is difficult (impossible?) to find a parser/library
supporting all the features described in the latest specification,
even after eight years (VOTable 1.2 was released in 2009). For
instance, the recent Astropy VOTable parser still does not handle the
GROUP tag (it crashes). Maybe we want to put too many things into this
format (and the future VO-DML additions have not arrived yet). If the
complexity of VOTable keeps increasing, the real risk is to see a
simpler JSON alternative appear that is definitely easier to implement
on the client side - but that would also mean losing all the benefits
of the VO metadata agreements that VOTable has successfully provided
within the IVOA for 15 years.
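To make the client-side picture concrete, here is a minimal sketch of
basic VOTable parsing with astropy.io.votable (the file name is a
placeholder, and this only exercises the simplest corner of the
specification):

    from astropy.io.votable import parse_single_table

    # Parse a local VOTable; "example.vot" is just a placeholder name.
    table = parse_single_table("example.vot")

    # FIELD metadata (name, UCD, unit) is what the VO agreements buy us.
    for field in table.fields:
        print(field.name, field.ucd, field.unit)

    # The data arrive as a numpy (masked) structured array.
    data = table.array
    print(data[:5])
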
I am not saying that the IVOA should not manage "C"omplex metadata; I
just fully agree that "we may not (try to) cover everything". It would
already not be so bad to choose "S"imple solutions for the 80% of
"S"imple issues.
Best regards,
Pierre
On 22/03/2017 at 19:07, Petr Skoda wrote:
>
> Dear All,
>
> I have just returned from a holiday (without internet) and
> have seen what an interesting discussion is going on.
>
> As I was at the beginning of this effort, which was gradually
> developed during Jiri's master thesis and implemented in SPLAT-VO
> and DACHS, I want to make some short comments on the whole philosophy.
>
> The VO urgently needed a standard for time series: a wealth of light
> curves as well as other time series (e.g. gravitational waves) needs
> to be accessed.
> We have shown that the proposed concept works well and allows us to
> fulfill most of the use cases we stated (and in fact all that the VO
> was able to imagine, as summarised in Enrique's summary).
>
> Laurent's new use case is covered as well.
> So far we have just been searching light curves from the Danish DK154
> telescope, as I have demonstrated multiple times using a simple "cone
> search" in a circle. Recent ObsCore queries allow a general ADQL shape
> (but, as tested, very slowly). We have cross-matched e.g. the Landolt
> catalogues from VizieR directly with the service implementing the
> presented standard in DACHS, using TOPCAT, and plotted the light
> curves in the (recently modified) SPLAT-VO as well as in TOPCAT itself.
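>
> As an aside, the kind of discovery query I mean looks roughly like the
> following sketch (the TAP endpoint, the coordinates and the use of
> pyvo are only illustrative placeholders):
>
>     import pyvo
>
>     # Hypothetical TAP service exposing an ObsCore table with time series.
>     service = pyvo.dal.TAPService("http://example.org/tap")
>
>     query = """
>         SELECT TOP 100 obs_id, access_url
>         FROM ivoa.obscore
>         WHERE dataproduct_type = 'timeseries'
>           AND 1 = CONTAINS(POINT('ICRS', s_ra, s_dec),
>                            CIRCLE('ICRS', 250.42, 36.46, 0.05))
>     """
>     for row in service.search(query):
>         print(row["obs_id"], row["access_url"])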
>
> There are still some client adaptations necessary to make it smooth.
> E.g. in TOPCAT it would be nice to have a feature that we have already
> discussed several times, which basically does this:
>
> If there is an accref in a displayed table, you display that link and,
> on click, it downloads the link and opens it as another table.
>
> So far this must be done by copying the accref link to the clipboard
> and using it to open a new table.
>
> This produces the magic (which is the core of our proposal) that you
> see the light curve in TOPCAT (you can plot it, filter it, modify it,
> etc.), as the accref is in fact just a pointer to the TABLE in which
> the time series is encoded following Jiri's suggestion.
>
> You may send the table from SPLAT-VO as well as from Aladin.
> So the whole light curve is just a pointer to a file (or, in our case,
> to an on-the-fly generated query against a database of photometric
> measurements), i.e. a light curve (time series) represented as a
> TABLE.
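>
> In code, that mechanism is little more than the following sketch (the
> file names and the column names "accref", "obs_time" and "mag" are
> placeholders for whatever the actual service declares):
>
>     from urllib.request import urlretrieve
>     from astropy.io.votable import parse_single_table
>
>     # The discovery response is itself a table carrying an accref column.
>     discovery = parse_single_table("discovery_response.vot").array
>
>     # Dereference the first accref: it points to another VOTable that
>     # *is* the light curve, i.e. the time series as a plain TABLE.
>     urlretrieve(discovery["accref"][0], "lightcurve.vot")
>
>     lc = parse_single_table("lightcurve.vot").array
>     print(lc["obs_time"][:5], lc["mag"][:5])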
>
> There is no need for a special core data model with a strange
> interpretation to fulfill almost all of this functionality.
>
> I am afraid that the discussion I have briefly looked into (sorry for
> not reading it in detail) is going the way that has killed many good
> attempts in the IVOA - the fundamentalist approach, strictly based on
> complex data models, which cannot incorporate every foreseeable
> application usage.
> Strong words like "rewrite VO-DML" or sympathy for a GOD-OBJECT scare
> me.
>
> Even if we admit that the work of the IVOA so far was not based on a
> clear design concept (and the number of individual and inconsistent
> protocols shows this - e.g. look how a simple cone search on the sky
> is expressed in SSAP and in SCS: the same semantics, yet different
> parameters for the radius and the coordinates...), we must still
> acknowledge that over 20 years this inconsistency has been fruitful -
> look at the number of publications using just the cross-matching of
> catalogues.
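>
> To make that inconsistency concrete, the same small cone is spelled
> differently in the two protocols (the base URLs are placeholders; note
> also that SSAP's SIZE is a diameter where SCS's SR is a radius):
>
>     # Simple Cone Search: RA, DEC and a radius SR, in decimal degrees.
>     scs_url = ("http://example.org/scs?"
>                "RA=250.42&DEC=36.46&SR=0.05")
>
>     # SSAP: the same region as a POS centre and a SIZE diameter.
>     ssa_url = ("http://example.org/ssa?"
>                "REQUEST=queryData&POS=250.42,36.46&SIZE=0.1")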
>
> The discussion here has shown a way to design future VO standards, but
> it is terribly complicated compared with the previous way of thinking.
> I am afraid that some of you see in our small time series solution a
> way to rewrite the whole IVOA from scratch.
> Sure, data cubes are a new challenge, as complicated as the
> instruments producing them.
>
> But we should deliver something useful to the community despite the
> inconsistencies between the different SCHEMAS...
>
> From the point of view of client developers:
>
> Please note that the strength of the VO stands and falls with its
> client applications - and in fact we have only a few strong clients
> that everybody knows. So IMHO we must adapt the server side to the
> client writers (in fact, who is going to write a new VO client?).
>
> In astronomy, attempts at full standardisation always fail despite the
> IAU's efforts: people name objects arbitrarily, use strange units,
> special variables, etc.
>
> Following the 80/20 rule - we may not cover everything...
>
> So please keep this in mind - time series are a crucial part of
> astronomical research, and having very simple applications to work
> with them, despite incomplete models, would greatly improve the IVOA's
> reputation in the wider community (frankly, it is not terribly high).
>
> Concerning the coordinate system issues.
>
> For most cross-matching, a rough RA,DEC is just fine for comparison
> with catalogues in VizieR.
>
> However, many light curves show the behaviour of objects that can
> hardly be identified by coordinates... You must use TARGET NAMES to
> identify them clearly. It is a paradox that the most interesting time
> series cannot give coordinates.
>
> Some use cases for this:
>
> 1) Exoplanets - the identification is the top secret of the PIs: they
> will not publish it, but they want to have many light curves to
> compare using just names like TrES-1, Kepler-15, Myterrestrialsurvey 5,
> etc.
>
> 2) Optical counterparts of gravitational waves - very uncertain and in
> fact unknown before the precise modelling is done, after long analysis.
>
>
> 3) Time series of the spectral behaviour of variable stars in general.
> The telescope points to given coordinates, but there may be two
> separate spectra extracted from the same CCD frame. Both objects get
> the same coordinates from the telescope, and that is what the pipeline
> writes into the header. Some manual effort and header hacking is
> needed - which is rarely done.
>
> Most amateur astronomers just give a table of intensity versus time
> and say: this is my target - XYZ - I do not care where it is - I used
> a finding chart and I am sure it is what I want. In their logs you
> will find only one of the many names of the star - seldom
> coordinates...
>
> An interesting job might be to identify which object was really
> observed ;-)
>
>
> So the plotter mentioned earlier does not need to resolve coordinates
> at all!
>
> Moreover, using the current implementation I was able to use SAMP to
> send our light curve to Period04 and create e.g. a periodogram.
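>
> The SAMP part itself is only a few lines, roughly like this sketch
> (assuming a hub is already running and using a placeholder table URL):
>
>     from astropy.samp import SAMPIntegratedClient
>
>     client = SAMPIntegratedClient()
>     client.connect()  # needs a running SAMP hub (e.g. started by TOPCAT)
>
>     # Broadcast the light-curve VOTable; any listening client
>     # (TOPCAT, SPLAT-VO, Period04, ...) may load it.
>     client.notify_all({
>         "samp.mtype": "table.load.votable",
>         "samp.params": {
>             "url": "http://example.org/lightcurve.vot",
>             "name": "my light curve",
>         },
>     })
>
>     client.disconnect()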
>
> So all the basic use cases I can imagine can already be fulfilled.
>
> Take this into account when creating SuperGod objects or rewriting
> the VO-DML!
>
> I hope I will have time in Shanghai to convince you.
>
> Best regards,
>
> Petr Skoda
>
>
> On Wed, 22 Mar 2017, Laurent MICHEL wrote:
>> - Looking for stars with more than N photometric points 5sigma higher
>> than the mean value.
>
>> An optical counterpart is not known. How can one get light curves for
>> all objects in the error ellipse to look for variability and thus
>> possible counterparts to the blazar?
>>
>> - Plot the light curves of all SN Ia events together.
>
>
> *************************************************************************
> * Petr Skoda Phone : +420-323-649201, ext. 361 *
> * Stellar Department +420-323-620361 *
> * Astronomical Institute CAS Fax : +420-323-620250 *
> * 251 65 Ondrejov e-mail: skoda at sunstel.asu.cas.cz *
> * Czech Republic skoda at asu.cas.cz *
> *************************************************************************