P3T

Pierre Fernique Pierre.Fernique at astro.unistra.fr
Thu May 23 18:37:45 CEST 2024


Hi all,

I must say that I share many of the questions raised by Mark. I 
understand and encourage this P3T OpenAPI initiative, which should help 
reduce inconsistencies in our IVOA APIs. I'm more dubious about the 
supposed development help provided by automatic code generation, and 
I'm very concerned about the risks to backward compatibility. Backward 
compatibility is one of the reasons for IVOA's long-term success: the 
more robust and durable the interoperability, the better the IVOA will 
prosper. The argument of using "modern" programming techniques doesn't 
seem very solid (or I'm now too old to buy it). Twenty years ago, SOAP 
(the industrial and modern solution of the time) was presented (and 
implemented) with the same justification. And Python is as old as Perl; 
FITS is older than me 😉

It seems to me that the proposal to provide an OpenAPI description of 
our standardized APIs should be seen in the same spirit as the BNF 
grammars we now sometimes provide to normatively describe the syntaxes 
we use. So not a "transition" of the IVOA standardization process, but 
a desirable - if not required - addition for an unambiguous, 
machine-readable description. With this in mind, I think it would still 
be important not to lose human readers by simply replacing the 
normative part with a JSON/YAML document. I'd see it more as a 
complement, an appendix, just as we have provided the VOTable XSD 
schema (and the DTD before it) for decades.
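
To make this "appendix, not replacement" idea concrete, here is a 
minimal sketch (in Python, with an illustrative cone-search-like 
endpoint that is not taken from any actual P3T draft) of how tools 
could consume such a machine-readable fragment while the normative 
prose remains the reference for human readers:

    # A minimal sketch (not an actual P3T output) of how a machine-readable
    # OpenAPI fragment could sit alongside the human-readable normative text,
    # in the same spirit as the VOTable XSD: tools can check requests against
    # it without replacing the prose that humans read.
    # The endpoint path and parameter names below are illustrative only.

    OPENAPI_FRAGMENT = {
        "openapi": "3.0.3",
        "info": {"title": "Illustrative cone-search-like service", "version": "0.1"},
        "paths": {
            "/search": {
                "get": {
                    "parameters": [
                        {"name": "RA", "in": "query", "required": True,
                         "schema": {"type": "number", "minimum": 0, "maximum": 360}},
                        {"name": "DEC", "in": "query", "required": True,
                         "schema": {"type": "number", "minimum": -90, "maximum": 90}},
                        {"name": "SR", "in": "query", "required": True,
                         "schema": {"type": "number", "minimum": 0}},
                    ]
                }
            }
        },
    }

    def check_query(path, method, query):
        """Return a list of problems found when comparing a query against the fragment."""
        problems = []
        spec = OPENAPI_FRAGMENT["paths"][path][method]
        for par in spec["parameters"]:
            name, schema = par["name"], par["schema"]
            if name not in query:
                if par.get("required"):
                    problems.append(f"missing required parameter {name}")
                continue
            try:
                value = float(query[name])
            except (ValueError, TypeError):
                problems.append(f"{name} is not a number")
                continue
            if "minimum" in schema and value < schema["minimum"]:
                problems.append(f"{name} below minimum {schema['minimum']}")
            if "maximum" in schema and value > schema["maximum"]:
                problems.append(f"{name} above maximum {schema['maximum']}")
        return problems

    if __name__ == "__main__":
        # A request with DEC out of range and SR missing: two problems reported.
        print(check_query("/search", "get", {"RA": "83.6", "DEC": "122.0"}))

The point is only that the fragment is something a program can consume 
directly, just as a validator consumes the VOTable XSD.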

I'd like to add that the presentation given at the P3T session may have 
given the impression that all IVOA standards and tools would eventually 
be impacted (modification of the standardization and validation process, 
etc.). However, we have defined and use many standards that are not just 
HTTP APIs. One of IVOA's strengths is that it has also defined data 
formats (VOTable, HiPS, MOC), vocabularies (UCD, units), grammars 
(STC-S, ADQL), data models and more. The coding simplification hoped 
for from code generators for APIs cannot replace the need to take these 
formats/vocabularies/grammars/data models into account, and in my 
experience the API part is often the easiest to code.

Concerning the adaptation of Aladin Desktop, fortunately I don't yet 
see a real need. The work required to rework all of Aladin Desktop's 
IVOA client APIs is difficult to quantify, and most certainly very 
heavy, risky, and without immediate benefit, even at the risk of 
introducing regressions. A client such as Aladin "deals" with uncertain 
or partial implementations of a standard: it won't necessarily throw 
errors, but will do the best it can, for the benefit of the user. But 
again, I don't see the need, since the implementation effort has 
already been made. Why do it a second time? It would only make sense 
for new standards, and only if automatic code generation actually 
brings the expected benefit.
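
As a rough illustration of that lenient attitude - a minimal Python 
sketch only, since Aladin itself is written in Java and its real 
parsing is far richer, and the record layout below is hypothetical:

    # A rough sketch, not Aladin's actual code, of the lenient-client attitude
    # described above: when a service response is incomplete or slightly off,
    # keep what is usable and warn, instead of refusing the whole result.
    # The record structure and field names are hypothetical.

    import warnings

    def extract_positions(records):
        """Collect (ra, dec) pairs from heterogeneous records, skipping bad ones."""
        positions = []
        for i, rec in enumerate(records):
            # Tolerate different capitalisations of the column names.
            fields = {k.lower(): v for k, v in rec.items()}
            try:
                ra = float(fields["ra"])
                dec = float(fields["dec"])
            except (KeyError, TypeError, ValueError):
                warnings.warn(f"record {i}: unusable position, skipped")
                continue
            if not (0.0 <= ra < 360.0 and -90.0 <= dec <= 90.0):
                warnings.warn(f"record {i}: position out of range, skipped")
                continue
            positions.append((ra, dec))
        return positions

    if __name__ == "__main__":
        rows = [
            {"RA": "10.68", "DEC": "41.27"},   # usable
            {"ra": 83.82, "dec": None},        # malformed: skipped with a warning
            {"Ra": "201.4"},                   # missing DEC: skipped with a warning
        ]
        print(extract_positions(rows))         # -> [(10.68, 41.27)]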

These are my few comments after a short night (and with still only a 
partial knowledge of what the OpenAPI Specification allows).
Have a nice day/night
Pierre Fernique

On 22/05/2024 at 22:16, Mark Taylor via dal wrote:
> Dear colleagues,
>
> having let the P3T session yesterday sink in I wanted to make a few
> comments.
>
> [TL;DR: I've got reservations, but I'm not going to throw rocks in the way]
>
> This proposal will lead to some fragmentation of the VO ecosystem.
>   Up till now we have, with a few exceptions
> (e.g. PLASTIC->SAMP, SIA1->SIA2), been extremely conservative
> about breaking changes to the standards, e.g. Cone Search responses
> are still required to include UCD1s to the exclusion of UCD1+s that
> replaced UCD1 in around 2006.  The result has been that to a pretty
> large extent all the client and server software that has been written
> to date continues to interoperate.  For long-lived standards such
> as TAP and SCS new clients work with old services and old clients
> work with new services, though enhancements introduced in later
> versions of standards obviously are not available in those cases.
> As far as it goes, I'd say that is a pretty good thing.
>
> The current proposal lobs a cannonball into that approach so that
> there is no chance of client/server code written for existing
> standards interoperating as it stands with code implementing
> the new versions of those standards.  Similarly, client/server
> code written to target the new versions will not interoperate
> with existing implementations.  The scope is currently TAP and UWS
> but it looks like the thin end of a sizeable wedge;
> phase 3 will target other standards, and several comments made
> during the presentations seemed to indicate that we can expect
> another similar round of breaking changes when OpenAPI falls
> out of fashion in a few years time, and so on.
>
> The no-legacy-code-left-behind vs. move-fast-and-break-things
> difference between approaches is reminiscent to me of a comparison
> between development of the Java and Python platforms/languages;
> v1.1 of the Java application TOPCAT from 2004 runs happily under
> Java 21 from 2023, but running older Python code on recent
> Python platforms is typically much more problematic.
> Neither approach is absurd; each has its pros and cons.
>
> Gregory's presentation yesterday acknowledges the issue but I would
> say understates the impact a little bit.  Even assuming that the
> major clients are adapted to work with both old and new APIs,
> the existing "Informal client" sector will be unable to talk
> to new services, and existing services will be inaccessible
> from any new OpenAPI-based clients; Joshua was keen to point
> out that such new clients are now going to be easy to develop.
>
> I can see why the proposed changes appeal to large, well funded service
> providers.  But there are downsides from the point of view of legacy
> services; newly developed clients will not talk to them and the
> hurdle to take advantage of incremental changes in the standards
> becomes higher.  This model for standards development may come
> back to bite those projects which are currently well-resourced
> if they transition to maintenance-only mode in the future.
>
> So, I'm wary of this way forward and I think people should be aware
> of the impacts we can expect.  But, maybe the benefits outweigh
> the costs.  Judging from the institutional involvement in the P3T
> and from most of the audience response at yesterday's session
> it looks like the prevailing opinion is that this is the way
> we should go, and that it is going to happen.
>
> I wonder whether there is a middle way: rewrite the standards
> in terms of OpenAPI but do what we can to make them describe
> existing service behaviour, minimising breaking changes.
> There are two things being proposed here, and they are separable
> to a degree: firstly providing machine-readable (OpenAPI)
> definitions of the services, and secondly transitioning those
> services to behave in a more modern way.
> Page 8 of Joshua's presentation
> (https://wiki.ivoa.net/internal/IVOA/InterOpMay2024P3T/dal-openapi-tech-overview.pdf)
> mentions one thing that can't be done in OpenAPI: case-sensitivity
> of parameters.  The other protocol changes described there are
> more about how people would do things given a blank sheet of paper
> than simply enabling machine-readable interface definitions.
> Use of x-www-form-urlencoded parameters is said to be vulnerable to
> CSRF, but there was some discussion on zoom about whether this is
> really the case; I haven't looked into the details.
> Case sensitivity we could probably fudge if we wanted to
> (make a breaking but low-impact change to standards requiring no
> modification for most code that uses uppercase anyway).
> If the issue is really just about OpenAPI compatibility and
> machine-readable interface definitions then perhaps it could be
> done with significantly less impact than what's being suggested.
> If on the other hand the appetite is really to take the
> opportunity to update lots of things without regard to backward
> compatibility, that's a different matter.  But we should be clear
> about the goals.
>
> Whatever we do decide to do, I want the VO to work as well as possible
> for users, so I'd volunteer to prototype client code at an early
> stage in topcat and stilts.  I will mention another class of client
> that will need to be modified to work in dual mode: validators.
> The taplint validator that forms part of stilts is a complex
> tool widely used by TAP service deployers to check that their
> services are operating according to the specs (I'd be happy for
> others to develop a TAP validator, but nobody has), and if it
> doesn't work for OpenAPI-based services I predict service
> compliance and quality will drop significantly.
> It is true that off-the-shelf OpenAPI validators can be used to
> do part of the job of service validation in the new picture,
> but there's a lot that goes on in taplint that I don't think can be
> covered simply on the basis of understanding the OpenAPI contract.
>
> I think this is going to require a significant amount of development
> effort from client authors like me, but how much will depend on the
> details of the proposal, so I look forward to the development and
> results of the proposed Phase 1.
>
> Mark
>
> --
> Mark Taylor  Astronomical Programmer  Physics, Bristol University, UK
> m.b.taylor at bristol.ac.uk          https://www.star.bristol.ac.uk/mbt/
