VOEvent Update: JSON and data models
Robert B. Denny
rdenny at dc3.com
Mon Oct 16 23:17:04 CEST 2017
Hello friends --
As some of you probably recall, I walked away from all of this back at the 2012
LSST All Hands meeting. Recently, though, some interest has sprung up: real-time
amateur-professional cooperation is happening, and we're using the VOEvent
network that we've had running for years. In addition, we're hoping to get the
AAVSO on board. We have met twice with the director and their tech guy about
using VOEvent to alert for urgent transient events, exoplanet data, and other
data collection requests, again coming from professionals. So, it appears I
have a bit of interest in this.
I read Tim and John's responses and in general I go along with them. My thoughts
are:
* What's the strong engineering case for changing to or adding JSON (its
coolness is not an engineering reason, and I like it too)? Why are we doing it?
* How will you prevent trash from entering the system? Early in VOEvent, I had
to implement a "pre-patcher" to fix errors in VOEvent messages from various
sources which were producing the XML by directly writing angle brackets
(syntax), including "custom" elements (structure), and using incorrect
date/time and/or coordinate formats (semantics). All of this can be avoided
(and is avoided) by using schema-driven generation and validation via a
higher-level API. I know you remember my shrill and annoying speech about
this at one of the WG meetings (Santa Cruz?) ha ha. What is the technical
maturity of JSON Schema, and especially of the tools for generating and
validating JSON? Can we enforce the "rule of law" for JSON and thus
successfully require correct data at the entry point of the system? (A rough
sketch of the kind of entry-point validation I mean follows this list.)
* Converting from the schema-compliant XML to JSON looks technically risky.
* What will be the new transport mechanism? Are we going to implement
hub-and-spoke? Should existing brokers (which currently enforce schema
validation to keep trash from circulating) be augmented to transport the
JSON? Should we just use Twitter to send out URIs to the actual messages
that sit on some big cloud repository? PS I was delighted to see the
original transport spec accepted as a standard earlier this year.
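To make the "rule of law" question concrete, here is a rough sketch of my own
(nothing from the Note) of entry-point validation using the Python jsonschema
package. The packet fields and the schema are made up for illustration, not a
proposal for what the JSON VOEvent schema should contain:

    import jsonschema

    # Hypothetical, simplified packet schema -- stand-in field names only.
    PACKET_SCHEMA = {
        "type": "object",
        "required": ["ivorn", "isotime", "ra", "dec"],
        "properties": {
            "ivorn": {"type": "string", "pattern": "^ivo://"},
            # ISO-8601 UTC timestamp, not free-form text
            "isotime": {
                "type": "string",
                "pattern": r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?Z?$",
            },
            # Coordinates as decimal degrees in range, not strings like "12h 34m"
            "ra": {"type": "number", "minimum": 0, "maximum": 360},
            "dec": {"type": "number", "minimum": -90, "maximum": 90},
        },
        "additionalProperties": False,  # reject "custom" top-level fields
    }

    def accept_packet(packet: dict) -> bool:
        """Admit a packet only if it satisfies the schema; a broker could
        refuse to forward anything that fails."""
        try:
            jsonschema.validate(instance=packet, schema=PACKET_SCHEMA)
            return True
        except jsonschema.ValidationError as err:
            print("rejected:", err.message)
            return False

    # A malformed coordinate is caught before it circulates:
    accept_packet({"ivorn": "ivo://example/evt#1",
                   "isotime": "2017-10-16T21:00:00Z",
                   "ra": "12h 34m 56s",   # wrong: sexagesimal string, not degrees
                   "dec": -23.5})

The question for me is whether the JSON Schema tooling is mature and stable
enough that every broker and author can do this kind of refusal at the door.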
_I like the data model group idea_. I need to digest the implications of that.
The projects that our little group of pro/am people is doing, described above,
could benefit from it. We are already _perverting_ VOEvent via Params, turning
it into a request of sorts rather than just an alert (a toy sketch of what I
mean is below). It's all quite ad hoc, with our approach being to see what's
really needed, and which concepts best work in the real world. We're not
pouring any concrete. As you folks know I am a bit hostile to "design and
decree" anyway ha ha.
-- Bob
> Dear Colleagues
> We respectfully submit an IVOA Note about some proposed improvements to the VOEvent standard. We would appreciate some discussion on this email address and at the upcoming meeting in Chile.
> Thank you for your attention
> Roy Williams, Scott Barthelmy, Eric Bellm, Matthew Graham, Rob Seaman
>
> ==================
>
> VOEvent Update: JSON and data models
> Author(s):
> Roy Williams, Scott Barthelmy, Eric Bellm, Matthew Graham, Rob Seaman
>
> URL:
> http://ivoa.net/documents/Notes/VOEventJSON/index.html
>
> Abstract
> We propose an extension of the VOEvent format, to translate the packet from XML to JSON – with no semantic change. We also propose to use the VOEvent data model system to define three data-model Groups: “Light Curve”, “Associated Sources”, and “Followup Imaging”. This straightforward update of VOEvent simplifies the syntax and provides simple, standard representation of common astronomical datasets.