The State of VOEvent

Bob Denny rdenny at dc3.com
Fri Jun 13 20:55:58 PDT 2008


This is probably going to get me into some hot water, but what the heck... I'm
an old school engineer from the days when design reviews were battlegrounds, and
once completed, everyone went to the pub.

Rob S:
> 1) to reliably authenticate VOEvent packets, and

Easy if the XML isn't touched. See below...

> 2) to permit VOEvent usage that benefits from modern, evolving, XML tools. 
> [...] Intervening links in the workflow may well reformat the XML, making the
>  packet different than the one originally signed.  How then do we deal with 
> this as simply and reliably as possible?

Why? What tools?

Are we setting ourselves up for a world of hurt by straying from the K.I.S.S.
principle? I see a disconnect between theory and practice already:

(1) VOEvent 1.1 includes STC. No one has implemented STC. What is
implemented/used is based on informal gentlemen's agreements and is a tiny
subset of it. I also strongly suspect that parts of VOEvent itself (the outer
schema) are not implemented or used in reality.

(2) Item (1) notwithstanding, there are multiple violations of the VOEvent 1.1
schema in the actual messages (that use the squishy subset). See

    http://www.ivoa.net/forum/voevent/0803/0681.htm

Given the current real-world state of VOEvent traffic vis-à-vis the schema(s), I
can't see us ever getting to a point where things are "rigorous" unless we admit
to ourselves that what we have as a specification is too complex, is unlikely to
be implemented in reasonable time, and contains far too many points of failure
compared to the perceived benefit of that complexity.

Instead, if I were on point, I'd say we need to define a schema that reflects
what's actually being used today (minus the errors in (2)), then work to get
everyone to conform at least to that (again, see (2)). This would mean
disconnecting VOEvent from STC, which is probably politically incorrect but, I
think, engineering correct.

Starting with that, someone (one person, not a committee!) with thick skin
should be tasked with brokering additions to the schema going forward. "It
would be nice if" is not a starting point. When more than one organization
really needs something, and they can show a CLEAR and PRESENT use-case, it gets
added to the list for the next version, whenever that happens. Discussion ensues.

Clearly, this sort of evolution REQUIRES advanced XML tools that can generate
an object-model parser from the schema. THIS is where we should be looking for
new technology. When the schema changes to accommodate additional DEMONSTRATED
needs, people run their tools against the new schema and generate new
object-model code. If we're careful when adding new things, the object model
already in use stays compatible: new things just appear as extensions, and
existing code still works. Later, people can add code to access the new
features if needed. Compare that with hard-wired angle-bracket-picking,
XML DOM, or XPath code.
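
To make that concrete, here's a minimal Python sketch. The dataclasses are
hand-written stand-ins for what a real schema-to-binding tool would generate
from the VOEvent schema; no actual VOEvent library is involved:

    import xml.etree.ElementTree as ET
    from dataclasses import dataclass

    PACKET = '<VOEvent ivorn="ivo://example/demo#1" role="test"><Who><AuthorIVORN>ivo://example/author</AuthorIVORN></Who></VOEvent>'

    # Hard-wired approach: every consumer repeats fragile path strings.
    root = ET.fromstring(PACKET)
    author = root.findtext("Who/AuthorIVORN")  # a moved element breaks this silently

    # What a generated object model gives you instead: typed attribute access.
    @dataclass
    class Who:
        author_ivorn: str

    @dataclass
    class VOEvent:
        ivorn: str
        role: str
        who: Who

    event = VOEvent(ivorn=root.get("ivorn"), role=root.get("role"),
                    who=Who(author_ivorn=author))
    print(event.who.author_ivorn)  # new schema fields would surface as new attributes

Regenerate the classes when the schema grows, and code written against the old
attributes keeps working.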

OK, rant over... Getting back to your note:

> There is some trade-off between the reliability of signing and the complexity
>  and efficiency of generating and evaluating signatures.  If a small fraction
>  of packets are unsignable for esoteric reasons, that may be more acceptable 
> than adopting a highly complex technology to render these few technically 
> signable, but calling into question pragmatic VOEvent authentication for all 
> packets.

I don't understand this. A digital signature, using today's methods, on some
bucket of bytes less than a few megabytes should be so reliable that the
probability of spoofing it is less than the inverse of the number of particles
in the universe. Why would anything we plan to deal with be "unsignable"? From
what I know, anything can be signed. If that "anything" is really huge
(terabytes), the odds of more than one set of bytes validating against a given
signature become merely tiny rather than astronomically small, but are we
really concerned with that?
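
For concreteness, the simple case is a detached signature over the packet's
exact bytes. Here is a minimal Python sketch using the third-party
"cryptography" package; the key handling is simplified (a real system would
load a persistent key rather than generate one per run):

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    packet = b'<?xml version="1.0"?><VOEvent role="test"/>'  # the ORIGINAL bytes

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    signature = key.sign(packet, padding.PKCS1v15(), hashes.SHA256())

    # Verification succeeds only against those exact bytes; change one bit
    # anywhere and verify() raises InvalidSignature.
    key.public_key().verify(signature, packet, padding.PKCS1v15(), hashes.SHA256())

No canonicalization, no XML-aware machinery at all: the packet is just an
opaque bucket of bytes.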

Maybe I'm missing something... and that something may be the implicit
assumption that VOEvent messages will be "restated" along the way from sender
to receiver.

Steve Allen in http://www.ivoa.net/forum/voevent/0806/0689.htm:
> Can all parties agree never to use any XML tools that reformat the elements 
> in ways that are content idempotent but bitwise different? Are there tools 
> that make that sort of guarantee?

What's so hard about keeping the original XML? Sure, someone might want to
parse it into an object model, but why would anyone then turn that object model
back into XML and send it along? Save the original and send THAT along. It's
the original, after all, signature and all (if signed).
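
A quick way to see why "restating" breaks a signature, sketched with the Python
standard library (the packet is illustrative):

    import hashlib
    import xml.etree.ElementTree as ET

    original = b'<?xml version="1.0"?><VOEvent role="test" ivorn="ivo://x/y#1"/>'

    # Round-tripping through a parser preserves the content but not the bytes:
    # here the XML declaration is dropped and serialization details change.
    restated = ET.tostring(ET.fromstring(original))

    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(restated).hexdigest())  # different digest, same content

Any byte-level signature on the original will fail against the restated
version, which is exactly why the original should travel untouched.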

Rob S (again):
> And surely we will vet both signed and unsigned event streams for proper
> provenance through the VOEvent network as it evolves?

Really? Can't we instead insist on proper provenance from originators? Who are
the perceived slipshod originators? This seems like one of those "of course"
things that, on further thought, might be "well..."

> Any intervening relay that corrupts a packet should be corrected, not force
> all packets to be rewritten.

Amen. And a "relay" shouldn't be interested in the content anyway, so what
reason should it have to make ANY change?

I see this as analogous to "always keep all of your original data". If we stick
to that, we can sign the messages the simple way, and ignore canonicalization too.
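
In code terms, the discipline for a relay is one line: whatever it does
internally, it forwards the bytes it received. A hypothetical sketch (the
function and its callback are placeholders, not any real VOEvent transport API):

    import xml.etree.ElementTree as ET

    def relay(packet: bytes, forward) -> None:
        # Inspect if you must (a well-formedness check; the result is discarded)...
        ET.fromstring(packet)
        # ...but forward the EXACT bytes received, so any signature stays valid.
        forward(packet)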

  -- Bob

PS: If the above is offensive or annoying, I apologize in advance. I'm still
feeling like an outsider, particularly after not being at Trieste. I see a great
future for VOEvents. I just want it to be practical, usable, stable, elegant,
and of course successful.





