Alternate proposal for digital signatures

Bob Denny rdenny at dc3.com
Tue Mar 18 20:49:50 PDT 2008


Rob:
> The point of cost requirements isn't just whether one  
> community can afford more protection, but rather, what fraction of the  
> total budget available should be assigned to one bit of functionality  
> versus another.

Perfectly said. The rest is in concert...

> We need to explore the architecture of VOEvent authentication before  
> focussing on the technology choice(s).

True also; I am guilty of rushing to a solution, but that's my nature - "Get
something out there and start learning." There's nothing like having a real
system to start from. The proposed scheme is so simple and cheap it could be
thrown away before it gets too pervasive, with little hand-wringing.

> The roles of author and publisher are separate by design.  A VOEvent  
> packet is truly a publication in both the computer science and  
> academic sense.  The author is responsible for content - the publisher  
> for the integrity of the packet.

I see it as a publication by the author, using electronic printing presses that
are designed and operated by printers (the computer techs): just as the
manuscript is turned into printed pages (a form change only), the author's data
is turned into a VOEvent packet (a form change only). I see the two as analogous.

> The/a frequent use case would have a  
> publisher supplying an authoring application to some class of authors  
> - for instance, AAVSO as publisher encouraging amateur astronomers to  
> download a desktop or browser app that is preconfigured to transport  
> content from filling in a form to a known AAVSO-controlled VOEvent  
> broker.

Understood. But I still have a hard time with the publisher altering the
content, and I don't see that in your use case. The tool could (should!) produce
a signed packet, so the author would know that those computer guys can't
possibly alter his/her data.
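In practice that tamper-evidence would come from a PGP signature over the
packet. As a minimal stdlib sketch of the underlying idea - a secret only the
author holds - here is an HMAC-based seal. The names, key handling, and data
are all illustrative inventions of mine, not anything from the Note; and
unlike a real PGP signature, a third party would need the author's key to
verify, so this only demonstrates the tamper-detection half of the story:

```python
import hmac
import hashlib

# Stands in for the author's private key; in real use this never leaves
# the author's machine (PGP's asymmetric keys remove even this limitation).
AUTHOR_KEY = b"author-only secret"

def author_seal(content: bytes) -> bytes:
    """Produce a tag that only the key holder could have computed."""
    return hmac.new(AUTHOR_KEY, content, hashlib.sha256).digest()

def verify(content: bytes, tag: bytes) -> bool:
    """Check that the content matches the author's tag."""
    expected = hmac.new(AUTHOR_KEY, content, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

data = b"RA=12.34 Dec=-56.78 mag=11.2"   # hypothetical author content
tag = author_seal(data)
assert verify(data, tag)                  # untouched data verifies
assert not verify(data + b"!", tag)       # any alteration is detected
```

The point of the sketch is simply that once a sealed packet leaves the author's
tool, any change by the publisher's machinery is detectable.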

> A transport signature may be attached between the client and the  
> broker to protect the initial content - if the client runs in a web  
> browser, browser level security would be an obvious choice.  This  
> initial content will not typically reflect a VOEvent compliant packet,  
> however, because it is the publisher who assigns an ID and guarantees  
> conformance to the schema.

I look at this as one instance of "the tool" - the fact that the tool is split
into browser, server, etc., is just internal design. The ultimate result would
be that the author's data ends up in a VOEvent packet, signed by the author. The
publisher's tools should make this possible to a degree of assurance acceptable
to the author.

> The first "official" signature is therefore only possible at the point  
> of publication.  It seems to me that this first VOEvent-compliant  
> signature would supersede any purely transport authentication used  
> between author(s) and publisher. [...]

I agree. And I see no requirement that brokers sign/unsign - they are free to
use any scheme they want to assure that they "do it perfectly or raise an
error", which is what they must do. It could be as simple as a trailing SHA
hash. There is no strict requirement for authentication, just integrity.
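A trailing-hash integrity check of that flavor could be sketched in a few
lines. The framing and function names below are my own invention, purely to
show the "do it perfectly or raise an error" behavior; 32 is the byte length
of a SHA-256 digest:

```python
import hashlib

DIGEST_LEN = 32  # SHA-256 digest size in bytes

def seal(packet: bytes) -> bytes:
    """Append a SHA-256 digest so a relay can detect corruption."""
    return packet + hashlib.sha256(packet).digest()

def unseal(sealed: bytes) -> bytes:
    """Strip and check the trailing digest; raise on any mismatch."""
    packet, digest = sealed[:-DIGEST_LEN], sealed[-DIGEST_LEN:]
    if hashlib.sha256(packet).digest() != digest:
        raise ValueError("integrity check failed - do not relay packet")
    return packet

original = b"<VOEvent>...author's data...</VOEvent>"
assert unseal(seal(original)) == original   # a clean relay passes through
```

Note this protects only against accidental corruption in transit: anyone can
recompute the hash, so it provides integrity, not authentication - exactly the
distinction above.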

> Not partial messages (perhaps not even VOEvent-like in format), rather  
> the members create content and the AAVSO publishes the first fully  
> VOEvent-compliant packet with unique ID and persistent signature.

OK, it means that the AAVSO is bound not to publish altered data. It must assure
the authors that its process is one of publication only - no "copyediting
without author's approval".

> We are building a system to manage trust models. [...]  VOEvent  
> isn't going to swoop in at the end and change how these organizations  
> do business.

Agree 100%. And the most adaptable key-trust architecture is PGP's, so it is
the most likely to fit whatever trust model(s) end up being required. I look at
it as TCP/IP vs. the old ISO stack of the early '90s (another bunch of
design-and-decree standards, btw), which was "going to replace it". That didn't
happen because ISO was not flexible and adaptable enough. Here I am again
talking about the bits and bytes, but I had to make that point :-)

> I think we need to characterize the problem before we settle on a  
> solution.  Prototypes can inform our understanding of the problem, but  
> shouldn't constrain the ultimate solution space.

That wasn't my intent in publishing the Note. Again, I'm not a
"design-and-decree" guy either... I wouldn't dare try to constrain the ultimate
solution before fielding some prototypes and using them to mold the technology
as well as the trust models. I see the trust model as an instance of the "simple
accounting package" problem, where the customer doesn't know what they want
until they get something, then the spiralling cycle of desire and fulfillment
begins...

I think we need to get something out there, start using it, learn from it, and
get focus. The very last thing that should come out of that process is the
ultimate solution, the standard. So I think we're on the same beam there...

I may reply to more of your message soon. Thanks a TON for taking the time to
write up those thoughts... your message may well be a watershed event.

  -- Bob
