Putting the pieces together...

Thomas McGlynn tam at lheapop.gsfc.nasa.gov
Fri May 14 12:23:17 PDT 2004


Hi Doug,

I don't think we're really disagreeing, but I just want to highlight
my concerns by responding to one phrase.

> 
> When you use SSA you will get back a spectrum (or SED) which conforms to the
> SSA data model.  It won't matter what format the data is in so long as your
> application can parse it, since all formats encode the same data object.

What does this mean... "conforms to the data model"?  Does this
mean that we're getting back a file in XML, with a schema somewhere
and items identified in that schema?  That the data cannot be in FITS?

Or maybe it means that the data can be in FITS or XML, but can
be deserialized using some standard applications that recognize
the formats and give users some API to the data?  There's
a set of standard Java interfaces and classes provided by the VO;
Wil writes an equivalent set that C# users can try.

Or maybe it means that there are converter tools that know how to
transform the data into a standard representation for a given data model?
The converter tools are invokable from a few standard Web sites.

These are three very different ways I could see a data model being
instantiated usefully within the VO.  There are probably others;
CORBA and IDL could play a role.  I could understand what conforming
to a data model means if it meant one or more of these!
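For concreteness, the second of those options might look something like the
sketch below.  This is purely hypothetical; none of the names are real VO
classes, and the two "formats" are toys standing in for FITS and XML.

```python
# Hypothetical sketch of a standard deserialization API (option two above):
# one entry point, pluggable format-specific parsers, same object out.

class Spectrum:
    """Stand-in for a spectrum conforming to the SSA data model."""
    def __init__(self, wavelength, flux):
        self.wavelength = wavelength
        self.flux = flux

def _parse_table(text):
    # Toy "FITS-like" format: whitespace-separated wavelength/flux pairs.
    pairs = [line.split() for line in text.strip().splitlines()]
    return Spectrum([float(w) for w, f in pairs],
                    [float(f) for w, f in pairs])

def _parse_markup(text):
    # Toy "XML-like" format: semicolon-separated "wavelength,flux" pairs.
    pairs = [item.split(",") for item in text.strip().split(";")]
    return Spectrum([float(w) for w, f in pairs],
                    [float(f) for w, f in pairs])

_PARSERS = {"table": _parse_table, "markup": _parse_markup}

def read_spectrum(data, fmt):
    """Standard entry point: callers get a Spectrum whatever the format."""
    return _PARSERS[fmt](data)
```

The point of the sketch is that both serializations yield the same
Spectrum object, so application code above read_spectrum() never cares
which format the archive happened to use.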

Or does it mean, as you seem to imply below, that the data model is just
a bit of documentation that is somehow associated with the file, and
every application needs to build its own custom parsers?  I surely hope
that's not what it is.

To me the real purpose of a data model is that I can
build high-level interfaces over it and hide the
low-level implementation details.  The VO provides me
with some way of getting the standard information regardless
of the low-level stuff.  But if everyone has to build their
own low-level interfaces, that's not going to win us any friends!  Every time
a new low-level format comes along, everyone needs to update their code.  That can't
be right.
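One way to avoid that is for a shared library to own a parser registry, so
that supporting a new low-level format means registering one parser rather
than touching every application.  A hypothetical Python sketch (not any
actual VO API):

```python
# Hypothetical sketch: the shared library owns the format registry, so a
# new low-level format is one new parser, not changes to every application.

_PARSERS = {}

def register_format(name, parser):
    """Called once by whoever supports a given low-level format."""
    _PARSERS[name] = parser

def load(data, fmt):
    """What applications call; they never see format internals."""
    return _PARSERS[fmt](data)

# A format supported today...
register_format("commas", lambda text: [float(x) for x in text.split(",")])

# ...and when a new format arrives, one registration call is added;
# application code calling load() is untouched.
register_format("spaces", lambda text: [float(x) for x in text.split()])
```

Application code written against load() keeps working unchanged no matter
how many formats get registered after it ships.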

If the data model is only documentation, then
I think we've wasted a lot of time on this.  Just as resource
metadata isn't very helpful without registries
that give us common access to metadata from many resources, data model
metadata isn't very helpful unless we have APIs that give us common
access to datasets which implement these models.

What are the software capabilities that the data models are associated with?

	Regards,
	Tom

> The data model merely defines how the spectrum is put together and what
> the elements are, so that you can write software to deal with it.  It is
> no different from getting a spectrum from an archive or data analysis
> system now, except that more effort has gone into formally documenting
> the science data model of the spectrum, and the representation in terms
> of some external data format.
> 
>  


