[QUANTITY] Choosing default accuracy/units values
Martin @ ROE
mch at roe.ac.uk
Tue May 18 07:24:50 PDT 2004
Brian Thomas wrote:
> On Tuesday 18 May 2004 04:10 am, Pierre Didelon wrote:
>
>>Hi Patrick,
>>
>>Patrick Dowler wrote:
>>
>>>I agree that default units and default accuracy/error should follow the
>>>same rules. However, to be successful the VO really needs to get people
>>>(force them if necessary) to not be lazy, so I would prefer making
>>>specification of units and accuracy mandatory (in the schema) and have
>>>some special values that can be easily used, like <unitless> and <exact>
>>>and <unknownUnits> and <unknownAccuracy>. There should be no default
>>>value.
>>
>>Aren't special tag values <units>Number</units>, <accuracy>exact</accuracy>,
>><units>unknown</units> and <accuracy>unknown</accuracy> better than
>>special tag names?
>
>
> I don't think so, in particular for the units. Representing individual
> values as elements (tags) makes it easier to restrict the "valid"
> values for accuracy/units. Tags are also more amenable to use within
> ontologies.
Can't we do enumerations for values too? Consistent tags with special
values would make it much easier to parse and marshal/unmarshal.
Unless we do something like this:
<units>
  <unknown/>
</units>
<accuracy>
  <exact/>
</accuracy>
Which might give us the best of both worlds?
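For what it's worth, here is a rough sketch of how the two options might
look in XML Schema - the type and element names are just placeholders,
and only one of the two <accuracy> declarations would actually appear in
a real schema:

<!-- Option 1: special values, constrained by an enumeration -->
<xsd:simpleType name="AccuracyValue">
  <xsd:restriction base="xsd:string">
    <xsd:enumeration value="exact"/>
    <xsd:enumeration value="unknown"/>
  </xsd:restriction>
</xsd:simpleType>
<xsd:element name="accuracy" type="AccuracyValue"/>

<!-- Option 2: special tag names, i.e. the values become child elements -->
<xsd:element name="accuracy">
  <xsd:complexType>
    <xsd:choice>
      <xsd:element name="exact"/>
      <xsd:element name="unknown"/>
    </xsd:choice>
  </xsd:complexType>
</xsd:element>

Either way the schema can constrain the allowed set, so the argument
seems to come down to which shape is easier for tools to handle.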
As an aside: I've always thought of ontologies as some combination of
description and modelling - how does the choice of tags vs. values make a
difference there?
Cheers,
Martin
--
Martin Hill
Software Engineer, AstroGrid (ROE)
07901 55 24 66