This targets #270
* Use the JS URL object to detect whether the user gives a URL (see the sketch after this list)
* There is an important point: a user can give a path pointing to a local HiPS. In that case Aladin Lite will think the path is an ID, but it is not. That is why, after failing to fetch its properties from the MocServer, we simply reconsider it as a URL so that a local HiPS can be loaded afterwards.
* Use Utils.copy2Clipboard in the contextmenu and the shareview
* Check for a mousedown before computing the distance from the position where the mouse was clicked
* The smartphone two-finger pinch rotation between longitude π and 2π seems to have been fixed. The bug seems to have been there for a long time.
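
A minimal sketch of the URL detection mentioned in the first item, assuming a hypothetical `input` string; the actual code path in Aladin Lite may differ:

```js
// Try the JS URL constructor; it throws for anything that is not an absolute URL.
let isUrl = false;
try {
    new URL(input);   // parses only absolute URLs
    isUrl = true;
} catch (e) {
    // Not an absolute URL: treat it as a HiPS ID and query the MocServer for its
    // properties; if that fails, fall back to loading it as a (local) path/URL.
}
```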
Change the heuristic for showing the contextual menu. Before, we waited 100 ms before changing the cuts, but this does not work for users doing a long right click without moving. Now we look at the mouse offset after right-clicking: if it exceeds 10 px, the contextual menu will not open and the cuts will change instead.
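
A rough sketch of that offset check, with illustrative names (`el` and `openContextMenu` are hypothetical placeholders, not the actual Aladin Lite internals):

```js
let rightClickPos = null;

el.addEventListener('mousedown', (e) => {
    if (e.button === 2) {
        rightClickPos = { x: e.clientX, y: e.clientY };
    }
});

el.addEventListener('mouseup', (e) => {
    if (e.button === 2 && rightClickPos) {
        const dx = e.clientX - rightClickPos.x;
        const dy = e.clientY - rightClickPos.y;
        // More than 10 px of drag means the user was adjusting the cuts,
        // so the contextual menu is not opened.
        if (Math.hypot(dx, dy) <= 10) {
            openContextMenu(e);   // hypothetical helper
        }
        rightClickPos = null;
    }
});
```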
Changing MOC settings after their creation was not possible. This PR fixes it.
It is also possible to directly set the 'color', 'fillColor', 'opacity' and 'lineWidth' MOC properties without calling reportChange afterwards. These setters automatically notify the wasm side of the change of the MOC options and update the view.
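
A usage sketch of those setters; the property names come from this PR, while the creation call and the color values are assumptions based on the public A.MOCFromURL API:

```js
const moc = A.MOCFromURL('https://example.org/my-moc.fits', { opacity: 0.3 });
aladin.addMOC(moc);

// Later, change the options directly; no reportChange call is needed, the
// setters notify the wasm side and refresh the view.
moc.color = '#ff0000';
moc.fillColor = 'rgba(255, 0, 0, 0.2)';
moc.opacity = 0.5;
moc.lineWidth = 2;
```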
* fix: returning nothing or an invalid value from it now displays the source with the square shape. Before, the source was simply not displayed.
* Possibility to return one of the strings "rhomb", "circle", "square", "cross", "triangle" from it (see the sketch after this list).
* Set the color of a footprint to the color of its Catalog. This overwrites the color that could have been given on the footprint directly. To compensate for that, I think it could be great to allow a color function as well.
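
A hedged sketch of the per-source shape callback described above; the `shape` option name, its function form, and the magnitude field are illustrative assumptions:

```js
const cat = A.catalog({
    name: 'my sources',
    color: 'cyan',
    shape: (source) => {
        // Returning one of "rhomb", "circle", "square", "cross", "triangle"
        // selects that marker; returning nothing or an invalid value now
        // falls back to the square shape instead of hiding the source.
        if (source.data && source.data.mag < 10) {
            return 'triangle';
        }
        return 'circle';
    }
});
aladin.addCatalog(cat);
```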
Two modes of display (see the sketch below):
* The ICRSd & GALACTIC frames set the formatting of the grid labels to decimal, with the digit precision computed from the selected grid step
* The ICRS frame sets the formatting to sexagesimal, in the format: deg min sec.ddd
This fixes #172
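
A minimal usage sketch, assuming aladin.setFrame and the frame identifiers below (taken from the wording above; the exact accepted strings may differ):

```js
aladin.setFrame('ICRSd');     // decimal grid labels, precision derived from the grid step
aladin.setFrame('GALACTIC');  // decimal grid labels as well
aladin.setFrame('ICRS');      // sexagesimal grid labels: deg min sec.ddd
```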
Float textures coming from BITPIX < 0 FITS images are sent to the GPU as RGBA8UI textures. The float is decoded from the vec4 RGBA in the shader. The decoded float can then be tested against NaN/Inf in the shader.
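
For illustration, a CPU-side JavaScript sketch of the same bit reinterpretation the shader performs (the channel order and endianness here are assumptions; the actual GLSL differs):

```js
// Reinterpret the 4 bytes of an RGBA8UI texel as an IEEE-754 float32,
// then test the decoded value against NaN/Inf.
function decodeFloatFromRGBA(r, g, b, a) {
    const bytes = new Uint8Array([r, g, b, a]);
    return new DataView(bytes.buffer).getFloat32(0, true); // little-endian assumption
}

const value = decodeFloatFromRGBA(0, 0, 192, 127); // bytes of 0x7fc00000 -> NaN
if (Number.isNaN(value) || !Number.isFinite(value)) {
    // blank/NaN FITS pixel: skip it, just like the shader test does
}
```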