Ocean Color Forum
I'd like to be able to use SeaDAS with data that has discrete levels (e.g., the Longhurst biogeochemical provinces). In this case, the map coloring theorem tells us that only a few colors are needed, although it may be convenient to use more. Suppose I have a SeaDAS mapped image with levels 0-100 (for instance). I can generate a color table that assigns colors to the first 101 levels and sets the rest to some default value, but when I load the table in SeaDAS, step C. in sdsloadct.pro decimates the input table to squeeze 256 levels into the portion of the SeaDAS palette reserved for data, i.e., not the intended assignment. It would be helpful to have an additional option that reads only as many entries from a color table as are needed for the data section of the SeaDAS palette.
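To illustrate the problem, here is a minimal Python sketch of what a linear decimation from 256 entries down to a 197-entry data section does to a categorical table. The resampling formula is my guess at the kind of thing step "C" does, not the actual code in sdsloadct.pro:

```python
# Sketch of the decimation problem: a 256-entry color table is linearly
# resampled to fit a 197-entry data section, so a categorical table
# that assigns one color per level 0-100 no longer maps each level to
# its intended color. The exact algorithm in sdsloadct.pro may differ;
# this is only an illustration.

def decimate(table, n_out):
    """Linearly resample a color table to n_out entries."""
    n_in = len(table)
    return [table[i * n_in // n_out] for i in range(n_out)]

# A hypothetical categorical table: levels 0-100 each get a distinct
# color index; the remaining 155 entries are a default (here -1).
full_table = list(range(101)) + [-1] * 155      # 256 entries
data_section = decimate(full_table, 197)        # what SeaDAS keeps

print(data_section[:8])  # -> [0, 1, 2, 3, 5, 6, 7, 9]: level 4 now shows color 5
```

The point is that even though 197 slots are plenty for 101 categories, a blind 256-to-197 resampling skips entries, so some levels land on a neighboring category's color.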
I may be able to get around this by careful recoding to assign a color index to each pixel, but then the pixel values are no longer meaningful (e.g., several regions may be assigned the same color).
Suggestions? Could categorical data be displayed using something like the existing flags tools?
If I've understood your problem correctly, I believe there is a way to do what you want with relative ease (though the solution *might* pose additional problems).
SeaDAS allows you to modify the color table configuration. Under the 'Utility' section of the SeaDAS Main Menu there is a function 'setlut' (this can also be reached via the Seadisp Main Menu under 'Global Setup'). The 'setlut' widget lets you modify the number of concurrent color tables, as well as set each color table's index range.
The solution for you would be to redefine the color table index ranges. The default is two color tables, the first filling 0-196 and the second 197-244 (the remaining colors are reserved for display annotations). You would want to redefine the first color table's index range to be 0-100. You should then trim your color table to the desired 101 colors (i.e., remove entries 101-255). Then, when you load this 101-element color table into the first color table with the 'Load Color' function, the data will be mapped to the appropriate colors.
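A quick sketch of the trimming step in Python. The one-triple-per-line ASCII layout here is an assumption, and `provinces_101.ct` is a made-up filename; save an existing table with 'Save Color' and match whatever format it actually uses:

```python
# Trim a 256-entry RGB table to its first 101 entries before loading,
# per the advice above. The ASCII layout ("R G B" per line) and the
# filename are assumptions for illustration only.

def trim_table(entries, n_keep=101):
    """Keep only the first n_keep (r, g, b) entries."""
    return entries[:n_keep]

def write_ascii_ct(entries, path):
    with open(path, "w") as f:
        for r, g, b in entries:
            f.write(f"{r} {g} {b}\n")

# Hypothetical example: a gray ramp, trimmed from 256 to 101 entries.
ramp = [(i, i, i) for i in range(256)]
write_ascii_ct(trim_table(ramp), "provinces_101.ct")
```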
I hope I've not confused you further, and that this works for you.
I'm able to verify (using the 'Change Color' tool) that the LUT has not lost any colors, but in the display the color assignments are still not correct. If I use, say, 0 for region X and 1 for region Y, and set the scale to use raw values from 0. to 255., then 0 and 1 come up with the same color. Adding a small constant to each data value with a user function produces different color confusions. This suggests that data values are scaled and converted to byte values by a decimation analogous to the one applied to the color table in step "C".
Another way to check a color display with a user-specified CT is to read the CT directly from an ASCII file:
(1) Use an IDL predefined CT (e.g., 16 Level) as a reference. Save this CT as an ASCII file:
Functions->Color LUT->Save Color
(2) Modify this CT and load it back to check the display.
Color levels can be reduced from 256 to 16, 8, 4, and so on. The display looks right to me.
Please send me an email and I can show you my screen display and my demo CT.
I have this working now. Thanks for all the suggestions. Two "tricks" are required:
1. When loading the ASCII LUT, set "Flags, bottom" to the number of entries being used from the color table. This avoids the decimation step "C" in sdsloadct.pro. Don't change the "valid range" from the default [0,196]. You can use the color table editor to check that the colors have been read correctly.
2. With the image loaded, use "Setups/rescale" to scale the data from 0 to 196 using raw values. This surprised me: I expected data read as byte values to be scaled to [0,255] to preserve the actual levels, but it seems that if any scaled data values exceed 196, everything is rescaled to the [0,196] "valid range".
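The behavior in trick 2 is consistent with a linear byte-scaling onto the [0,196] valid range. The formula below (including the truncation, which reproduces the observed collapse of levels 0 and 1 under a 0-255 scale) is a guess that matches what I see on screen, not the actual SeaDAS code:

```python
# A guess at the rescaling inferred in trick 2: raw values spanning
# [vmin, vmax] are mapped linearly onto [0, out_max] and truncated to
# integer color indices. With the rescale range set to exactly 0-196,
# the mapping becomes the identity, so categorical levels survive.
# The real SeaDAS formula may differ.

def rescale(value, vmin, vmax, out_max=196):
    """Linearly map value from [vmin, vmax] onto [0, out_max]."""
    return int((value - vmin) * out_max / (vmax - vmin))

print(rescale(0, 0, 255), rescale(1, 0, 255))  # 0 0 -> levels 0 and 1 collapse
print(rescale(0, 0, 196), rescale(1, 0, 196))  # 0 1 -> identity, levels preserved
```

This would explain both symptoms: with a 0-255 scale, adjacent levels fall into the same output index, and adding a small constant just shifts which levels collide.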
I'd be grateful if someone can tell me the location of the code that does the scaling of the data values.