Getting wrong colour with LAN API

When I try to use the colour #0099ff, I get the following:
Normalised (0-1) RGB values: R: 0, G: 0.6, B: 1 (as expected)
Normalised (0-1) HSV values: H: 0.5666667, S: 1, V: 1 (as expected; H is scaled from 0-360 down to 0-1, since everything gets multiplied by 0xFFFF afterwards)
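
For reference, here's a minimal sketch of the conversion (not my exact code, and the class/variable names are just for illustration, but it reproduces the numbers above):

```csharp
// Sketch: hex colour -> normalised RGB -> HSV -> 16-bit values for the LAN protocol.
using System;

class ColourDemo
{
    static void Main()
    {
        // #0099ff
        double r = 0x00 / 255.0, g = 0x99 / 255.0, b = 0xFF / 255.0;

        double max = Math.Max(r, Math.Max(g, b));
        double min = Math.Min(r, Math.Min(g, b));
        double delta = max - min;

        // Hue in degrees (0-360); for this colour the max component is blue
        double hueDeg =
            delta == 0 ? 0 :
            max == r   ? 60 * (((g - b) / delta + 6) % 6) :
            max == g   ? 60 * ((b - r) / delta + 2) :
                         60 * ((r - g) / delta + 4);   // 60 * (-0.6 + 4) = 204

        double sat = max == 0 ? 0 : delta / max;        // 1
        double val = max;                               // 1

        // Scale to the 16-bit range the protocol expects (truncating cast)
        ushort hue16 = (ushort)(hueDeg / 360.0 * 0xFFFF);  // 0x9110
        ushort sat16 = (ushort)(sat * 0xFFFF);             // 0xFFFF
        ushort val16 = (ushort)(val * 0xFFFF);             // 0xFFFF

        Console.WriteLine($"H: 0x{hue16:X4}  S: 0x{sat16:X4}  V: 0x{val16:X4}");
    }
}
```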
Entire payload: 31 00 00 34 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 66 00 00 00 00 91 10 FF FF FF FF AC 0D 00 00 00 00 (49 bytes)
The important part (I think): 00 [91 10] [FF FF] [FF FF] [AC 0D] 00 00 00 00
As far as I can tell, all the values are correct, but when I broadcast the packet (from C#) on UDP port 56700, the light turns red instead of the medium blue I was expecting. Any ideas what might be causing this?
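
The broadcast itself is just a plain UDP send, roughly like this (`packet` stands in for the 49 bytes shown above):

```csharp
using System.Net;
using System.Net.Sockets;

// Broadcast the assembled packet to every bulb on the LAN (LIFX listens on UDP 56700)
byte[] packet = new byte[49]; // fill with the 49 bytes shown above
using var udp = new UdpClient();
udp.EnableBroadcast = true;
udp.Send(packet, packet.Length, new IPEndPoint(IPAddress.Broadcast, 56700));
```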

Thanks!

My bad; I was writing the multi-byte values big-endian the whole time, and the protocol expects little-endian. Oops.
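
For anyone hitting the same thing: the bulb reads `91 10` as the little-endian value 0x1091 (4241), which is only about 23 degrees of hue, hence the red. A rough sketch of the fix (assuming the SetColor (102) payload layout of one reserved byte, four uint16 HSBK fields, then a uint32 duration; the class name is just illustrative). BinaryWriter always writes little-endian, so hue 0x9110 goes out as `10 91`:

```csharp
using System.IO;

static class SetColor
{
    // Pack the SetColor (102) payload with explicit little-endian byte order.
    public static byte[] Build(ushort hue, ushort sat, ushort bri, ushort kelvin, uint durationMs)
    {
        using var ms = new MemoryStream();
        using var w = new BinaryWriter(ms);
        w.Write((byte)0);     // reserved
        w.Write(hue);         // 0x9110 -> 10 91 on the wire
        w.Write(sat);         // 0xFFFF -> FF FF
        w.Write(bri);         // 0xFFFF -> FF FF
        w.Write(kelvin);      // 3500   -> AC 0D
        w.Write(durationMs);  // duration in ms, 0 -> 00 00 00 00
        w.Flush();
        return ms.ToArray();
    }
}

// Usage: var payload = SetColor.Build(0x9110, 0xFFFF, 0xFFFF, 3500, 0);
```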
