In attempting to cross-compile a program for both WebGL and Desktop OpenGL, I've noticed that the Attrib type is uint for the former and int for the latter. This makes it hard to manipulate attribute locations in a portable way.
Is there a technical reason why these are different, or could they be brought into alignment?