Bug 1372: background() clears background to incorrect color when passing in a hex literal
Last modified: 2010-06-05 03:32

Status: ASSIGNED
Resolution: -
Priority: P2
Severity: normal
Reporter: Andor Salga
Assigned To: REAS

Description:   Opened: 2009-11-20 08:25
1.) Version: 1.0.9

2.) Hardware:
Macbook Pro
OS X (10.5.8)
2.2GHz Intel Core 2 Duo
2GB DDR2 RAM
GeForce 8600M GT 128MB

3.)
// here is some example code:
import processing.opengl.*;
size(400, 400, OPENGL);
background(0x0000FF);  // expected: blue background; observed: white

5.) background() can accept a single hex value in the format 0x000000 or #000000.
Passing the value 0x0000FF should produce a blue background; instead, it is white.
Similarly, passing 0x0000FF00 produces a green background instead of blue.
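
For reference, a minimal sketch (hypothetical, assuming the same 1.0.9/OPENGL setup reported above) that exercises both reported cases:

import processing.opengl.*;

void setup() {
  size(400, 400, OPENGL);
  background(0x0000FF);      // reported: renders white instead of blue
  //background(0x0000FF00);  // reported: renders green instead of blue
}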
Additional Comment #1 From fry 2009-11-20 08:41
That's a typo in the reference. If you use the 0x syntax, you have to include
eight digits; the first two are the alpha value. If you use the # syntax,
it's six digits.
This bug is now being tracked here.
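
A minimal sketch of the corrected calls per the comment above (either form should give a blue background):

import processing.opengl.*;

void setup() {
  size(400, 400, OPENGL);
  background(0xFF0000FF);  // 0x syntax: eight digits (AARRGGBB), opaque blue
  //background(#0000FF);   // # syntax: six digits (RRGGBB), also blue
}

This also appears to explain the report: 0x0000FF is simply the integer 255, which background() treats as a grayscale value (white), while 0x0000FF00 carries 0xFF in the green position and so renders green.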