Omnimaga

Calculator Community => Casio Calculators => Topic started by: Spenceboy98 on June 16, 2012, 12:31:00 am

Title: Background Color
Post by: Spenceboy98 on June 16, 2012, 12:31:00 am
Is there a way to change the background color of the screen without storing a gigantic sprite?
Title: Re: Background Color
Post by: Eiyeron on June 16, 2012, 06:12:24 pm
Memcpy?
Title: Re: Background Color
Post by: blue_bear_94 on June 16, 2012, 06:14:57 pm
You mean memset.
Title: Re: Background Color
Post by: Spenceboy98 on June 16, 2012, 09:08:29 pm
How do I use memset?
Title: Re: Background Color
Post by: Eiyeron on June 17, 2012, 02:15:33 am
Yup, I was tired and sleepy.

So, you use memset like this:
void * memset ( void * ptr, int value, size_t num );
ptr will be a pointer to the screen, value the color, and num the number of pixels to change.
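
For illustration, a minimal sketch of a full-screen fill on the Prizm, assuming the 0xA8000000 screen pointer mentioned later in this thread and a 384x216, 16-bits-per-pixel display (the macro and function names are made up for this example):

#include <string.h>
#include <stdint.h>

/* Assumed values: 0xA8000000 is the VRAM pointer from this thread,
   384x216 is the visible screen, 2 bytes per pixel (RGB565). */
#define VRAM_BASE   ((void *)0xA8000000)
#define LCD_WIDTH   384
#define LCD_HEIGHT  216

/* Fill every byte of VRAM with the same value:
   0x00 gives a black screen, 0xFF gives a white one. */
static void fill_screen_byte(uint8_t value)
{
    memset(VRAM_BASE, value, LCD_WIDTH * LCD_HEIGHT * sizeof(uint16_t));
}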
Title: Re: Background Color
Post by: blue_bear_94 on June 17, 2012, 02:15:06 pm
Also, memset returns the first argument.
Title: Re: Background Color
Post by: Spenceboy98 on June 17, 2012, 05:17:51 pm
Do you happen to know what number Brown is?
Title: Re: Background Color
Post by: Juju on June 17, 2012, 08:10:25 pm
0xA8000000 it is.
Title: Re: Background Color
Post by: Spenceboy98 on June 17, 2012, 08:25:51 pm
0xA8000000 it is.

Brown?
Title: Re: Background Color
Post by: Juju on June 17, 2012, 11:55:04 pm
Actually, get the RGB value for brown, then encode it in 5-6-5 RGB.

So, according to my calculations, 0x9A60.

0xA8000000 is the pointer to the screen, I got confused, sorry.
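
For reference, here is one way to do that conversion in C; the exact RGB triple Juju started from isn't stated, but a brown around (155, 77, 0) does pack to 0x9A60 with the usual 5-6-5 truncation:

#include <stdint.h>

/* Pack 8-bit-per-channel RGB into RGB565 by keeping the top
   5/6/5 bits of each channel. */
static uint16_t rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* rgb565(155, 77, 0) == 0x9A60, a brown. */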
Title: Re: Background Color
Post by: MPoupe on June 18, 2012, 04:50:15 am
So, you use memset like this:
void * memset ( void * ptr, int value, size_t num );
ptr will be a pointer to the screen, value the color, and num the number of pixels to change.
memset works with bytes, so you can use it to fill the screen, but only for a few colors where the low byte equals the high byte.
Black or white is OK, but brown (0x9A60) is not, because 0x9A != 0x60.
Also, num is NOT the number of pixels but the number of bytes, i.e. number of pixels * sizeof(unsigned short), because 1 pixel (one unsigned short) is 2 bytes.
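
Since memset can't produce a value like 0x9A60, a plain loop over VRAM is the straightforward alternative; a minimal sketch, assuming the 0xA8000000 screen pointer and 384x216 resolution from above:

#include <stdint.h>

/* Assumptions: 0xA8000000 is the VRAM pointer mentioned above,
   and the visible screen is 384x216 pixels of RGB565. */
static void fill_screen_color(uint16_t color)
{
    volatile uint16_t *vram = (volatile uint16_t *)0xA8000000;
    unsigned int i;
    for (i = 0; i < 384 * 216; i++)
        vram[i] = color;    /* one 16-bit pixel per iteration */
}

/* fill_screen_color(0x9A60); gives a brown background. */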