Reputation: 43
How do I set the bits of a variable to the pattern I want? For example, if I have to print this sequence, how do I proceed?
11010010 11010010 11010010 11010010
I wrote code that prints the bits and separates them into this configuration, but I don't know how to set them to the values I want.
#include <stdio.h>
#include <limits.h>

int a;
void stampabit(a);

int main()
{
    int i;
    int n = sizeof(int) * CHAR_BIT;
    int mask = 1 << (n-1);
    for (i = 1; i <= n; ++i){
        putchar(((a & mask) == 0) ? '0' : '1');
        a <<= 1;
        if (i % CHAR_BIT == 0 && i < n)
            putchar(' ');
    }
}
Upvotes: 3
Views: 423
Reputation:
Edit:
If you just need to set some bits, use bitwise OR, and if you need to clear some bits, use bitwise AND:
uint32_t a = 1;
a |= 0xD2D2D2D2; // set the bits of the pattern 11010010 11010010 11010010 11010010
a &= ~1;         // clear the lowest bit (AND with the mask ~1)
It is also good to separate the display code from the main code to keep things clear:
#include <stdio.h>
#include <stdint.h>
void print_bits(uint32_t u){
    int i = 31;
    do {
        putchar(u & 0x80000000 ? '1' : '0');
        u <<= 1;
        if (i % 8 == 0 && i != 0) // byte separator, but no trailing space
            putchar(' ');
    } while (--i >= 0);
    putchar('\n');
}
int main(){
    uint32_t a = 1;
    a |= 0xD2D2D2D2; // set the bits of the pattern with OR
    a &= ~1;         // clear the lowest bit with AND
    print_bits(a);   // output: 11010010 11010010 11010010 11010010
}
Old:
If you just need to set some bits, use bitwise OR like this:
void stampabit(int setBits){
    a |= setBits;
}
To set your pattern 11010010 11010010 11010010 11010010, call:
stampabit(0xD2D2D2D2); // stampabit(0b11010010110100101101001011010010); with compilers that support binary literals
Working example:
#include <stdio.h>
#include <limits.h>
//#include <stdint.h>

int a = 0;

void stampabit(int setBits){
    a |= setBits;
}

int main()
{
    int i;
    stampabit(0xD2D2D2D2); // stampabit(0b11010010110100101101001011010010);
    int n = sizeof(int) * CHAR_BIT; // number of bits in int (16, 32, 64, ... bits)
    int mask = 1 << (n-1);          // sign bit mask
    for (i = 1; i <= n; ++i){
        putchar(((a & mask) == 0) ? '0' : '1');
        a <<= 1;
        if (i % CHAR_BIT == 0 && i < n)
            putchar(' ');
    }
}
Output:
11010010 11010010 11010010 11010010
Some notes:
This is just sample code and it works, but using unsigned types clearly shows you don't need the sign bit, and it avoids the undefined behavior of shifting signed values.
Using and shifting local variables is better than using and shifting globals, unless the global state is intentional.
Last but not least, if you don't want the platform-dependent width of int, use int32_t or uint32_t from
#include <stdint.h>
I hope this helps.
Upvotes: 0
Reputation: 16213
You must shift the mask instead of shifting the variable.
#include <stdio.h>
#include <limits.h>

unsigned int a = 0xAA55AA55;

int main()
{
    size_t i;
    unsigned int n = sizeof(int) * CHAR_BIT;
    unsigned int mask = 1u << (n-1); // unsigned literal avoids shifting into the sign bit
    for (i = 1; i <= n; ++i){
        putchar(((a & mask) == 0) ? '0' : '1');
        mask >>= 1;
        if (i % CHAR_BIT == 0 && i < n)
            putchar(' ');
    }
    putchar('\n');
}
Output will be
10101010 01010101 10101010 01010101
Changing the value of a to 0xD2D2D2D2, as you want, the output will be
11010010 11010010 11010010 11010010
Upvotes: 1