You are given two 32-bit numbers, N and M, and two bit positions, i and j.
Write a method to set all bits between i and j in N equal to M (i.e., M
becomes a substring of N, starting at bit j and ending at bit i).
EXAMPLE:
Input: N = 10000000000, M = 10101, i = 2, j = 6
Output: N = 10001010100
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int N, M, i, j;
    printf("Enter value of N\n");
    scanf("%d", &N);
    printf("Enter value of M\n");
    scanf("%d", &M);
    printf("Enter value of i\n");
    scanf("%d", &i);
    printf("Enter value of j\n");
    scanf("%d", &j);

    /* build a run of j ones... */
    int a = 0, k;
    for (k = 0; k < j; k++)
    {
        a = a << 1;
        a = a | 1;
    }
    /* ...then shift the run left by i positions */
    for (k = 0; k < i; k++)
    {
        a = a << 1;
    }
    N = N & (~a);               /* clear the masked bits of N */
    printf("value of N is %d\n", N);
    for (k = 0; k < i; k++)
        M = M << 1;             /* align M with bit position i */
    N = N | M;                  /* merge M into the cleared region */
    printf("value of N is %d\n", N);
    getchar();
    return 0;
}
Doesn't this give us the wrong mask? Say i = 2 and j = 6. It produces the
mask (i.e., ~a)
1111111100000011
but I think only 5 zeros are needed to cover bits 2 through 6, whereas this
mask has 6 zeros, covering bits 2 through 7. Please tell me whether the
above program is correct or not -- check it by giving any input whose 7th
bit is set. Thanks in advance.
--
You received this message because you are subscribed to the Google Groups
"Algorithm Geeks" group.