Reputation: 1823
How do I get the stream bytes and convert them back to an image? Below is my code, but the image I get is blank. Why? Is something wrong with the code?
I get the bytes from my NSOutputStream, convert them back to NSData, then convert the NSData to an image.
- (void)stream:(NSStream *)theStream handleEvent:(NSStreamEvent)streamEvent {
    switch (streamEvent) {
        case NSStreamEventOpenCompleted:
            NSLog(@"Stream opened");
            break;
        case NSStreamEventHasBytesAvailable:
            if (theStream == inputStream) {
                uint8_t buffer[5000];
                NSInteger len;
                while ([inputStream hasBytesAvailable]) {
                    len = [inputStream read:buffer maxLength:sizeof(buffer)];
                    NSLog(@"len=%ld", (long)len);
                    if (len > 0) {
                        NSData *pictureData = [NSData dataWithBytes:buffer length:len];
                        UIImage *imagess = [[UIImage alloc] initWithData:pictureData];
                        [imagesview setImage:imagess];
                    }
                }
            }
            break;
    }
}
Upvotes: 0
Views: 2677
Reputation: 41642
Your use of NSOutputStream is a bit confusing. While the far end may be sending the image data with one, you must be using an NSInputStream to receive it, no?
What you should be doing in NSStreamEventHasBytesAvailable is simply appending the bytes to a mutable data object. When you finally get NSStreamEventEndEncountered, that is when you build your image:
{
    NSMutableData *data; // ivar that accumulates the incoming image bytes
}

// Initialize it somewhere before the bytes start arriving, e.g. when the stream opens:
data = [NSMutableData new];

case NSStreamEventHasBytesAvailable:
    if (theStream == inputStream) {
        uint8_t buffer[5000];
        NSInteger len;
        while ([inputStream hasBytesAvailable]) {
            len = [inputStream read:buffer maxLength:sizeof(buffer)];
            NSLog(@"len=%ld", (long)len);
            if (len > 0) {
                // Append this chunk instead of trying to build an image from it
                [data appendBytes:(const void *)buffer length:len];
            }
        }
    }
    break;

case NSStreamEventEndEncountered:
{
    if (theStream == inputStream) {
        // All bytes have arrived; data now holds the complete image
        UIImage *imagess = [[UIImage alloc] initWithData:data];
        [imagesview setImage:imagess];
    }
}   break;
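In case it helps, here is a minimal sketch of how the receiving side might set up the NSInputStream so the delegate callback above actually fires. The method name, host, and port are placeholders, and inputStream and data are assumed to be ivars of a class adopting NSStreamDelegate:

- (void)openInputStreamToHost:(NSString *)host port:(UInt32)port {
    CFReadStreamRef readStream = NULL;
    CFWriteStreamRef writeStream = NULL;
    // Create a socket stream pair to the sender (hypothetical host/port)
    CFStreamCreatePairWithSocketToHost(NULL, (__bridge CFStringRef)host,
                                       port, &readStream, &writeStream);

    inputStream = (__bridge_transfer NSInputStream *)readStream;
    inputStream.delegate = self;
    [inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop]
                           forMode:NSDefaultRunLoopMode];
    [inputStream open];

    data = [NSMutableData new]; // ready to accumulate the incoming image bytes

    // The write stream is not needed on the receiving side here
    if (writeStream) CFRelease(writeStream);
}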
Upvotes: 1