Reputation: 184
I'm trying to cast the screen from an iOS device to an Android device.
I'm leveraging ReplayKit on iOS to capture the screen and VideoToolbox for compressing the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression.
While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android.
Data transmission over the TCP socket seems to be functioning correctly.
My question is:
Is there a way to ensure a common container format for H.264 compression and decompression across iOS and Android platforms?
Here's a breakdown of the iOS sender details:

- Device: iPhone 13 mini running iOS 17
- Development Environment: Xcode 15 with a minimum deployment target of iOS 16
- Screen Capture: ReplayKit for capturing the screen and obtaining CMSampleBuffers
- Video Compression: VideoToolbox for H.264 compression
- Compression Properties:
  - kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
  - kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
  - kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
  - kVTCompressionPropertyKey_RealTime: true (real-time encoding)
  - kVTCompressionPropertyKey_Quality: 1 (highest quality)
- NAL Unit Handling: a custom header is added to NAL units

Android receiver details:

- Device: Redmi 7A running Android 10
- Video Decoding: MediaCodec API for receiving and decoding the H.264 stream
I also tried switching to the Main profile on iOS, with no change.
Upvotes: -1
Views: 145
Reputation: 11
The output of VideoToolbox is in AVCC format, where each NAL unit is prefixed with a 4-byte big-endian length field. Android's MediaCodec decoder expects Annex B format, where each NAL unit is prefixed with a 0x00000001 start code and the SPS/PPS parameter sets appear in-band. You need to convert AVCC to Annex B before sending. You can refer to the following code:
#define ANNEXB_NALU_STARTCODE_SIZE 4
#define AVCC_NALU_LENGTH_HEADER_SIZE 4

NSMutableData *naluData = [[NSMutableData alloc] init];
const char startCode[] = "\x00\x00\x00\x01";

// A sample is a keyframe when the kCMSampleAttachmentKey_NotSync attachment is absent.
CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
CFDictionaryRef attachment = (CFDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
bool keyframe = !CFDictionaryContainsKey(attachment, kCMSampleAttachmentKey_NotSync);

if (keyframe) {
    // Prepend the SPS and PPS before every keyframe so the receiver can
    // configure its decoder from the stream alone.
    CMFormatDescriptionRef format = CMSampleBufferGetFormatDescription(sampleBuffer);
    const uint8_t *sps, *pps;
    size_t spsSize, ppsSize;
    size_t parmCount;
    OSStatus statusCode = CMVideoFormatDescriptionGetH264ParameterSetAtIndex(format, 0, &sps, &spsSize, &parmCount, NULL);
    if (statusCode != noErr) return;
    statusCode = CMVideoFormatDescriptionGetH264ParameterSetAtIndex(format, 1, &pps, &ppsSize, &parmCount, NULL);
    if (statusCode != noErr) return;
    [naluData appendBytes:startCode length:ANNEXB_NALU_STARTCODE_SIZE];
    [naluData appendBytes:sps length:spsSize];
    [naluData appendBytes:startCode length:ANNEXB_NALU_STARTCODE_SIZE];
    [naluData appendBytes:pps length:ppsSize];
}

CMBlockBufferRef dataBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
size_t totalLength;
char *dataPointer;
OSStatus statusCodeRet = CMBlockBufferGetDataPointer(dataBuffer, 0, NULL, &totalLength, &dataPointer);
if (statusCodeRet == noErr) {
    size_t bufferOffset = 0;
    // Walk the AVCC buffer: each NAL unit starts with a 4-byte big-endian
    // length field; replace it with an Annex B start code.
    while (bufferOffset + AVCC_NALU_LENGTH_HEADER_SIZE < totalLength) {
        uint32_t naluLength = 0;
        memcpy(&naluLength, dataPointer + bufferOffset, AVCC_NALU_LENGTH_HEADER_SIZE);
        naluLength = CFSwapInt32BigToHost(naluLength);
        [naluData appendBytes:startCode length:ANNEXB_NALU_STARTCODE_SIZE];
        [naluData appendBytes:(dataPointer + bufferOffset + AVCC_NALU_LENGTH_HEADER_SIZE)
                       length:naluLength];
        bufferOffset += AVCC_NALU_LENGTH_HEADER_SIZE + naluLength;
    }
    // Your code for sending naluData ...
}
Upvotes: 0