Hygison Brandao

Reputation: 720

360 image visionOS Resolution

Using a 72MP 360° image taken with an Insta360 X3, I would like to load it on my Vision Pro and see it surrounding me completely, as you would expect from a 360° image. I was able to do this by following the steps described HERE. The problem is the quality: in a 2D window the image looks great, but not in the immersive view.

Here is the code:

struct ImmersiveView: View {
    @Environment(AppModel.self) var appModel

    var body: some View {
        RealityView { content in
            content.add(createImmersivePicture(imageName: appModel.activeSpace))
        }
    }

    /// Ref: https://www.createwithswift.com/
    func createImmersivePicture(imageName: String) -> Entity {
        let sphereRadius: Float = 1000
        let modelEntity = Entity()

        // Load the equirectangular image as a raw, uncompressed texture.
        guard let texture = try? TextureResource.load(
            named: imageName,
            options: .init(semantic: .raw, compression: .none)
        ) else {
            assertionFailure("Failed to load texture \(imageName)")
            return modelEntity
        }

        var material = UnlitMaterial()
        material.color = .init(texture: .init(texture))
        modelEntity.components.set(
            ModelComponent(
                mesh: .generateSphere(radius: sphereRadius),
                materials: [material]
            )
        )

        // The negative x-scale turns the sphere inside out so the
        // texture is visible from within.
        modelEntity.scale = .init(x: -1, y: 1, z: 1)
        modelEntity.transform.translation += SIMD3<Float>(0.0, 10.0, 0.0)
        return modelEntity
    }
}

Since the quality is a problem, I thought about reducing the radius of the sphere or decreasing the scale, but in both cases nothing changes. I have tried modelEntity.scale = .init(x: -0.5, y: 0.5, z: 0.5), and also let sphereRadius: Float = 2000 and let sphereRadius: Float = 500, but nothing changes. I also get this warning:
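The other obvious knob besides geometry is how the texture itself is loaded. As a sketch (this uses RealityKit's `TextureResource.CreateOptions.mipmapsMode`, and I have not verified that it improves anything on device), the load call could request a color semantic and a generated mip chain instead of raw, uncompressed data:

```swift
// Sketch only: load the image as color data and ask RealityKit to
// generate mipmaps. Whether this changes the perceived sharpness on
// device is an assumption, not a verified fix.
var options = TextureResource.CreateOptions(semantic: .color)
options.mipmapsMode = .allocateAndGenerateAll
let texture = try? TextureResource.load(named: imageName, options: options)
```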

IOSurface creation failed: e00002c2 parentID: 00000000 properties: {
    IOSurfaceAddress = 4651830624;
    IOSurfaceAllocSize = 35478941;
    IOSurfaceCacheMode = 0;
    IOSurfaceMapCacheAttribute = 1;
    IOSurfaceName = CMPhoto;
    IOSurfacePixelFormat = 1246774599;
}
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceCacheMode
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfacePixelFormat
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceMapCacheAttribute
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceAddress
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceAllocSize
IOSurface creation failed: e00002c2 parentID: 00000000 property: IOSurfaceName

Is there anything I can do to make the radius matter, or otherwise improve the quality itself?

Screenshot of the Vision Pro screen below. I know it looks kind of OK here, but when you are wearing the device, the quality should be better. (I reduced the quality here so I could upload it, but you can check the full quality at -> https://github.com/hygison/Space3DVisionPro/blob/main/Space3DVisionPro/Assets.xcassets/building_park.imageset/building_park.jpg)

Note: the quality away from the center looks worse because visionOS automatically degrades the quality of whatever you are not looking at. Since I am looking at the center, that is where the image quality is highest, and even there I am not happy with it. Even though the image looks OK in 2D, in 3D it does not. If you have a Mac with the visionOS simulator, you can try it yourself from my repository. Better-quality images and the code I am using can be found below.

Repository: https://github.com/hygison/Space3DVisionPro

Upvotes: 0

Views: 78

Answers (0)
