Reputation: 429
I know there is a plethora of frameworks for macOS that make working with graphics easy. In this case I want to make my life hard on purpose, for the learning experience and for customizability.
I want to simply make a window of X by Y pixels, make an array of X by Y pixels, fill the array with color data, and have the window display those pixels.
Basically I want to toy around with making my own little engine to draw things so I can learn from the experience. And I don't want to use OpenGL, Metal, or any other framework that does the hard work for me. Simply give me a window and let me color the pixels one by one. Once I learn how to do what I want to do there I can move up to a higher level framework.
So what in macOS will let me do just that? I've looked at a couple of the frameworks but there are too many to make heads or tails of where to really start. Once I know where to start I can figure out the rest from there.
So far the best idea I have is to use Core Graphics: create a pixel buffer, draw that buffer across the whole window, and ignore all the other fancy stuff Core Graphics does for me. I'd like to go a level lower than that if possible.
Upvotes: 3
Views: 1457
Reputation: 90641
I'm not sure if it's possible. It's certainly not officially supported. There are some old, long-deprecated APIs for accessing the framebuffer for the display (not a window). I have no idea if they still work. You would use the Quartz Display Services API to first capture the display(s) and then obtain the framebuffer address, for example with CGDisplayAddressForPosition().
Does it really matter for your purposes if you're accessing the real framebuffer vs. accessing an off-screen raster buffer and blitting that to screen with a minimal high-level API call?
Something you might try is using an IOSurface as the contents of a CALayer. It's supported but not clearly documented. Obviously, there's a lot of high-level stuff going on to get the IOSurface contents to show in the layer, but you don't necessarily need to deal with it, as such.
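To make the IOSurface suggestion concrete, here is a macOS-only sketch (untested here, since it only compiles against the IOSurface framework): create a surface with the required property dictionary, lock it for CPU access, write pixels through its base address, and then hand it to a layer from Objective-C or Swift.

```c
/* macOS-only sketch: clang -framework IOSurface -framework CoreFoundation */
#include <IOSurface/IOSurface.h>
#include <CoreFoundation/CoreFoundation.h>

static IOSurfaceRef make_surface(int width, int height)
{
    int bpe = 4;            /* bytes per element: BGRA */
    uint32_t fmt = 'BGRA';  /* 32-bit BGRA pixel format code */
    CFNumberRef w = CFNumberCreate(NULL, kCFNumberIntType, &width);
    CFNumberRef h = CFNumberCreate(NULL, kCFNumberIntType, &height);
    CFNumberRef b = CFNumberCreate(NULL, kCFNumberIntType, &bpe);
    CFNumberRef f = CFNumberCreate(NULL, kCFNumberSInt32Type, &fmt);

    const void *keys[]   = { kIOSurfaceWidth, kIOSurfaceHeight,
                             kIOSurfaceBytesPerElement, kIOSurfacePixelFormat };
    const void *values[] = { w, h, b, f };
    CFDictionaryRef props = CFDictionaryCreate(NULL, keys, values, 4,
                                &kCFTypeDictionaryKeyCallBacks,
                                &kCFTypeDictionaryValueCallBacks);
    IOSurfaceRef surface = IOSurfaceCreate(props);
    CFRelease(props); CFRelease(w); CFRelease(h); CFRelease(b); CFRelease(f);
    return surface;
}

static void fill_blue(IOSurfaceRef s, int width, int height)
{
    IOSurfaceLock(s, 0, NULL);                   /* lock before CPU writes  */
    uint8_t *base = IOSurfaceGetBaseAddress(s);
    size_t stride = IOSurfaceGetBytesPerRow(s);  /* rows may be padded --
                                                    never assume width*4   */
    for (int y = 0; y < height; y++) {
        uint8_t *row = base + y * stride;
        for (int x = 0; x < width; x++) {
            row[4*x + 0] = 255;  /* B */
            row[4*x + 1] = 0;    /* G */
            row[4*x + 2] = 0;    /* R */
            row[4*x + 3] = 255;  /* A */
        }
    }
    IOSurfaceUnlock(s, 0, NULL);
}

/* Then, from Objective-C on a layer-backed view:
 *     view.layer.contents = (__bridge id)surface;
 * and reassign/invalidate the contents after each frame you draw. */
```

Note the stride handling: IOSurface rows are often padded for alignment, which is one of the few surprises at this level.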
For good measure, I'll mention the NSDrawBitmap() function in AppKit. It's perhaps the most low-level-style interface. That's not to say it's the lowest level of the stack. For example, it's very likely that internally it just constructs a CGImage from the bitmap data and then draws that.
Upvotes: 2
Reputation: 207678
You could try with SFML on macOS if you wish - see here.
Or with CImg (which needs XQuartz) - see the other answer to the same question.
Or you can mess about with libsvga in a VirtualBox Linux on your Mac - see here.
Or you can write to the framebuffer on a Raspberry Pi if you have $30 to spare.
Or you can go even lower-level with the Raspberry Pi framebuffer - see here.
Upvotes: 2