Reputation: 101
I'm working on a Python application that controls mouse movement.
I have absolute mouse position working perfectly, using the Quartz.CoreGraphics library, which exposes some CGEvent commands for mouse control (like "CGEventCreateMouseEvent" and "CGEventPost").
However, I can't find anything in the docs about relative mouse movement. I would really like to simulate actual mouse movements (i.e. "x sideways, y up" instead of "x,y"), because some of the people using my application have multiple monitors, and I imagine it would be a lot easier just to inform the OS that there was a mouse movement rather than setting the position myself.
The nature of my interface also lends itself to relative mouse movement.
In Windows, there is a function in the win32 API that allows for "raw" mouse commands that can do exactly what I am looking for. Is there any way to achieve this in OS X?
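For reference, the absolute positioning described above can be done with a call along these lines (a minimal sketch; it assumes the pyobjc bindings that expose Quartz.CoreGraphics, and guards the import so the delta-free logic is inspectable off macOS):

```python
# Minimal sketch of absolute mouse positioning via Quartz (macOS only).
# Assumes the pyobjc bindings that expose Quartz.CoreGraphics.
try:
    import Quartz
except ImportError:
    Quartz = None  # not on macOS / pyobjc not installed

def move_to(x, y):
    """Warp the cursor to the absolute screen position (x, y).

    Returns the (x, y) pair that was (or would have been) posted,
    so the call is inspectable even where Quartz is unavailable.
    """
    if Quartz is not None:
        event = Quartz.CGEventCreateMouseEvent(
            None,                          # default event source
            Quartz.kCGEventMouseMoved,     # a pure move, no click
            (x, y),                        # absolute position
            Quartz.kCGMouseButtonLeft)     # ignored for move events
        Quartz.CGEventPost(Quartz.kCGHIDEventTap, event)
    return (x, y)
```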
Upvotes: 1
Views: 1758
Reputation: 29657
I do not think it's possible, given the way events are managed. You'll need to capture the old (X, Y) position and calculate the delta yourself, which means you'll run into a problem when the cursor hits the edge of the screen.
The good news is that you can move the mouse, so when the cursor hits an edge you can reposition it to the center of the screen. You can also make the pointer invisible, and you can catch mouse movements with a tracking rectangle that covers the entire screen. In short, you can simulate exactly what you want, but it will take some work.
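The bookkeeping described above could be sketched like this (the class and margin value are illustrative, not from any API; the actual cursor warp is left as a comment so the delta logic stands on its own):

```python
# Sketch of relative-movement bookkeeping: derive deltas from successive
# absolute positions, and recenter the (hidden) cursor whenever it nears
# an edge so it never runs out of room.

class DeltaTracker:
    def __init__(self, screen_w, screen_h, margin=50):
        self.w, self.h, self.margin = screen_w, screen_h, margin
        self.last = (screen_w / 2, screen_h / 2)  # start at center

    def update(self, x, y):
        """Return (dx, dy) since the last seen position.

        If (x, y) lands within `margin` pixels of any edge, treat the
        cursor as recentred; on macOS this is where you would warp it
        back (e.g. with CGDisplayMoveCursorToPoint).
        """
        dx, dy = x - self.last[0], y - self.last[1]
        near_edge = (x < self.margin or x > self.w - self.margin or
                     y < self.margin or y > self.h - self.margin)
        if near_edge:
            # Warp the real cursor to the center here, then track from it.
            self.last = (self.w / 2, self.h / 2)
        else:
            self.last = (x, y)
        return dx, dy
```

The delta stream from `update()` is what a relative-movement interface consumes, while the recentering keeps the invisible cursor away from screen boundaries.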
Useful APIs for this include:
- CGEventTap (see the Quartz Event Services Reference)
- CGPostEvent
- CGDisplayMoveCursorToPoint (see the Quartz Display Services Reference)
Upvotes: 1