Reputation:
I have a dual-head setup (VGA output plus DVI or HDMI output) from one PC, configured with xrandr --output VGA1 --left-of LVDS1. Each screen has a resolution of 1024x768.
When I query the screen size in Java:
screen = Toolkit.getDefaultToolkit().getScreenSize();
I get the width of both screens combined, so the reported width and height cover the whole virtual desktop. But I only need to place my application on one screen, either the VGA or the DVI output. How do I find that out in Java?
How do I get that from Toolkit.getDefaultToolkit()?
Example (my application has to run on the screen labeled Java):
Test.java
import java.awt.DisplayMode;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

public class Test {
    public static void main(String[] a) throws Exception {
        GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
        GraphicsDevice[] screenDevices = ge.getScreenDevices();
        for (int i = 0; i < screenDevices.length; i++) {
            // Print each screen's ID and the dimensions of its current display mode
            System.out.println(screenDevices[i].getIDstring());
            DisplayMode dm = screenDevices[i].getDisplayMode();
            int screenWidth = dm.getWidth();
            int screenHeight = dm.getHeight();
            System.out.println("Cake: " + screenWidth + " " + screenHeight);
        }
    }
}
Output:
:0.0
Cake: 1024 768
:0.1
Cake: 1024 768
Upvotes: 1
Views: 796
Reputation: 2028
I think
GraphicsEnvironment.getLocalGraphicsEnvironment().getScreenDevices()
is what you are looking for. From there you can iterate over the screens and get each screen's dimensions via getDisplayMode().
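To actually place the application on one specific monitor, each GraphicsDevice's default GraphicsConfiguration exposes getBounds(), which gives that screen's size plus its offset within the virtual desktop. Here is a minimal sketch of that idea; the class name SingleScreenFrame and the choice of screens[1] as the target are just for illustration:
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.Rectangle;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

public class SingleScreenFrame {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
            GraphicsDevice[] screens = ge.getScreenDevices();

            // Pick the screen you want, e.g. the second one (:0.1) if present,
            // otherwise fall back to the default device.
            GraphicsDevice target = screens.length > 1 ? screens[1] : ge.getDefaultScreenDevice();

            // getBounds() includes the screen's offset in the virtual desktop,
            // so the frame lands on that monitor instead of spanning both.
            Rectangle bounds = target.getDefaultConfiguration().getBounds();

            JFrame frame = new JFrame("On screen " + target.getIDstring());
            frame.setBounds(bounds); // fill exactly that one monitor
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
Using the bounds of a single device instead of Toolkit.getDefaultToolkit().getScreenSize() avoids the combined 2048x768 virtual size you are seeing.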
Upvotes: 2