Monitor resolutions get bigger and bigger, but even at 1920x1200 we still need to use antialiasing.
My question is: at what point do pixels get so small that antialiasing becomes pointless? Which resolution would be the borderline?
Well, this also depends on the size of the display, IMHO. On a 10-inch tablet at 1920x1080, an image will look "smoother" than on a 27" monitor at the same resolution.
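To put rough numbers on that point: a quick sketch (plain Python, function names are my own) comparing the pixel density of the two example displays mentioned above.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from the resolution and the diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

tablet = ppi(1920, 1080, 10)    # 10" tablet:  ~220 PPI
monitor = ppi(1920, 1080, 27)   # 27" monitor: ~82 PPI
```

Same pixel count, but the tablet packs pixels almost three times as densely, which is why its image looks smoother.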
Best regards,
Cybermonkey
Let's assume it is a 22" one.
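For a 22" panel you can make a back-of-the-envelope estimate of the "borderline" resolution. This sketch assumes normal visual acuity of roughly 1 arcminute per pixel and a typical desktop viewing distance of about 24 inches (both assumptions are mine, not from the thread); in practice, high-contrast edges can still show aliasing below that threshold.

```python
import math

def borderline_ppi(viewing_distance_in, arcmin_per_pixel=1.0):
    """PPI at which a single pixel subtends the given visual angle."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(arcmin_per_pixel / 60))
    return 1 / pixel_pitch_in

def resolution_for(diagonal_in, aspect_w, aspect_h, target_ppi):
    """Width/height in pixels for a given diagonal, aspect ratio and PPI."""
    diag_units = math.hypot(aspect_w, aspect_h)
    diag_px = target_ppi * diagonal_in
    return (round(diag_px * aspect_w / diag_units),
            round(diag_px * aspect_h / diag_units))

needed_ppi = borderline_ppi(24)                    # ~143 PPI at arm's length
w, h = resolution_for(22, 16, 10, needed_ppi)      # roughly 2670 x 1670 on a 22" 16:10 panel
```

So under those assumptions, a 22" 16:10 monitor would need something in the ballpark of 2670x1670 before individual pixels stop being resolvable at normal viewing distance.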
Well, when working with graphics for your games, you should probably consider anti-aliasing part of the texture data rather than a drawing technique. Any scaling on the GPU will display your texture with what you might consider anti-aliasing, because of the filtering applied when the original image is scaled to the screen.
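The smoothing the poster above describes comes from the GPU's texture filter blending neighbouring texels. A toy 1-D illustration of bilinear filtering (my own sketch, not real GPU code) shows how a hard edge in the texture becomes a blended value when sampled between texels:

```python
def bilinear_sample(texels, u):
    """Sample a 1-D row of texels at normalized coordinate u in [0, 1],
    linearly interpolating between the two nearest texels."""
    x = u * (len(texels) - 1)
    i = int(x)
    j = min(i + 1, len(texels) - 1)
    frac = x - i
    return texels[i] * (1 - frac) + texels[j] * frac

# A hard black/white edge in the source texture...
texture = [0.0, 0.0, 1.0, 1.0]
# ...sampled between texels yields a blended grey, not a hard step.
blended = bilinear_sample(texture, 0.5)   # → 0.5
```

Real GPUs do the same thing in 2-D (plus mipmapping when minifying), which is why a scaled texture already looks "anti-aliased" compared with its source pixels.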
What type of graphics/textures and usage are we talking about here?
I haven't really noticed much development in the pixels-per-inch ratio of PC monitors; instead, I think screens have simply gotten bigger. Too dense a ratio on a small screen can make the OS a little unusable unless all the software is designed with that in mind: fonts and icons need to scale up, etc.
I think the only screens where a jump in pixels per inch has had such an impact are the newer MacBooks with Retina displays. That's something not even iMacs have, though a 27" Retina display would be a bloody monster... and the next iMac I'd upgrade to! But I predict that's still some time away.
The need to calculate the density and DPI of a screen still applies mostly to mobile devices, not so much to desktop/laptop monitors. I don't think those values are even exposed at the software level, are they?