Monitor resolutions get bigger and bigger, but even at 1920x1200 we still need to use antialiasing.
My question is: when will pixels get so small that antialiasing becomes pointless? Which resolution would be the borderline?
Well, this also depends on the size of the display, IMHO. On a 10-inch tablet at 1920x1080, an image would look "smoother" than on a 27" monitor at the same resolution.
Best regards,
Cybermonkey
Let's assume it's a 22" one.
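To put rough numbers on it: pixel density follows directly from the resolution and diagonal size, and a common rule of thumb is that pixels stop being individually resolvable at about 60 pixels per degree of visual angle for 20/20 vision. A quick sketch (the 60 px/degree threshold and the 24" viewing distance are my assumptions, not from this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixel density from resolution and diagonal panel size.
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_in

def acuity_ppi(viewing_distance_in, px_per_degree=60.0):
    # PPI needed so one pixel subtends 1/px_per_degree of a degree
    # at the given viewing distance (60 px/deg is a rule-of-thumb
    # figure for 20/20 acuity, not a hard limit).
    return px_per_degree * 180.0 / (math.pi * viewing_distance_in)

print(ppi(1920, 1200, 22.0))   # ~103 ppi for a 22" 1920x1200 panel
print(acuity_ppi(24.0))        # ~143 ppi "needed" at ~24" (60 cm)
```

By this back-of-the-envelope estimate, a 22" desktop panel would need well above 1920x1200 before aliasing stops being visible at typical viewing distance.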
Well, when working with graphics for your games, you should probably consider anti-aliasing part of the texture data rather than a drawing technique. Any scaling on the GPU will display your texture with what you might consider anti-aliasing, because of the filtering applied when the original image is scaled to the screen.
What type of graphics/textures and usage are we talking about here?
I haven't really noticed development in the pixels-per-inch ratio on PC monitors; I think it's only the screens that have gotten bigger. Too dense a ratio on a small screen can make the OS a little unusable, unless all software is designed with that in mind: fonts and icons need to scale up, etc.
I think the only screens where the jump in pixels per inch has had such an impact are the newer MacBooks with Retina displays. Something not even iMacs have, though a 27" Retina display would be a bloody monster... and the next iMac that I'd upgrade to! But I predict that's some time away.
The need to calculate the density and DPI of a screen still remains on mobile, not so much on desktop/laptop monitors. I don't think these values are exposed at the software level anyhow, are they?
Yes they are! You can get the screen DPI using:

Code:
DPI := Screen.PixelsPerInch

NOTE: PixelsPerInch is only accurate for the vertical measurement. To get both vertical and horizontal values you actually need to calculate them yourself from the screen resolution, so it is a bit of work.
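For what it's worth, once you know the panel's physical dimensions you can work out the horizontal and vertical densities separately. A minimal sketch of the arithmetic (the 22" 16:10 panel dimensions below are an assumed example, not values read from any API):

```python
def axis_ppi(pixels, physical_inches):
    # Per-axis pixel density: pixels along an axis divided by the
    # physical length of that axis.
    return pixels / physical_inches

# A 22" 16:10 panel is roughly 18.7" wide by 11.7" tall (assumption):
print(axis_ppi(1920, 18.7))  # ~103 ppi horizontal
print(axis_ppi(1200, 11.7))  # ~103 ppi vertical
```

On a panel with square pixels the two values come out (nearly) identical, which is why a single PixelsPerInch figure is usually good enough in practice.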
Well, Asus recently announced 4K PC monitors (at 113 ppi), though I don't think that'll become a standard soon. Most GPUs aren't capable of driving such big resolutions, and the price tag is hefty too.
As for anti-aliasing: I don't think it'll go away anytime soon. Aliasing can occur even at very high resolutions, and I don't see why GPU vendors would want to remove AA from their hardware.
I'm not talking about removing AA from hardware - that would be a disaster for people with older monitors - but at some point AA won't actually be needed anymore.
Let's face it: AA, especially at high levels like 4x or 8x, is one of those factors that, while making everything look smoother, takes away some precious FPS.
High resolution takes away some FPS too, and eats memory, especially for screen buffers. Perhaps wait 10 more years for tech improvements and we might see some serious pixel-size reductions. Ideally it would be just a nanoscale surface that changes molecular properties, or something...