Omnifarious wrote:
Please, please stop measuring things, anything, in device-dependent coordinates. Please stop using pixels. IMHO, nothing should ever surface pixels as a coordinate system, ever!
The use of pixels has caused enormous amounts of pain as people switch to 4K displays, and it will cause enormous amounts of pain again when display technologies change next. Please stop using them. If you must, surface a 'DPI' configuration or something similar, and have it operate at a very low level, so that the coordinates used in almost all of the code do not depend on details of the display like pixels.
We've known not to use pixels for literally decades, and yet people keep using them. Please stop.
Other than that, I like the new UI.
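To make that concrete: below is a minimal sketch of the kind of low-level unit boundary Omnifarious is asking for (all names here are hypothetical, not any real toolkit's API). UI code speaks only physical units; one conversion function owned by the renderer is the only place pixels appear.

```typescript
/** Physical length in millimetres -- the only unit UI code handles. */
type Millimetres = number;

/** What the conversion layer needs to know about the output device. */
interface DisplayInfo {
  pixelsPerMm: number; // derived from the display's reported DPI
}

/** The single place where millimetres become device pixels. */
function toDevicePixels(length: Millimetres, display: DisplayInfo): number {
  return Math.round(length * display.pixelsPerMm);
}

// UI code is written once, in physical units...
const buttonWidth: Millimetres = 12;

// ...and renders at the right size on any density without changes:
const lowDpi: DisplayInfo = { pixelsPerMm: 96 / 25.4 };  // ~96 DPI desktop
const hiDpi: DisplayInfo = { pixelsPerMm: 254 / 25.4 };  // ~254 DPI laptop
console.log(toDevicePixels(buttonWidth, lowDpi)); // 45
console.log(toDevicePixels(buttonWidth, hiDpi));  // 120
```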
bobingabout wrote:
The units on the screen are measured in pixels. If we're not allowed to use pixels, what are we supposed to use?
If you have 200% screen scaling turned on, that's not our problem; the OS itself SHOULD account for that (even the scaling in the game accounts for it).
DPI? Okay, say you have a 24-inch 4K screen, or a 52-inch 4K screen. How many inches should a button take up? (Also, seriously, inches? I know DPI is a common term, and I'm not sure what the metric equivalent would be, but nobody uses inches anymore.) Honestly, that statement is like saying "I want this icon to be as big as a fish." How big is a fish? Is it one of those teeny tiny ones a few millimetres long, or one of the giant four-footers you find out in the deep ocean? The computer has no way of knowing (nor should it care about) the physical size of your screen, but it does know one thing: how many pixels across by how many pixels down. Why shouldn't it use pixels to measure things?
And don't blame software engineers for screens having a higher pixel count. This is one reason why my latest screen has a resolution of 2560x1440 and not 3840x2160: especially with my ageing eyes, I wouldn't have been able to see detail that fine. If you can't even see the increased pixel density, don't buy it!
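For what it's worth, the two positions aren't far apart mechanically. Here is a rough sketch of what "200% screen scaling" does (simplified; real compositors also juggle per-monitor factors and rounding):

```typescript
// Simplified model of OS display scaling: the application lays out in
// logical pixels, and the compositor multiplies by the user's scale factor.
const scaleFactor = 2.0; // "200% scaling" in the OS display settings

function logicalToDevicePx(logical: number): number {
  return Math.round(logical * scaleFactor);
}

// A 50-logical-pixel button covers 100 device pixels at 200%, so it keeps
// roughly the same apparent size when the pixel density doubles.
console.log(logicalToDevicePx(50)); // 100
```

The catch, and the point of the complaint above, is that the scale factor is still anchored to pixels rather than to a physical length.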
The reason the transition to higher-density displays has been such a cluster is that devs have outright refused to use a device-independent coordinate system. UI systems should have converted to real-world units a long time ago. I honestly couldn't care less what system people use; measure in millimetres or thous, whatever makes you happy. For now DPI/PPI is the industry standard, and you'll have as much luck holding back that ocean as convincing people to start saying "mebibytes."

Displays have been capable of reporting their physical dimensions for a long time now, and UI systems should have adopted them a decade ago: "this button should be 1.2 cm wide", not "this button should be 50 pixels wide". If that had been made standard, the only thing a 4K display would change is fidelity, not UI element size.
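As a sketch of what that looks like in practice, assuming the platform exposed the physical width the display already reports over EDID (the `ReportedDisplay` fields below are hypothetical; no standard API hands this to applications today). It also answers the 24-inch-versus-52-inch question above: the same 1.2 cm button simply resolves to a different pixel count on each panel.

```typescript
// Hypothetical display descriptor built from EDID-reported dimensions.
interface ReportedDisplay {
  widthPx: number; // horizontal resolution
  widthMm: number; // physical width reported by the display
}

function cmToPixels(cm: number, d: ReportedDisplay): number {
  const pxPerMm = d.widthPx / d.widthMm;
  return Math.round(cm * 10 * pxPerMm);
}

// The same 1.2 cm button on two very different 4K panels:
const fourK24inch: ReportedDisplay = { widthPx: 3840, widthMm: 531 };
const fourK52inch: ReportedDisplay = { widthPx: 3840, widthMm: 1151 };
console.log(cmToPixels(1.2, fourK24inch)); // 87 device pixels
console.log(cmToPixels(1.2, fourK52inch)); // 40 device pixels
```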
Of course, we're still dealing with rectangular pixel layouts on PC monitors, but phones have already moved on to things like PenTile layouts, where the concept of a pixel is nebulous. Using a real-world metric for UI element measurements is the surest way to future-proof an application. Who knows what direction displays may take, or what a pixel even means in something like VR.
That said, Factorio isn't really in a position to start this trend. It needs to start at the OS and work its way into standard widget libraries and APIs like DirectX first. Funnily enough, CSS is ahead of the curve here, probably because of how many different kinds of systems it has to work with. Omnifarious is 100% correct: UI design should have abandoned pixels a long time ago. How many pixels fit in a millimetre of space is something only the rasterizer needs to deal with, assuming your display device even uses a rasterizer.
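For example, CSS will happily accept a physical length, with one caveat: on screen media CSS pins 1in to exactly 96 CSS pixels, so its cm is a device-independent unit rather than a ruler-accurate one. A small illustration, runnable in any browser:

```typescript
// Style a button in centimetres; the browser, not the application,
// resolves this to device pixels using zoom and devicePixelRatio.
const button = document.createElement("button");
button.textContent = "Confirm";
button.style.width = "1.2cm";  // no pixel count anywhere in the UI code
button.style.height = "0.6cm";
document.body.appendChild(button);

console.log(window.devicePixelRatio); // e.g. 2 on a high-density display
```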
All that aside, I'm loving the new UI design, and the new office.
:)