Post by Chicago Jake on Apr 17, 2009 16:39:24 GMT -6
Hi, computer mavens,
I'm thinking about getting a new LCD monitor for my old Dell desktop machine. Partly to save space, and partly to ramp up the coolness factor.
What I'm curious about is the resolution. My understanding is that LCD monitors have a native resolution, and being fixed-pixel devices (unlike CRTs), they only display in that resolution.
However, my old video card only has certain resolutions available, most of which are considerably smaller numbers than the native resolution of most LCDs on the market. The max that my video card can output is 1280x1024.
So my question becomes: Do I need to find an LCD monitor that matches one of the resolutions of my video card? Or can the monitor up-convert (like an HDTV) whatever video resolution I throw at it? Or am I just so clueless that I am asking the wrong questions in the first place?
Thanks!.......Jake
P.S. - I should add that the machine currently has an NVIDIA GeForce2 MX video card, which I would not be opposed to upgrading if that enhanced my monitor options.
Post by Exildo Wonsetler Briggs III on Apr 17, 2009 21:43:39 GMT -6
Jake, I don't know if it helps, but I have a 24 inch LCD monitor running off an NVIDIA GeForce 8800 GTX card at 1280 x 1024. The card will do much higher resolutions, but things begin to get too small, and I like the current resolution just fine.
Gordon I'm sure will pitch in as he's using some big fucker for a monitor, like 35 inches or something . . . . . maybe compensating for a small dick, though I'm not sure. ;D ;D
(Just kidding, Gordon!)
Used to be we worried about the dot pitch on CRT monitors, but I can't remember how that compares on LCDs. The 24 inch LCD I have (a Dell) works just fine for me, and I look at HD video all the time and it's as crisp as can be (at 720p).
........Bob
Post by Hank on Apr 18, 2009 4:09:49 GMT -6
Jake
I'm by no means an expert in anything to do with computers, but my old CRT took a dump recently and I ended up with a nice LCD.
I started out by setting my res at 800 x 600 to get the type large enough to read, but I was always scrolling to the right to pick up the other half of the screen, or in some cases just flat out losing half the screen altogether. Through trial and error I've since learned that I can set my res at 1024 x 768 and get everything on the screen, and to make the type large enough to read I increased the DPI from 96 to 120. This gives me a nice, crisp, clear screen with type large enough to be easily readable.
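The trade-off in that setup can be sketched in a few lines of Python (the helper name is made up for illustration; the 96 and 120 figures are the standard Windows DPI settings mentioned above):

```python
def text_scale(new_dpi, base_dpi=96):
    """Factor by which Windows renders text larger at a raised DPI setting."""
    return new_dpi / base_dpi

# The 96 -> 120 DPI change makes text 1.25x (25%) larger...
print(f"text scale: {text_scale(120):.2f}x")

# ...while keeping the full 1024x768 desktop, instead of shrinking the
# desktop to 800x600 and giving up 1024/800 = 1.28x of the horizontal pixels.
print(f"horizontal pixels kept vs 800x600: {1024 / 800:.2f}x")
```

In other words, raising the DPI enlarges the type without throwing away screen real estate the way a lower resolution does.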
If there is a better way, I'm hoping someone else chimes in!
Hank
Modified to correct a typo.
Post by Ardbeg... innit on Apr 18, 2009 6:18:47 GMT -6
So right you are on all counts Bob!! ;D Just got a tape measure out and sure enough the inverse relation between dick size and monitor size seems to hold up. I have the same video card as Bob, except I have two (there I go again), so that I can run a second monitor to my alternate workstation when needed.
Hank's numbers got me to do some measuring of the monitor. Mine is 29" diagonal (about 25 x 16 for the actual viewable area), and I have the resolution set to the max of 2560x1600... roughly 100 pixels per inch, comparable to Hank's. I have played around with other (lower) resolutions but keep coming back to that, partly because of the crispness of the image, and given that I usually work with my eyes about 20-24" from the monitor, things just look right.
Jake, to test what your current card might look like on a larger monitor: take the viewable area dimensions of the monitor you are thinking of getting, proportion that to your current monitor, reset (dumb down) the resolution on the current monitor to approximately the same number of pixels per inch, and then work on it for a while. You may have to put up with some scrolling to get around the screen, BUT you will get a handle on the aesthetics and whether the decreased sharpness of the image is something you can live with.
Edited: I see I am straying away from your question... any new monitor should be able to handle your video card's output. When the monitor is plugged in, it will either display the card's current setting (scaling it up to fill the panel) or the video will get dumbed down to the max the monitor can handle (most likely the former in this case).
The real issue is, unless you plan on upgrading your video card, why waste the money on a monitor that can handle higher resolution? The optimum financial answer is to match the monitor's resolution to the video card, and in this case I doubt that anyone is making LCD monitors with lower resolutions than your card is capable of.
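For what it's worth, the scaling penalty of running below native is easy to put numbers on. A quick sketch, assuming a hypothetical monitor with a 1920x1200 native panel being fed the card's 1280x1024 maximum:

```python
def scale_factors(src, native):
    """Per-axis stretch applied when a fixed-pixel panel upscales a
    non-native input to fill its grid."""
    return native[0] / src[0], native[1] / src[1]

sx, sy = scale_factors((1280, 1024), (1920, 1200))
print(f"horizontal x{sx:.3f}, vertical x{sy:.3f}")  # x1.500, x1.172
# Non-integer stretch factors mean each source pixel gets interpolated
# across neighbouring LCD pixels, which is what softens text at non-native
# resolutions. Unequal factors also distort the 5:4 image toward 16:10
# unless the monitor letterboxes it instead.
```

So even when the monitor happily "handles" the signal, a mismatched mode can look noticeably softer than native.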
Post by Chicago Jake on Aug 16, 2009 12:17:19 GMT -6
Sorry to necro-post, but I just realized I never did close the loop on this thread. I'm doing so now mostly for my own archiving purposes, in case I need to look it up again someday.
Basically, I was wrong to judge my options by the resolution list in my "settings" area. As Gordon alluded to in his post, the video card and the monitor "talk" to each other (the monitor reports what it supports over its data channel), and Windows only shows you the options that are available for your CURRENT MONITOR AND VIDEO CARD combination. So I could have gone and gotten any monitor and it would have worked fine.
In the end, I decided not to bother with a new monitor at all. I relegated the desktop machine to auxiliary status only, and use the laptop (a much newer and faster machine) as my primary these days.
Thanks, all!........Jake