VGA (D-Sub) connectors are found on older monitors, and these cables cannot deliver the 144 Hz refresh rate you're after: these older connectors top out at around 75 Hz, at resolutions up to 1920×1200.
Even on short cables, VGA is rarely as sharp as a digital link. Unlike digital standards, VGA copes with reduced bandwidth in its own way: most hardware will automatically lower the screen resolution to preserve video fidelity.
When it comes to VGA vs HDMI, HDMI is much better than VGA for a number of reasons. Not only is HDMI capable of transferring more data (which translates into higher resolutions and higher frame rates), but it can also carry audio. In short, HDMI delivers a much clearer image.
VGA can handle 1080p, but for anything better you need a DisplayPort or HDMI cable. Because VGA is an analog specification, its maximum is only theoretical: you would need to match the hardware (video card and monitor) to the same specifications to find a common maximum.
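The "common maximum" reasoning above can be sketched as a simple set intersection. A hypothetical example, with illustrative mode lists (not real EDID data):

```python
# Hypothetical mode lists as (width, height, refresh_hz) tuples.
# Over an analog link, the usable maximum is whatever BOTH ends support.
card_modes = {(1920, 1080, 60), (1920, 1200, 60), (2048, 1536, 75)}
monitor_modes = {(1600, 1200, 75), (1920, 1080, 60), (1920, 1200, 60)}

common = card_modes & monitor_modes

# "Best" common mode: most pixels first, then highest refresh rate.
best = max(common, key=lambda m: (m[0] * m[1], m[2]))
print(best)  # (1920, 1200, 60)
```

In practice the operating system does this negotiation for you, but the principle is the same: neither side can exceed what the other advertises.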
VGA doesn't necessarily mean that the monitor is a poor choice for gaming. VGA used to be the dominant connection between a computer and monitor, and there are many older monitors that have a VGA input as well as more modern digital interfaces like DVI, DisplayPort, or HDMI.
While the original VGA standard had a maximum resolution of 640×480, modern VGA cables are capable of 1080p and higher resolutions.
| Resolution | HDMI 1.0–1.2 Maximum Refresh Frequency* | HDMI 1.3–1.4 Maximum Refresh Frequency* |
|---|---|---|
| 2K: 1920 × 1080 (16:9) | 60 Hz | 144 Hz |
| 2K: 1920 × 1200 (16:10) | 60 Hz | 120 Hz |
| 2.5K: 2560 × 1080 (≈21:9) | 50 Hz | 100 Hz |
| 2.5K: 2560 × 1440 (16:9) | 30 Hz | 85 Hz |
| 2.5K: 2560 × 1600 (16:10) | 30 Hz | 75 Hz |
| 3.5K: 3440 × 1440 (≈21:9) | 30 Hz | 60 Hz |
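The gap between the two columns comes down to TMDS clock limits: HDMI 1.0–1.2 tops out at a 165 MHz pixel clock, while 1.3–1.4 raises that to 340 MHz. A rough sanity check (the ~12% blanking overhead is an assumption, approximating reduced-blanking timings):

```python
# Published maximum TMDS pixel clocks, in MHz.
TMDS_LIMIT_MHZ = {"HDMI 1.0-1.2": 165, "HDMI 1.3-1.4": 340}

# Assumed ~12% blanking overhead (roughly CVT reduced blanking).
BLANKING = 1.12

def can_drive(version, width, height, refresh_hz):
    """True if the estimated pixel clock fits the version's TMDS limit."""
    clock_mhz = width * height * refresh_hz * BLANKING / 1e6
    return clock_mhz <= TMDS_LIMIT_MHZ[version]

print(can_drive("HDMI 1.0-1.2", 1920, 1080, 144))  # False: ~334 MHz > 165
print(can_drive("HDMI 1.3-1.4", 1920, 1080, 144))  # True:  ~334 MHz <= 340
print(can_drive("HDMI 1.0-1.2", 1920, 1080, 60))   # True:  ~139 MHz <= 165
```

Exact cutoffs depend on the real blanking intervals a mode uses, so treat this as an estimate, not a spec lookup.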
VGA is a video connector of the past and offers little advantage now. The original VGA standard's maximum refresh rate (and thus the maximum FPS it can show on screen) is 60 Hz, at a maximum resolution of 640×480.
VGA: 150 feet (regular); 650 feet (with an extender). VGA is an analog signal and gets weaker over longer distances. For high-quality video, the maximum recommended distance is 25 feet.
It ensures backward compatibility with older computers that use legacy graphics cards. That said, most newer mainstream monitors now use DVI-D, DisplayPort, or HDMI exclusively; the VGA connector is mostly found on cheaper, more accessible models geared toward office or home use.
No. VGA is an old analog standard and you should not use it (I doubt any 1440p monitors even have a VGA input on the back). Use DisplayPort or HDMI to get 60 Hz at 2560×1440.
As such, the VGA signal will be the limiting factor of the final output. In other words, converting VGA to HDMI will not improve the quality of the original signal. Similarly, converting HDMI to VGA will likely entail a small loss of signal quality.
HDMI 2.0 is also fairly standard and can be used for 240Hz at 1080p, 144Hz at 1440p and 60Hz at 4K. The latest HDMI 2.1 adds native support for 120Hz at 4K UHD and 60Hz at 8K.
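Those figures line up with the link bandwidth each version provides: HDMI 2.0 carries 18 Gbit/s (about 14.4 Gbit/s effective after 8b/10b encoding) and HDMI 2.1 carries 48 Gbit/s (about 42.7 Gbit/s effective after 16b/18b). A rough check, assuming 24-bit color and ~12% blanking overhead (both assumptions):

```python
# Effective link bandwidths in Gbit/s, after line-code overhead.
EFFECTIVE_GBPS = {"HDMI 2.0": 18 * 8 / 10, "HDMI 2.1": 48 * 16 / 18}

BITS_PER_PIXEL = 24   # assumed 8-bit RGB, no chroma subsampling
BLANKING = 1.12       # assumed ~12% blanking overhead

def fits(version, width, height, refresh_hz):
    """True if the estimated raw video rate fits the link's effective rate."""
    gbps = width * height * refresh_hz * BITS_PER_PIXEL * BLANKING / 1e9
    return gbps <= EFFECTIVE_GBPS[version]

print(fits("HDMI 2.0", 3840, 2160, 60))   # True:  ~13.4 Gbit/s
print(fits("HDMI 2.0", 3840, 2160, 120))  # False: ~26.8 Gbit/s
print(fits("HDMI 2.1", 3840, 2160, 120))  # True
```

Note that the very highest modes (such as 8K) may additionally rely on chroma subsampling or Display Stream Compression, which this uncompressed estimate ignores.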
VGA stands for Video Graphics Array. A VGA cable transfers video signals, acting as the link between a computer and a monitor, or between a computer and a television screen. VGA connectors come in two types: male and female.
If a monitor states it can do 75 Hz at 1080p over HDMI, then it can do it. There is also a common misconception that HDMI can only do 60 Hz at 1080p, but in truth it depends on your monitor and on the cable you're using.
You should be able to get 75 Hz if the monitor supports Dual-Link DVI signals, but that's up to the monitor manufacturer and isn't typically specified, so there's really no way to know for sure. If it doesn't, it'll be limited to 60 Hz.
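The 60 Hz vs 75 Hz cutoff is a bandwidth question: single-link DVI is limited to a 165 MHz pixel clock, and dual-link doubles that by adding a second set of TMDS pairs. A quick estimate (the ~12% blanking overhead is an assumption):

```python
SINGLE_LINK_MHZ = 165        # single-link DVI TMDS clock limit
DUAL_LINK_MHZ = 2 * 165      # dual-link adds a second TMDS link

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    # blanking=1.12 assumes ~12% overhead, roughly reduced-blanking timing
    return width * height * refresh_hz * blanking / 1e6

clk_75 = pixel_clock_mhz(1920, 1080, 75)  # ~174 MHz
print(clk_75 <= SINGLE_LINK_MHZ)  # False: 1080p75 needs dual-link
print(clk_75 <= DUAL_LINK_MHZ)    # True
print(pixel_clock_mhz(1920, 1080, 60) <= SINGLE_LINK_MHZ)  # True
```

So 1080p at 60 Hz fits comfortably in a single link, while 75 Hz lands just over the single-link limit, which is why dual-link support decides the outcome.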
DVI stands for Digital Visual Interface. DVI cables are used to connect a video signal from computers to LCD monitors, HDTV displays, projectors, and cable boxes.