
GTX 970 Graphics Card - Why does the spare DVI output not work at the same time as using the DVI-D output?

"GTX 970 Graphics Card - Why does the spare DVI output not work at the same time as using the DVI-D output?"

It depends on the manufacturer. When you use the DVI-D output, the card may route both link channels through that one DVI-D connector. In that case you could just get a DVI-D to dual individual DVI splitter cable.

"I've just upgraded to this card & thought using the 2nd output would let me extend the monitor as i like to have it plugged into the TV so i can do use that similtanously."

Sure.

"Worked fine using the old card, but for some reason this will only extend monitors if you use the spare HDMI, which i do not have a cable for :("

They make an HDMI to DVI cable (or adapter) also.

"The 2nd output works if you take the 1st DVI out but this is just problematic, anyone had this issue or know why i can't extend my monitor to it."

The manufacturer is a ninny? What brand?

I learned to hate PNY, because they could never seem to implement an HDMI port very well... it turned out I had a bad/marginal HDMI cable...

HDMI won't fit on Graphics card??!!! HELP?

The GTX 680 comes with
(1) HDMI,
(1) DisplayPort
(2) DVI ports
...as standard... unless the manufacturer (ASUS, EVGA, MSI, etc.) decides to alter it.

Are you SURE you are trying to connect it to the HDMI port and NOT the DISPLAYPORT?! They seem similar but are two very DIFFERENT things.

If you are still having trouble, do you mind taking a picture of your HDMI cable and a picture of which port you are trying to connect to?

HDMI male / female should just connect effortlessly...

Can I connect the HDMI output of my PC's graphics card to my TV's HDMI ARC port? Will this connection be like normal HDMI?

You can, but I would not recommend it if you have a receiver that supports ARC; in that case the receiver should connect to the ARC-enabled HDMI input on the TV. If you connect your computer to a different HDMI input on the TV, the ARC connection can then be used to send the audio to the receiver.

That matters because video processing in the TV delays the video, and the sound needs to be similarly delayed, which the TV does handle. If you bypass the TV for the audio (as is possible with a set-top box), you wind up with audio/video sync problems.

(I should also mention that there are other ways to route the delayed audio from a TV to a receiver. But if you have a receiver that implements ARC, you do want to connect it to the ARC-enabled HDMI input on the TV.)
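For intuition, here is a minimal Python sketch of that sync problem. The latency figures are illustrative assumptions only; real TV processing delay varies by model and picture mode.

```python
# Illustrative sketch: why audio routed around the TV drifts out of sync.
# Both latency figures below are assumptions for illustration, not measurements.

VIDEO_PROCESSING_MS = 80.0  # assumed TV video pipeline delay (scaling, smoothing)
DIRECT_AUDIO_MS = 5.0       # assumed latency of audio sent straight to a receiver

def required_audio_delay(video_latency_ms: float, audio_latency_ms: float) -> float:
    """Extra delay the receiver must add so sound matches the displayed frame."""
    return video_latency_ms - audio_latency_ms

skew = required_audio_delay(VIDEO_PROCESSING_MS, DIRECT_AUDIO_MS)
print(f"Audio leads video by {skew:.0f} ms unless the receiver delays it.")
# Skews of a few tens of milliseconds are already noticeable as lip-sync error;
# ARC avoids this because the TV hands the receiver audio it has already delayed.
```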

What would be a good graphics card to supply output for six monitors, not being used for gaming?

Well, I don't know about pricing or your monitors, but DisplayPort is the only way you're going to make that happen without shelling out big bucks on more than one GPU. An RX 480 4GB or GTX 1060 3GB should suffice for graphical capability, and neither is too pricey if you end up needing a second one.

DisplayPort 1.2 allows you to run two or more monitors off a single DisplayPort socket. So as long as the monitors and your GPU support DisplayPort, you should be fine with anything newer than a GTX 900 series or RX 300 series card; the rough bandwidth math is sketched below. However, if you use HDMI, I would recommend either dual GTX 1050s or dual RX 460s, as a single card will not have enough ports to support your monitor needs.

Hope this helps!
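As a rough sanity check on the DisplayPort claim, here is a small Python sketch of the DP 1.2 bandwidth budget. The lane rate and 8b/10b efficiency are the published DP 1.2 numbers; the blanking-overhead factor is an approximation, and real MST hubs typically expose at most four streams per port.

```python
# Back-of-the-envelope: how many 1080p60 monitors fit on one DisplayPort 1.2
# connection via MST (multi-stream transport).

LANES = 4
HBR2_GBPS_PER_LANE = 5.4      # raw bit rate per lane (HBR2)
ENCODING_EFFICIENCY = 0.8     # 8b/10b line coding overhead

usable_gbps = LANES * HBR2_GBPS_PER_LANE * ENCODING_EFFICIENCY  # ~17.28 Gb/s

width, height, hz, bpp = 1920, 1080, 60, 24
BLANKING_OVERHEAD = 1.12      # rough allowance for reduced-blanking timings

stream_gbps = width * height * hz * bpp * BLANKING_OVERHEAD / 1e9  # ~3.3 Gb/s

print(f"usable: {usable_gbps:.2f} Gb/s, per 1080p60 stream: {stream_gbps:.2f} Gb/s")
print(f"streams that fit: {int(usable_gbps // stream_gbps)}")  # ~5 in theory
# MST hubs usually cap at 4 streams per port, so two DP outputs (or DP plus
# the card's other outputs) comfortably cover six 1080p monitors.
```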

Will a video card with an HDMI 2.0 output work with a 1080p TV/monitor with only an HDMI 1.4 input?

Revisions to HDMI don't change the basic functionality; they add new capability as the tech improves. HDMI 2.0 is a superset of 1.4, so it remains fully backward compatible. But, needless to say, a 1.4 cable will not do if you need capability found only in 2.0. The only real difference between 2.0 and 1.4 is that the former can do 4K at 60 fps (plus associated audio and color-channel improvements) and the latter can't. In short, yes: your card's HDMI 2.0 output will drive a 1080p display over an HDMI 1.4 input just fine.
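To make that concrete, here is a quick Python check of common video modes against each revision's TMDS clock ceiling (the usual shorthand for its bandwidth limit).

```python
# 1080p60 is comfortably inside HDMI 1.4's limits; 4K60 is what needs HDMI 2.0.
# TMDS clock ceilings are the published spec values; pixel clocks are the
# standard CTA timings for each mode (blanking included).

HDMI_1_4_MAX_TMDS_MHZ = 340   # ~10.2 Gb/s across the three data channels
HDMI_2_0_MAX_TMDS_MHZ = 600   # ~18 Gb/s

MODES_MHZ = {
    "1080p60": 148.5,
    "2160p30": 297.0,
    "2160p60": 594.0,
}

for name, clock in MODES_MHZ.items():
    ok_14 = "yes" if clock <= HDMI_1_4_MAX_TMDS_MHZ else "no"
    ok_20 = "yes" if clock <= HDMI_2_0_MAX_TMDS_MHZ else "no"
    print(f"{name}: HDMI 1.4 {ok_14}, HDMI 2.0 {ok_20}")
```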

Can a DDR2 Graphics Card work in a DDR1 motherboard?

Hi Gopakumar,

Don't get graphics memory confused with motherboard memory; they are totally separate and different things. For example, your motherboard supports DDR first-generation memory, yet it can safely use the GeForce 9400GT, which has DDR2 memory, as long as your motherboard has the correct graphics interface (e.g. PCI Express or AGP).

The latest graphics cards use GDDR5 memory, yet they can be used on motherboards with DDR2 system memory. Basically, as I mentioned, as long as the card physically fits in your graphics card bus (slot) and your PSU (power supply) has enough power, you're fine.

A word on the GeForce 9400GT: it is quite old, and there are other similarly priced cards that can outperform it. For example, the "PALIT GT 430 1GB DDR3 DVI VGA HDMI Out PCI-E Graphics Card" ( http://www.ebuyer.com/product/241856 ) is one of the latest cards from NVIDIA and has some excellent specifications. These include a 700 MHz core clock and 1GB of DDR3 memory running at 800 MHz (1600 MHz DDR effective); with a 128-bit memory interface and 25.6 GB/sec of memory bandwidth, it is much better and faster than the GeForce 9400GT.

The PALIT GT 430 is also built on a 40nm GPU (graphics processing unit), which makes for a much cooler chip, meaning it runs quieter and requires a lot less power. In fact, it draws a maximum of only 50W, so it needs no additional power connectors; it should run fine on a good branded 350W PSU.
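If you're curious where that 25.6 GB/sec figure comes from, here is the standard peak-bandwidth arithmetic as a short Python sketch.

```python
# Peak memory bandwidth = effective memory clock x bus width, bits -> bytes.

def memory_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# GT 430 per the listing: 800 MHz DDR3 (1600 MHz effective), 128-bit bus.
print(f"{memory_bandwidth_gbs(1600, 128):.1f} GB/s")  # -> 25.6 GB/s
```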
I hope this has helped, Gopakumar; if you have any problems, let me know.
Good luck, mate!

What should I upgrade my HP Pavilion Elite M9150f graphics card to?

I want to buy a new graphics card for my current computer. The graphics card in my computer is the GeForce 8500GT, and it can't play any new games on good settings with a decent frame rate. So, not knowing much, I don't know what a good graphics card compatible with my computer would be. I need a graphics card for PC gaming. I want to be able to play games like Fable 3 and The Witcher 2 on pretty good settings at 1900x1200 resolution, or pretty close, as that's my monitor's native resolution. I'd also like to be able to play Diablo 3 when it comes out.

I've searched around to try and pick one, but all the weird numbering confuses me and I don't know the difference between one card and another. Another major problem is I don't know if a card would even be compatible with my computer. I should also mention that I have almost no experience with computer hardware; the only time I've switched anything inside my computer was to replace the hard drive, but that's not to say I couldn't figure out how to replace a graphics card.

So any help would be greatly appreciated, and if anyone would be so kind as to recommend a few graphics cards, that'd be great.
I'm also aware that I'll most likely have to replace my PSU; in that case, if someone could recommend a PSU to go with the GPU, that'd be great too. I forgot to mention that I have a budget of around $200 for both.

Thanks for any advice in advance and here's a url to all the info on my computer from HP:
http://h20000.www2.hp.com/bizsuppor...

Once again, thanks!

Oh, I almost forgot: I also need an HDMI out port, as that's what I'm currently using on my computer; however, a DVI port would be okay as well.

Can you burn out your graphics card by connecting an HDMI cord between your PC and HD TV?

Bullshit. The cable has nothing to do with the performance of your card. First off: the only thing it does is transfer the video rendered by your graphics card and put it on your screen. Nothing more.

However, the more stress you put on your card, the higher the chance it'll burn out. Take resolution, for example: the higher the resolution, the more pixels your card has to render. You might be going, "Oh, but the HDMI cable offers a higher resolution!" It can carry one, but again, the load comes from the resolution, not the cable.
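To put rough numbers on that, here is a quick Python comparison of how many pixels the card must render per second at different resolutions (assuming 60 fps throughout).

```python
# The cable just carries finished frames; the pixel count is what loads the GPU.

RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}
FPS = 60

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h * FPS / 1e6:.0f} megapixels/second")
# 4K pushes 4x the pixels of 1080p: that's what stresses the card,
# regardless of which cable delivers the result.
```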

You should tell your friend that he was wrong. If you buy a cheap cable, the cable itself might burn out, but never your graphics card.

My GPU has one HDMI output; how can I connect two monitors with (2x) HDMI inputs?

If you have two displays that are HDMI-only (meaning they don't support DVI or DisplayPort inputs), it would depend on the outputs your video card has.

First, if at least one of those displays does support DisplayPort and the video card has at least one DisplayPort output (regular or mini), just connect one via HDMI and the other via DisplayPort; they should be equal in video quality and you shouldn't notice a difference.

If DisplayPort is not an option on the display, get a DisplayPort to HDMI converter (either dongle or cable) and connect it from the DisplayPort port on the card to the HDMI input of the display.

The same goes for DVI and VGA; there are HDMI converters for both, in both dongle and cable forms. VGA is the worst-case scenario: a VGA to HDMI converter may require a powered converter of some sort, and the resolution may be less than desired.

If your PC is switched on, could you potentially damage a graphics card when unplugging an HDMI cable?

You could damage the port if you pull the cable in a direction it is not meant to go. That damage might make the card unusable, since it cannot present video output through that port anymore.

It's also possible, if you are carrying a lot of static electricity, that touching the cable sends a static charge into the computer, frying some small component somewhere and rendering it unusable. Most manufacturers are aware of this and provide cable shielding and grounding that make this less likely, but it's a possibility.

Otherwise, if you remove the cord with any degree of competence, it should have no effect on things at all.
