Shroomery Message Board: graphics card pcie 3.0 vs 2.0

desant (Offline)
Pleiadian Revolutionary
Male


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23644286 - 09/14/16 10:25 AM (7 years, 4 months ago)

Another question pops up



Will my GeForce 750 in a PCIe 2.0 slot be enough to pump HDMI 1.4 data to my TV? The figures I gathered - 10 Gbps - were measured on PCIe 3.0
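For what it's worth, the HDMI signal comes straight off the card's own output port, so PCIe slot speed and HDMI bandwidth are separate links. But even comparing the raw numbers directly (a rough sketch using nominal spec figures), a PCIe 2.0 x16 slot carries far more than HDMI 1.4's ~10 Gbps:

```python
# Back-of-the-envelope comparison (nominal spec figures, not measured throughput).
# PCIe 2.0: 5 GT/s per lane with 8b/10b line coding -> 4 Gbit/s usable per lane.
pcie2_lane_gbps = 5.0 * 8 / 10          # 4.0 Gbit/s per lane
pcie2_x16_gbps = pcie2_lane_gbps * 16   # 64.0 Gbit/s for a x16 slot

hdmi14_gbps = 10.2                      # approx. HDMI 1.4 max TMDS throughput

print(f"PCIe 2.0 x16: {pcie2_x16_gbps:.1f} Gbit/s")
print(f"HDMI 1.4:     {hdmi14_gbps:.1f} Gbit/s")
print("PCIe 2.0 slot is the bottleneck:", pcie2_x16_gbps < hdmi14_gbps)
```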


Edited by desant (09/14/16 01:33 PM)


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23644293 - 09/14/16 10:32 AM (7 years, 4 months ago)

So come on guys, do I need to buy a whole new PC, or will a new HDMI 2.0b graphics card plugged into PCIe 2.0 do the trick?


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23644578 - 09/14/16 12:57 PM (7 years, 4 months ago)

Basically my GeForce is designed for HDMI 1.4 UHD Windows desktop and cinema use, but it's built for PCIe 3.0 and I've got it stuck in PCIe 2.0


I hope it works


Edited by desant (09/14/16 01:28 PM)


Shroomism (Invisible)
Space Travellin
Male

Folding@home Statistics
Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23644724 - 09/14/16 02:02 PM (7 years, 4 months ago)

Quote:

desant said:
Yo dudes


Just found out my GeForce 750 (which I purchased a few months ago to solve the heat issue with my flight sim) has an oldish HDMI 1.4 port! :dudewtf:


I'll be getting a 4K monitor soon and I did some research: HDMI 1.4 runs 4K at a 24 Hz refresh rate, whereas HDMI 2.0 runs around 50-60 Hz with better colour depth too.


So the obvious choice is to get a better graphics card? Here's my question: earlier in the thread I said I only have a PCIe 2.0 slot, which works with my PCIe 3.0 GeForce. I want to know, will the PCIe 2.0 slot act like a bottleneck for 4K output if I install a top-of-the-range card? I know my PCIe 2.0 slot doesn't juice my GeForce 750 to the max, but will it provide enough bandwidth to run UHD on my TV?




Yes.

The PCIe 2.0 bandwidth is not the issue. It has more than enough bandwidth.
As I mentioned earlier in this thread (I think?), PCIe 3.0 is mainly just future-proofing; modern cards still aren't maxing out the bandwidth of PCIe 2.0, last I checked.

Your only major concern for a bottleneck with a better graphics card would be the CPU. If it's an older CPU, it may not be able to keep up with a high-end modern card. What CPU do you have again?
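The bandwidth headroom is easy to sanity-check from the nominal per-lane rates in the PCIe specs (a rough sketch; real-world throughput is somewhat lower due to protocol overhead):

```python
# Usable bandwidth from nominal transfer rate and line-code overhead.
def pcie_gbps(gt_per_s: float, payload_bits: int, line_bits: int, lanes: int = 16) -> float:
    """Usable bandwidth in Gbit/s after line-coding overhead."""
    return gt_per_s * payload_bits / line_bits * lanes

gen2 = pcie_gbps(5.0, 8, 10)     # PCIe 2.0: 8b/10b    -> 64 Gbit/s   (~8 GB/s)
gen3 = pcie_gbps(8.0, 128, 130)  # PCIe 3.0: 128b/130b -> ~126 Gbit/s (~15.75 GB/s)

print(f"PCIe 2.0 x16: {gen2:.0f} Gbit/s (~{gen2 / 8:.0f} GB/s)")
print(f"PCIe 3.0 x16: {gen3:.0f} Gbit/s (~{gen3 / 8:.2f} GB/s)")
```

So PCIe 3.0 roughly doubles the slot bandwidth, but a mid-range card like a GTX 750 doesn't come close to saturating even the 2.0 figure.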


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
    #23644786 - 09/14/16 02:26 PM (7 years, 4 months ago)

My CPU is an i7

Ok, so you're saying my GeForce should pull me through via PCIe 2.0?

But my GeForce is still HDMI 1.4, i.e. not optimised for 4K gaming because of the lower FPS, and like you said it's only 2 GB of RAM and you were saying you need 8?

Should I buy a more advanced card like a GeForce 1080 with 8 GB of memory? That's what I would have done from the start IF I HAD PCIe 3.0!!! I would simply upgrade my card and have 3-4 more years of stable graphics/gaming performance

But yeah, my GeForce is HDMI 1.4, whereas newer, more expensive cards are HDMI 2.0b



Anyhow, thanks for the help bud



Ps.
When I bought the GeForce 750 a few months ago I didn't do my research. I thought all HDMI ports were the same, and my buddy said so too, but I should have bought an HDMI 2.0b card, meh


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23644979 - 09/14/16 03:26 PM (7 years, 4 months ago)

I wonder if I should get a 1080...

I wonder how much that card's performance is affected by the lesser PCIe link...


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23646858 - 09/15/16 08:42 AM (7 years, 4 months ago)

Woop tee do....


Just had a chat with the Nvidia tech guys and they said there aren't gonna be any performance issues between PCIe 2.0 and PCIe 3.0


Gonna get an HDMI 2.0b card now


I wish I knew this sooner.... Spent £110 on a GTX 750 with HDMI 1.4 :feelsbadman:


Shroomism
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23652983 - 09/17/16 12:13 PM (7 years, 4 months ago)

What i7?

You could get a 1080 if you want.. I don't suspect you'd see a bottleneck with an i7 unless it's one of the earliest first-gen ones. If there is one, it would probably be minimal.

You can always get a high-end GPU and upgrade the CPU/motherboard later down the line if it's no longer enough.

Yeah, you're not gonna see any performance difference with a PCIe 3.0 card in a 2.0 slot. That may change in a few years as GPUs become more and more powerful, but right now it's not an issue.
And yeah, for 4K gaming you definitely need a lot of horsepower in the GPU department, and lots of VRAM is ideal, especially if you want to do surround gaming at some point.


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
    #23657924 - 09/19/16 06:22 AM (7 years, 4 months ago)

I believe my i7 is 950


First or second gen, still pretty good


Thanks for the help, dude :smile:


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23657927 - 09/19/16 06:26 AM (7 years, 4 months ago)

While we're on the subject, I noticed all the new graphics cards have got DisplayPort

A technology that's been around for years apparently.

What do you know about it?

I hear it's used for multiple monitors, and its UHD refresh rate is 120 Hz compared to the 60 Hz of HDMI 2.0b

Also the 4K TV I'm getting to use with my rig doesn't have a DisplayPort :shrug:
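The refresh-rate difference mostly comes down to link bandwidth. A rough sketch of the raw pixel-data rates (8-bit RGB, ignoring blanking overhead, so a real link needs somewhat more); note that 4K @ 120 Hz is beyond DisplayPort 1.2 too and needs DP 1.3 or later:

```python
# Raw pixel-data rate for a given mode (bits of pixel data only, no blanking).
def pixel_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * hz * bits_per_pixel / 1e9

for hz in (24, 60, 120):
    print(f"3840x2160 @ {hz:3d} Hz: {pixel_gbps(3840, 2160, hz):6.2f} Gbit/s")

# For comparison: HDMI 1.4 carries ~10.2 Gbit/s, HDMI 2.0 ~14.4 Gbit/s of video
# data (18 Gbit/s raw TMDS), DisplayPort 1.2 ~17.28 Gbit/s.
```

That is why HDMI 1.4 tops out around 4K @ 24-30 Hz while HDMI 2.0 manages 4K @ 60 Hz.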


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23658185 - 09/19/16 09:00 AM (7 years, 4 months ago)

Ohh, and if I have to go the full-upgrade route I'd have to buy not just a new card but a new mobo, a new CPU AND new DDR4 memory (cos my current memory is DDR3)


Shroomism
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23667067 - 09/22/16 03:03 AM (7 years, 4 months ago)

Correct. If you upgrade to a current gen platform.

DisplayPort has been around for a while now, and is what you want to use for a 4K monitor.

Don't get a TV for gaming.. in general.. they tend to have high input lag.


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
    #23667504 - 09/22/16 09:25 AM (7 years, 4 months ago)

Quote:

Shroomism said:

Don't get a TV for gaming.. in general.. they tend to have high input lag.





You think I haven't thought this through? :wink:



Lag is mostly common with LG TVs..... the expensive ones are fine :awesome:


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
    #23667509 - 09/22/16 09:26 AM (7 years, 4 months ago)

Quote:

Shroomism said:

Displayport has been around for a while now, and is what you want to use for a 4k monitor 





Are there 4K PC monitors around?? I thought 4K was only a TV standard


Edited by desant (09/22/16 12:23 PM)


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23670003 - 09/23/16 03:38 AM (7 years, 4 months ago)

There are no DisplayPorts on 4K TVs


So come on, a 4K computer monitor? Dude, even 1080p on a PC monitor is plenty. Seriously, look at a 60 inch HD TV: it looks cool, and you do need a 60 inch screen to fully appreciate high definition. But 4K on, say, a 26 inch PC monitor? Dude, that's just wrong. You'd need a 60+ inch monitor to fully appreciate 4K


Shroomism
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23685480 - 09/28/16 05:46 AM (7 years, 4 months ago)

HDMI or DisplayPort. Both will be on any good monitor. DisplayPort carries more data; you'll only find it on expensive TVs

Dude, do whatever you want. But I'm telling you most TVs have a high level of input lag which is not good for most PC gaming as well many have issues with ghosting/motion blur which you will not get with a monitor. A good monitor will be of higher quality than a good TV and have a higher pixel density. Response times, refresh rates, input lag, different resolutions... all better on monitors. Also...G-Sync.

I don't think you know what you're talking about when you say you need a 60 inch screen to appreciate high definition. I can appreciate high definition just fine on a huge 34" 4K monitor 2 feet away.

If you shop around you can find a TV with low input lag.. and while it may be satisfactory for most games, and a TV may be great for a nice huge screen, have fun reading text and little UI fragments in the corner of the screen, not to mention all the above. Also many are capped at 60 Hz. All HDTVs have video processing which adds lag unless the set has a "Game mode" or "PC mode".

Unless you buy a really good TV, with a low level of input lag. But even then - a "low" level of input lag on a TV in a best case scenario is like ~25ms from what I've seen (but much more commonly ~50-120+ms) where on a gaming monitor it's more like 8ms. 25ms is "manageable" and may not even be noticeable for casual gaming.. but it will make a difference in really twitch games.

A TV is fine for casual gaming, but for any serious gaming a monitor will always outperform it. And they suck for reading text IME.
HDTVs HAVE come a long way in the past few years.. it used to be a LOT worse.. Not saying it's absolutely terrible, but I've tried it several times with several different TVs, and while it's novel at first it wears off fast and the cons start to show. I will always prefer a good monitor for PC gaming.

I mean do what you like, I'm just warning you of possible drawbacks for PC gaming.
If you do get a TV make sure you do your homework and get one with a low input lag, which isn't always easy to find because TV manufacturers usually don't publish that data. 
And make sure it supports 4:4:4 chroma

Also not sure where you get 26".. lots of 4K monitors are like 34".
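The pixel-density point is easy to put numbers on (a quick sketch; perceived sharpness also depends on viewing distance, and you sit much closer to a monitor than a TV):

```python
import math

def ppi(diag_inches: float, w_px: int, h_px: int) -> float:
    """Pixels per inch from diagonal size and resolution."""
    diag_px = math.hypot(w_px, h_px)  # diagonal length in pixels
    return diag_px / diag_inches

print(f'60" 1080p TV:   {ppi(60, 1920, 1080):5.1f} PPI')
print(f'34" 4K monitor: {ppi(34, 3840, 2160):5.1f} PPI')
print(f'26" 4K monitor: {ppi(26, 3840, 2160):5.1f} PPI')
```

A 34" 4K monitor packs more than three times the pixel density of a 60" 1080p TV, which is why 4K is very visible on a desk-distance monitor.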


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
    #23685534 - 09/28/16 06:31 AM (7 years, 4 months ago)

Just checked the specs of the TV I'm getting, game mode - 8ms!  :gethigh:


Shroomism
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23688680 - 09/28/16 10:05 PM (7 years, 3 months ago)

Damn that's pretty good. I didn't know they even made TVs with input lag that low now.. last I checked the BEST ones on the market were around 15-20ms.. let me guess Samsung or Sony or Panasonic?

There's the other things to be aware of too... but input lag is going to be your biggest worry with a HDTV. If that's good and it has good reviews for gaming then go for it I guess if that's what you want.

I'd much sooner steer you towards a badass 4k gaming monitor like the Dell P2715Q or the BenQ BL3201PH IPS monitor or the LG S34E790C  or the curved 34" Samsung S34E790C which is fucken amazing.. but whatever floats your boat. :shrug:

Also be aware that Adaptive Sync will NOT function on a TV. Like.... at all.
Nor G-sync... If I were buying a top-end GPU and a 4K monitor right now, I would certainly be looking for a monitor that supports G-sync/FreeSync, since it's almost become a normal feature on a lot of 4K monitors without paying a ton extra; the tech is out of its infancy so it's not uber expensive. You're gonna be missing out man.. just don't say nobody warned you.. a monitor is going to give you much more bang for your buck and a better overall experience IMO :shrug:
Also you are going to be limited to 60 Hz @ 4K with all TVs. Even if it says it's 120 Hz.. it's not a true 120 Hz.. it fakes the extra frames to give the illusion of it. Due to the limitations of HDMI, the max refresh rate you are going to get with a TV is 60 Hz. Want 120 FPS? Well, too bad. You can try overclocking your TV, but that's obviously at your own risk, and the most I've seen people get is ~100 Hz. Skipped frames are almost guaranteed.
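To put input lag in perspective, it helps to express it in frames at 60 Hz (a quick sketch using the rough figures mentioned in this thread):

```python
# Express input lag as a number of displayed frames at a given refresh rate.
def frames_of_lag(lag_ms: float, hz: float = 60.0) -> float:
    frame_time_ms = 1000.0 / hz  # ~16.7 ms per frame at 60 Hz
    return lag_ms / frame_time_ms

for label, lag in [("gaming monitor", 8), ("good TV game mode", 25), ("typical TV", 50)]:
    print(f"{label:18s}: {lag:3d} ms = {frames_of_lag(lag):.2f} frames @ 60 Hz")
```

So a "good" 25 ms TV is still a frame and a half behind your inputs, and a typical 50 ms set is a full three frames behind.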

TVs have come a long way for sure in the past few years.. but I still say when it comes to gaming, you can put the best TV vs the best monitor head to head and the monitor will win in pretty much every category except screen size.
There's a reason pretty much every single professional PC gamer uses a monitor. If TVs were better then everyone would be using them. It's "fun" and "immersive" but there are real downsides.

If none of this bothers you in the slightest and you don't have a single fuck to give then by all means.. charge forth! I'm just trying to help a brotha out.
If you are still set on a TV then only get one of these - https://hardforum.com/threads/4k-60hz-4-4-4-hdmi-2-0-tv-database.1837209/




desant
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
    #23689363 - 09/29/16 06:11 AM (7 years, 3 months ago)

Interesting vid

IMO, you don't NEED 4K resolution on a PC MONITOR!!!!! Come on, I've been playing Deus Ex and Far Cry 4 on a simple 26" HD iiyama monitor for years at max res, and I'm telling you I have no need for anything higher than HD, 1920x1080, when it comes to gaming. The graphics are SUPERB as they are :shrug:


desant
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
    #23689371 - 09/29/16 06:15 AM (7 years, 3 months ago)

4K TVs are quite different.


There's upscaling of Blu-rays, there's amazing full-blown soft porn on the Windows desktop. Organising bits and pieces is very easy. Doing work in my DAW is a breeze, and there's SOMETHING about proper UHD TVs that computer monitors lack. I was told the TV I'm getting is built to reproduce video the way film directors and producers "meant" it to look :wink:


Edited by desant (09/29/16 06:23 AM)


Copyright 1997-2024 Mind Media. Some rights reserved.