|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23689379 - 09/29/16 06:22 AM (7 years, 3 months ago) |
|
|
Also, Shroomism buddy, go to your good electronics retailer and have a look at their selection of TVs. About 50 different TVs in my store. Take a look at the "4K" TVs that cost under a grand... They look good, but ONLY if you don't know better!!
Now look at the TVs that span $3,000-4,000... The difference in quality is astonishing. The picture quality of those TVs makes you want to touch and lick the screen, whilst making cheap budget TVs look like expensive calculators.
--------------------
Edited by desant (09/29/16 07:02 AM)
|
Oggy
Stranger Danger


Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23694266 - 09/30/16 02:51 PM (7 years, 3 months ago) |
|
|
Not sure if your questions have been answered, but PCIe cards are backwards compatible; however, performance will suffer greatly. You will need to upgrade your motherboard to something that supports 3.0 or greater if you want to take full advantage of the hardware. The difference lies in the amount of data (bandwidth) that can be pushed to and from the card. A lot of the time this isn't an issue, because games and such don't utilize all of that potential bandwidth, usually because of bottlenecks like the CPU or RAM when pushing vertex buffers to the graphics card.
I believe the only place that will utilize it fully will be high-resolution monitors and multi-monitor setups.
[edit] Read a little bit of this thread and noticed your concerns over temperatures. 100°C on AMD hardware is nothing. Don't worry about it. If anything gets too hot, the system will shut down automagically.
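If you want rough numbers for the bandwidth difference, here's a quick back-of-the-envelope sketch using the nominal per-lane link rates and line-code overheads (my arithmetic, not an official spec excerpt; real-world throughput is lower):

```python
# Nominal usable bandwidth of an x16 PCIe slot, by generation.
# PCIe 2.0: 5 GT/s per lane, 8b/10b line code (20% encoding overhead).
# PCIe 3.0: 8 GT/s per lane, 128b/130b line code (~1.5% overhead).

def lane_bytes_per_sec(gigatransfers, payload_bits, encoded_bits):
    """Usable bytes/sec for one lane after line-code overhead."""
    return gigatransfers * 1e9 * (payload_bits / encoded_bits) / 8

pcie2_x16 = 16 * lane_bytes_per_sec(5, 8, 10)     # 8.0 GB/s
pcie3_x16 = 16 * lane_bytes_per_sec(8, 128, 130)  # ~15.75 GB/s

print(f"PCIe 2.0 x16: {pcie2_x16 / 1e9:.2f} GB/s")
print(f"PCIe 3.0 x16: {pcie3_x16 / 1e9:.2f} GB/s")
```

So a 3.0 slot has roughly double the bandwidth of a 2.0 slot; whether anything actually fills either one is the real question.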
--------------------
Edited by Oggy (09/30/16 02:59 PM)
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23696282 - 10/01/16 04:29 AM (7 years, 3 months ago) |
|
|
Quote:
Oggy said: Not sure if your questions have been answered, but PCIe cards are backwards compatible; however, performance will suffer greatly. You will need to upgrade your motherboard to something that supports 3.0 or greater if you want to take full advantage of the hardware. The difference lies in the amount of data (bandwidth) that can be pushed to and from the card. A lot of the time this isn't an issue, because games and such don't utilize all of that potential bandwidth, usually because of bottlenecks like the CPU or RAM when pushing vertex buffers to the graphics card.
I believe the only place that will utilize it fully will be high-resolution monitors and multi-monitor setups.
[edit] Read a little bit of this thread and noticed your concerns over temperatures. 100°C on AMD hardware is nothing. Don't worry about it. If anything gets too hot, the system will shut down automagically.
Dude, that is the exact opposite of what everyone said in this thread!
Shroomism here said it won't matter much, since cards don't utilise the full PCIe 3.0 bandwidth yet.
The same thing was said to me on the nvidia message board, AND the same thing was said by an nvidia technician in an online chat.
PLEASE don't spread incorrect information, this is very important.
--------------------
|
Oggy
Stranger Danger


Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23697652 - 10/01/16 04:21 PM (7 years, 3 months ago) |
|
|
/me puts the gloves on
Alright, time for some learnin's.
In the modern graphics pipeline (both OpenGL and DirectX), the program pushes state changes to the GPU in the form of vertex object buffers (e.g. VBO/VAO). The obvious bottleneck here is the CPU and RAM, unless the GPU is very old. Pushing state changes to an object buffer relies heavily on the CPU! Then you have the framebuffer, which contains the entire frame and possibly more, which gets sent off to the display, where it in turn must be translated by special driver hardware in the TV or monitor.
Bandwidth is a combination of pushing the vertex buffers/state changes to the GPU, recalling the data by the program, and sending it to the monitor/TV for processing by the hardware driver. It's not really an issue at all for a GPU though, like I said. And here is why: a single 4K RGBA 8bpc frame is only around 32MB!
Basically, PCIe 2.0 is absolutely fine for 4K streaming, even for games. It's all about future-proofing, really.
It sounds like you are scolding me for saying the exact same thing everyone else has said? The part where I say performance will suffer greatly is absolutely true, because you cannot utilize the bandwidth PCIe 3.0 offers on a 2.0 board. Assuming you can even reach that bandwidth cap.
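To sanity-check that ~32MB frame figure from above, here's the arithmetic, assuming 4 bytes per pixel (RGBA at 8 bits per channel):

```python
# One uncompressed 4K frame at RGBA, 8 bits per channel (4 bytes/pixel).
width, height, bytes_per_pixel = 3840, 2160, 4

frame_bytes = width * height * bytes_per_pixel
print(frame_bytes)                       # 33177600 bytes
print(f"{frame_bytes / 2**20:.1f} MiB")  # 31.6 MiB, i.e. roughly 32MB
```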
--------------------
Edited by Oggy (10/01/16 04:29 PM)
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23697894 - 10/01/16 05:59 PM (7 years, 3 months ago) |
|
|
Assuming you can reach the bandwidth cap...
But people say modern cards do not use the full bandwidth of PCIe 3.0!! So PCIe 2.0 is sufficient for now. So really it's an issue of "future proofing" more than anything else!!
I wonder what Shroomism will say
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23698246 - 10/01/16 07:57 PM (7 years, 3 months ago) |
|
|
Quote:
Oggy said: /me puts the gloves on
Alright, time for some learnin's.
In the modern graphics pipeline (both OpenGL and DirectX), the program pushes state changes to the GPU in the form of vertex object buffers (e.g. VBO/VAO). The obvious bottleneck here is the CPU and RAM, unless the GPU is very old. Pushing state changes to an object buffer relies heavily on the CPU! Then you have the framebuffer, which contains the entire frame and possibly more, which gets sent off to the display, where it in turn must be translated by special driver hardware in the TV or monitor.
Bandwidth is a combination of pushing the vertex buffers/state changes to the GPU, recalling the data by the program, and sending it to the monitor/TV for processing by the hardware driver. It's not really an issue at all for a GPU though, like I said. And here is why: a single 4K RGBA 8bpc frame is only around 32MB!
Basically, PCIe 2.0 is absolutely fine for 4K streaming, even for games. It's all about future-proofing, really.
It sounds like you are scolding me for saying the exact same thing everyone else has said? The part where I say performance will suffer greatly is absolutely true, because you cannot utilize the bandwidth PCIe 3.0 offers on a 2.0 board. Assuming you can even reach that bandwidth cap.
The point is that almost NOTHING right now uses up all the bandwidth on PCIe 2.0 anyway, not games anyway. Maybe a handful of specialized programs. Thus no bottleneck, since the data pushes through 2.0 just fine. In all the tests I have seen of a PCIe 3.0 GPU in a PCIe 2.0 slot vs a 3.0 slot, the difference is like 0-4%, which is within the statistical margin of error and not even noticeable... certainly not a major bottleneck by any stretch.
Yes, it's more for future-proofing, but as it stands now there really isn't any noticeable loss if you use a 3.0 card in a 2.0 slot for 99% of applications. Will that likely change in the future as games and hardware evolve and become even more uber powerful? Almost without a doubt. But right now it's not really an issue.
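To put that in rough numbers (a back-of-the-envelope sketch using the nominal 8 GB/s of a 2.0 x16 link and the ~32MB raw frame size; real PCIe traffic is mostly textures and vertex data, not whole frames, so this is purely illustrative):

```python
# Rough headroom check: how many raw 4K frames per second could a
# PCIe 2.0 x16 link move at its nominal bandwidth?
pcie2_x16_bytes = 8e9          # nominal PCIe 2.0 x16 bandwidth, bytes/sec
frame_bytes = 3840 * 2160 * 4  # one 4K RGBA 8bpc frame, ~32MB

frames_per_sec = pcie2_x16_bytes / frame_bytes
print(f"~{frames_per_sec:.0f} raw 4K frames/s of headroom")  # ~241
```

Hundreds of raw 4K frames per second of headroom, which is why the measured gap between 2.0 and 3.0 slots is so small today.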
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23698258 - 10/01/16 08:02 PM (7 years, 3 months ago) |
|
|

I hope you're right
--------------------
|
Oggy
Stranger Danger


Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23698559 - 10/01/16 10:10 PM (7 years, 3 months ago) |
|
|
That's exactly what I was saying.  Though, bottlenecks are definitely real issues elsewhere in hardware, and they can carry over to the GPU as I stated in my previous post, hence the term bottleneck.
 https://en.wikipedia.org/wiki/Bottleneck_(engineering)
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23699449 - 10/02/16 08:28 AM (7 years, 3 months ago) |
|
|
Quote:
Oggy said: That's exactly what I was saying. 
Ahh no, you said "performance will suffer greatly".
--------------------
|
Oggy
Stranger Danger


Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23700161 - 10/02/16 01:03 PM (7 years, 3 months ago) |
|
|
Wow
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23700431 - 10/02/16 02:18 PM (7 years, 3 months ago) |
|
|
Well... you did
Quote:
Oggy said: Not sure if your questions have been answered, but PCIe cards are backwards compatible; however, performance will suffer greatly.
It's all semantics at this point lol. I understand your point, but I think you can see where the confusion is from. I think you maybe just didn't word it right; I was confused by this statement too at first. In the future, performance probably will suffer greatly. Like using a GTX 1080 on a Pentium 4 or something, that CPU will be the bottleneck of bottlenecks. But right now, PCIe 2.0 isn't really a bottleneck.
I worked for one of the major GPU companies for a while, and I don't think I've ever seen a PCIe slot be a major bottleneck... the CPU will generally be the bottleneck long before that. It was the same thing back when PCIe 1.0 was the standard and 2.0 cards started becoming popular. We'd have people asking all the time if it would work. Yes, it's backwards compatible; no, you probably won't see a major bottleneck, and if you do, it's usually the CPU that can't keep up with the GPU. Now with modern systems, it's usually a mechanical HDD that is the major bottleneck, if that's your primary drive.
However if I was building a system right now I would most certainly get a 3.0 mobo for future-proofing. Who knows how long before it starts to become a big issue. But games/software and hardware are becoming more and more powerful every day. So it's a good idea.
--------------------
|
CosmicJoke
happy mutant


Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23703954 - 10/03/16 02:50 PM (7 years, 3 months ago) |
|
|
Dunno how this thread is still alive, but everything Shroomism has said is true. If you're experiencing a bottleneck, your biggest issue from what I recall of your hardware is that you have a first-gen i7... The second-gen Sandy Bridge was like an enormous 33% jump in performance... Now Sandy is showing its age a little, but it's actually not all that bad, just no longer in the top tier... AMD doesn't even have anything close to an Ivy Bridge processor, and that's around the time PCIe 3.0 motherboards started hitting the market... The reality is, if you upgrade to a modern-day processor with any decent board and a mid-range nvidia card, you will have PCIe 3.0... Really the only future-proofing, as far as I'm concerned, is being able to swap in a faster card if you own an i5 or i7 rig right now. Get a nice PSU that can handle a power-hungry card, so in case you want to go for a dope-ass graphics card some day down the line it's as easy as taking your old one out and popping your new one in.
Don't even think about 4K gaming; fuck, it takes a reasonable amount of juice just to get QHD (1440p) gaming running smooth as butter... My GTX 680 can't do it on higher-end games without usually knocking down a setting or two, but it still plays most games at 1080p smooth as butter with all the knobs turned up... That's just the reality of pushing nearly 80% more pixels to get to 1440p; pushing even more pixels to get to 4K is just silly imho... unless you're a seriously rich bitch, but even then I'd rather just have smooth-as-butter performance... I will update when a game I want to devote endless hours to comes out... Right now I'm kind of looking at Obduction, made by the makers of Myst and Riven; looks like a pretty fun world to explore with lots of eye candy and some oldschool puzzle solving.
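For what it's worth, here's the raw pixel math behind that (my arithmetic):

```python
# Pixel counts for the resolutions being compared.
pixels = {
    "1080p": 1920 * 1080,  # 2,073,600
    "1440p": 2560 * 1440,  # 3,686,400
    "4K":    3840 * 2160,  # 8,294,400
}

extra_1440p = pixels["1440p"] / pixels["1080p"] - 1
print(f"1440p has {extra_1440p:.0%} more pixels than 1080p")       # ~78%
print(f"4K is {pixels['4K'] / pixels['1080p']:.0f}x 1080p's pixels")  # 4x
```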
Oh, and the expensive TVs you are likely thinking of are OLED, those fuckers pop like nothing I've ever seen before. I'm sure there's a 4k OLED TV by Samsung by now, but meh, I'd love to get my hands on a 1080p OLED TV without breaking the bank.
-------------------- Everything is better than it was the last time. I'm good. If we could look into each others hearts, and understand the unique challenges each of us faces, I think we would treat each other much more gently, with more love, patience, tolerance, and care. It takes a lot of courage to go out there and radiate your essence. I know you scared, you should ask us if we scared too. If you was there, and we just knew you cared too.
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23705724 - 10/04/16 04:20 AM (7 years, 3 months ago) |
|
|
What do you mean, "those fuckers pop like nothing"?
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23705831 - 10/04/16 06:28 AM (7 years, 3 months ago) |
|
|
This is the TV I like: http://www.trustedreviews.com/panasonic-tx-65dx902-review
http://www.panasonic.com/uk/consumer/viera-televisions/led/tx-65dx902b.html
As you can see it is not an "OLED" TV but something called "honeycomb" backlight technology...
It might not be OLED, but they say it's the best TV around.
Of course I haven't seen it IRL yet, but I'm going to drive to the nearest location to check it out.
--------------------
Edited by desant (10/04/16 06:35 AM)
|
CosmicJoke
happy mutant


Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23705860 - 10/04/16 06:55 AM (7 years, 3 months ago) |
|
|
Black level quality, contrast, color accuracy, motion resolution (sharpness), off-axis performance, screen uniformity... HDR/WCG performance, overall day (high ambient light) and overall night (low ambient light) performance.
I think LG probably makes the best TV on the market atm... check out the LG OLED65G6P.
Pricey motherfucker...
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23705942 - 10/04/16 07:50 AM (7 years, 3 months ago) |
|
|
I had a 60" LG plasma TV 2 years ago. Out of all the market leaders (Sony, Panasonic, Samsung, LG), LG I hear has the worst input lag. Also there are unconfirmed reports of "cancer bots" installed in them, and they have a history of even messing with your WiFi and shit.
--------------------
|
CosmicJoke
happy mutant


Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23705971 - 10/04/16 08:05 AM (7 years, 3 months ago) |
|
|
I meant sheer picture quality. Anyways, every brand makes really high-quality TVs and really poor TVs; it's all about the model... Some brands may have better customer support when it comes to warranties than others, and that's something to factor in... Input lag is going to be determined largely by the technology of the display; for example, IPS displays generally have worse input lag than a TN display, but again, the screens pop... A competitive gamer should just get a TN display...
I'm sure LG has had a bad batch of IPS displays that maybe had some image persistence; perhaps that's what you're talking about with cancer blots... That was fixed in a second revision or so on... It's always a good reason not to be on the bleeding edge of tech: you pay to beta test somebody else's shit... Second or third gen and the problems are addressed.
But in terms of screen quality, you have to factor in that LG & Samsung are making epic fuck-tons of screens for laptops and monitors of all brands... It's not like Dell manufactures the screens in their Ultrasharps; they use LG panels... There's also just backlight bleed and IPS glow and shit to deal with that can look pretty bad; maybe that's what you meant by blot... Occasionally a cluster of dead pixels might get by. Sometimes you have to just complain and have them replace your display until you get a "good" one; it's really a luck-of-the-draw kind of thing.
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23706182 - 10/04/16 09:38 AM (7 years, 3 months ago) |
|
|
LG might be making computer monitors, but that doesn't mean their TVs are up to scratch.
But I agree with ya that both brands have good TVs and bad ones
Me and my buddy here agree upon this:
Best TV brands
1. Panasonic 2. Sony* 3. Samsung 4. LG
Now, like I mentioned, the dude at the store said Sony are the best BECAUSE all movie-making equipment is Sony hardware: cameras, sound, post-processing and mastering... it's all Sony, apparently, so logic says Sony TVs will fit right in and outperform the others.
--------------------
|
CosmicJoke
happy mutant


Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23706242 - 10/04/16 09:57 AM (7 years, 3 months ago) |
|
|
I'm just talking about pure picture quality here, for watching films... None of the Smart TV features (I mean, get a HTPC or something) or input lag for gaming was on my mind... OLED TVs look fucking amazing; you just gotta go see one in person, especially if you can see the model I mentioned... Anyways, I pay attention to AVS Forums, and the Value Electronics TV Shootout folks are some serious fucking TV snobs, so I really think it's worth paying attention to: for three consecutive years OLED has taken the crown... I just mean look at one properly calibrated in person and you be the judge, is all I'm saying. Panasonics were way better years ago, when plasma brought the best contrast in particular; I mean they had black levels that an LCD screen just couldn't compete with...
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23706253 - 10/04/16 10:01 AM (7 years, 3 months ago) |
|
|
Gonna go to the store tomorrow and see what OLED they got
--------------------
|
|