|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
graphics card pcie 3.0 vs 2.0
#23454998 - 07/19/16 04:48 AM (7 years, 6 months ago) |
|
|
I've been told to upgrade my graphics card
Only I'm not sure if a PCIe 3.0 card will be fine on an old PCIe 2.0 mobo
The tech guy at my shop said PCIe 2.0 and 3.0 are backwards compatible. What I want to know is whether performance will be affected?
--------------------
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23455151 - 07/19/16 06:53 AM (7 years, 6 months ago) |
|
|
From what I gather online, the performance difference between 2.0 and 3.0 is insignificant
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23456022 - 07/19/16 12:11 PM (7 years, 6 months ago) |
|
|
Yes, it's backwards compatible. No, you won't have a performance issue. You just won't get PCIe 3.0 bandwidth; you'll be limited to 2.0.
--------------------
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23456176 - 07/19/16 01:02 PM (7 years, 6 months ago) |
|
|
They say PCIe 3.0 is twice as fast as PCIe 2.0
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23456437 - 07/19/16 02:30 PM (7 years, 6 months ago) |
|
|
Theoretically. Twice the potential bandwidth and double the data transfer rate. But we aren't there yet; that headroom is to leave room for the future. It's what the slot is CAPABLE of, not what it's actually pushing.
In most tests I've seen with current tech, the performance difference between a PCIe 3.0 GPU in a PCIe 3.0 slot vs a 2.0 slot is roughly 5-10%. Very minimal gains on 3.0.
In the future that will change. But current video cards aren't even close to powerful enough to max out the bandwidth. As future video cards keep getting more and more powerful, we'll start to see a much bigger performance difference.
Right now it's not really an issue. But if you're building a new system, get PCIe 3.0 for future proofing.
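A quick sketch of the arithmetic behind "twice the bandwidth" (the per-lane rates and encoding overheads are the usual PCIe 2.0/3.0 spec figures; these are theoretical ceilings, not real-world throughput):

```python
# Theoretical x16 slot bandwidth for PCIe 2.0 vs 3.0.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (80% efficient).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~98.5% efficient).

def pcie_bandwidth_gbps(gt_per_s, payload_bits, line_bits, lanes=16):
    """Usable bandwidth in GB/s for a PCIe link."""
    efficiency = payload_bits / line_bits
    return gt_per_s * efficiency / 8 * lanes  # GT/s -> GB/s per lane, times lanes

v2 = pcie_bandwidth_gbps(5, 8, 10)      # -> 8.0 GB/s
v3 = pcie_bandwidth_gbps(8, 128, 130)   # -> ~15.75 GB/s
print(f"PCIe 2.0 x16: {v2:.2f} GB/s, PCIe 3.0 x16: {v3:.2f} GB/s, ratio {v3 / v2:.2f}x")
```

So 3.0 is ~1.97x 2.0 in theory, which is where the "twice as fast" headline comes from; whether a given card ever pushes that much data across the slot is a separate question.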
--------------------
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23456494 - 07/19/16 02:53 PM (7 years, 6 months ago) |
|
|
Wow thanks dude, I didn't know that!
--------------------
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23458750 - 07/20/16 08:09 AM (7 years, 6 months ago) |
|
|
While we're on the subject, what's a good free 3D card benchmark program??
I'm going to benchmark my Radeon vs my GeForce to see how they compare
--------------------
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23459049 - 07/20/16 09:52 AM (7 years, 6 months ago) |
|
|
Never mind, I'll just buy 3DMark
--------------------
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23459161 - 07/20/16 10:32 AM (7 years, 6 months ago) |
|
|
Got 3DMark
Looks impressive
Got an 8518 score on the "Sky Diver" benchmark.
One thing of concern: GPU temp reached 80°C!

Could this be the reason X-Plane doesn't like my Radeon??
Might well be! I notice the GPU temp STEADILY climbing!! Give it an hour and it'll touch 90-100!
--------------------
Edited by desant (07/20/16 11:40 AM)
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23459914 - 07/20/16 03:24 PM (7 years, 6 months ago) |
|
|
Btw, do I need to uninstall the old graphics card's drivers before booting the PC with the new card?
--------------------
|
Ythan
ᕕ( ᐛ )ᕗ


Registered: 08/08/97
Posts: 18,774
Loc: NY/MA/VT Borderlands
Last seen: 17 minutes, 56 seconds
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23459994 - 07/20/16 03:58 PM (7 years, 6 months ago) |
|
|
Nah, go ahead and leave it installed for now; it won't cause any problems, and if you have issues with the new card and need to switch back to the old one it'll simplify things. Once you have the new card working, you may want to uninstall the old drivers if they're from a different manufacturer, to get rid of the associated software and avoid unnecessary updates. I.e. there's no need to have the AMD Control Panel still installed if you're using an nVidia card. However, technically it shouldn't hurt to just leave it.
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Ythan]
#23460095 - 07/20/16 04:35 PM (7 years, 6 months ago) |
|
|
Thnx for the tip dude
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23460792 - 07/20/16 09:10 PM (7 years, 6 months ago) |
|
|
--------------------
|
CosmicJoke
happy mutant


Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23462997 - 07/21/16 03:39 PM (7 years, 6 months ago) |
|
|
I told you heat issues were the #1 concern with that game crashing just by looking at its troubleshooting FAQ; temps in excess of 80°C pushing 100°C could very well be bad news if the game keeps your GPU at 100% load..... NVIDIA cards generally have lower power consumption all around and run cooler... A new video card is a good bet; otherwise setting up a fan curve or something to that effect may keep things cooler.... Of course, making sure you have a really solid PSU to go with your GPU is always a wise idea....
-------------------- Everything is better than it was the last time. I'm good. If we could look into each others hearts, and understand the unique challenges each of us faces, I think we would treat each other much more gently, with more love, patience, tolerance, and care. It takes a lot of courage to go out there and radiate your essence. I know you scared, you should ask us if we scared too. If you was there, and we just knew you cared too.
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23465219 - 07/22/16 08:31 AM (7 years, 6 months ago) |
|
|
Right
I just spent a few hours flying with my new card
It runs smoothly around 57-58°C with a brief peak temp of 62°C
All good
I figure if your GPU and CPU run under 70°C you're good, am I right?
--------------------
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23465232 - 07/22/16 08:35 AM (7 years, 6 months ago) |
|
|
Correction, my GPU is now at 65-68°C!!
Edit: now it's 60-63!
--------------------
Edited by desant (07/22/16 09:03 AM)
|
desant
Pleiadian Revolutionary


Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23465627 - 07/22/16 10:59 AM (7 years, 6 months ago) |
|
|
Danger Danger! High Voltage!
Check out these stats:

Why is CPUTIN 105°C??
What IS CPUTIN?? I imagine it's been like this for years, so no problem?
--------------------
|
CosmicJoke
happy mutant


Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23466569 - 07/22/16 04:05 PM (7 years, 6 months ago) |
|
|
Well, the TCASE is 67.9°C on your processor. CPUTIN is showing you the current value, the min, and the max of basically a sensor on your motherboard near your CPU that has been picking up its temperature since you began running CPUID..... It really shouldn't exceed 67.9°C.... I have reason to believe that sensor might be faulty, or the software is just being flaky, because your core temps have a good spread and don't seem dangerously high. Your core temps are read from sensors within the CPU itself and are generally considered reliable..... I'd double-check with some other software though, maybe Real Temp.... I'm pretty sure your computer would just shut off to protect itself if it really hit that temp. Had you been gaming with CPUID running in the background picking up these #s, or using Prime95 or anything? Also, don't run multiple temp-reading programs simultaneously; close down CPUID before running anything like Real Temp or Core Temp, just use one. If it was a custom build, your motherboard may have come with its own software for reading its sensors, so definitely make sure you don't have anything like that running in the background.
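That reasoning can be sketched as a tiny sanity check (the 67.9°C Tcase comes from the post above; the 15°C margin is an illustrative threshold, not from any datasheet):

```python
# Sketch of the sanity check above: a socket-area sensor reading far above
# the CPU's Tcase spec while the core sensors read normal is probably bogus.
# The margin is illustrative, not from any datasheet.

def sensor_looks_faulty(board_temp_c, core_temps_c, tcase_c=67.9, margin=15):
    """True if the board sensor is implausibly high while core temps look sane."""
    cores_ok = max(core_temps_c) < tcase_c + margin
    return board_temp_c > tcase_c + margin and cores_ok

print(sensor_looks_faulty(105, [45, 48, 51, 47]))  # True: CPUTIN reading 105 is suspect
print(sensor_looks_faulty(60, [45, 48, 51, 47]))   # False: plausible reading
```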
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23644021 - 09/14/16 08:06 AM (7 years, 4 months ago) |
|
|
Yo dudes
Just found out my GeForce 750 (which I purchased a few months ago to solve the heat issue with my flight sim) has an oldish HDMI 1.4 port!
I'll be getting a 4K monitor soon and I did some research: HDMI 1.4 runs 4K at a 24 Hz refresh rate, whereas HDMI 2.0 runs around 50-60, with better colour depth too.
So the obvious choice is to get a better graphics card? Here's my question: early in the thread I said I only have a PCIe 2.0 slot, which works with my PCIe 3.0 GeForce. What I want to know is, will the PCIe 2.0 slot act like a bottleneck for 4K output if I install a top-of-the-range card? I know my PCIe 2.0 slot doesn't juice my GeForce 750 to the max, but will it provide enough bandwidth to run UHD on my TV?
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23644280 - 09/14/16 10:20 AM (7 years, 4 months ago) |
|
|
This is what I've gathered so far:
HDMI 1.0 was designed for High Definition TV
HDMI 1.4 was designed for 4K / UHD (24 FPS)
HDMI 2.0 was designed to accommodate 4K high-FPS gaming (50-60 FPS)
At the end of the day DVDs and Blu-rays are 24 FPS, so an HDMI 1.4 graphics card should be enough for now
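A back-of-the-envelope check on those HDMI numbers (raw pixel data only, ignoring blanking intervals and audio, so real link requirements are somewhat higher; the per-version data rates are the usual spec figures):

```python
# Raw pixel-data rate for 4K video vs per-version HDMI data-rate ceilings.
# HDMI 1.4 carries ~8.16 Gbit/s of video data (10.2 Gbit/s TMDS, 8b/10b);
# HDMI 2.0 carries ~14.4 Gbit/s (18 Gbit/s TMDS).

def pixel_rate_gbps(width, height, bits_per_pixel, fps):
    """Raw pixel-data rate in Gbit/s, blanking and audio excluded."""
    return width * height * bits_per_pixel * fps / 1e9

uhd_24 = pixel_rate_gbps(3840, 2160, 24, 24)  # ~4.78 Gbit/s
uhd_60 = pixel_rate_gbps(3840, 2160, 24, 60)  # ~11.94 Gbit/s

print(f"4K @ 24 FPS: {uhd_24:.2f} Gbit/s (fits within HDMI 1.4's ~8.16)")
print(f"4K @ 60 FPS: {uhd_60:.2f} Gbit/s (needs HDMI 2.0's ~14.4)")
```

Which matches the summary: 4K at film frame rates squeezes through HDMI 1.4, but 4K at 50-60 FPS needs HDMI 2.0.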
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23644286 - 09/14/16 10:25 AM (7 years, 4 months ago) |
|
|
Another question pops up
Will my GeForce 750 in a PCIe 2.0 slot be enough to pump HDMI 1.4's data rate to my TV, since the figures I found (10 Gbps) were measured on PCIe 3.0?
Edited by desant (09/14/16 01:33 PM)
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23644293 - 09/14/16 10:32 AM (7 years, 4 months ago) |
|
|
So come on guys, do I need to buy a whole new PC, or will a new HDMI 2.0b graphics card plugged into PCIe 2.0 do the trick?
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23644578 - 09/14/16 12:57 PM (7 years, 4 months ago) |
|
|
Basically my GeForce is designed for 1.4 UHD desktop and cinema use, but it's built for PCIe 3.0 and I've got it stuck in a PCIe 2.0 slot
I hope it works
--------------------
Edited by desant (09/14/16 01:28 PM)
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23644724 - 09/14/16 02:02 PM (7 years, 4 months ago) |
|
|
Quote:
desant said: Yo dudes
Just found out my GeForce 750 (which I purchased a few months ago to solve the heat issue with my flight sim) has an oldish HDMI 1.4 port!
I'll be getting a 4K monitor soon and I did some research: HDMI 1.4 runs 4K at a 24 Hz refresh rate, whereas HDMI 2.0 runs around 50-60, with better colour depth too.
So the obvious choice is to get a better graphics card? Here's my question: early in the thread I said I only have a PCIe 2.0 slot, which works with my PCIe 3.0 GeForce. What I want to know is, will the PCIe 2.0 slot act like a bottleneck for 4K output if I install a top-of-the-range card? I know my PCIe 2.0 slot doesn't juice my GeForce 750 to the max, but will it provide enough bandwidth to run UHD on my TV?
Yes.
The PCIe 2.0 bandwidth is not the issue; it has more than enough. As I mentioned earlier in this thread (I think?), PCIe 3.0 is mainly just for future proofing; modern cards still aren't even maxing out the bandwidth on PCIe 2.0, last I checked.
Your only major concern for a bottleneck with a better graphics card would be the CPU. If it's an older CPU, it may not be able to keep up with a high-end modern card. What CPU do you have again?
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23644786 - 09/14/16 02:26 PM (7 years, 4 months ago) |
|
|
My CPU is an i7
OK, so you're saying my GeForce should pull me through via PCIe 2.0?
But my GeForce is still HDMI 1.4, i.e. not optimised for 4K gaming because of the lower FPS, and like you said it's only got 2 GB of RAM and you were saying you need 8?
Should I buy a more advanced card like a GeForce 1080 with 8 GB of memory? That's what I would have done from the start IF I HAD PCIe 3.0!!! I would simply upgrade my card and have 3-4 more years of stable graphics/gaming performance
But yeah, my GeForce is HDMI 1.4 whereas newer, more expensive cards are HDMI 2.0b
Anyhow, thanks for the help bud
PS. When I bought the GeForce 750 a few months ago I didn't do my research. I thought all HDMI was the same, and my buddy said so too, but I should have bought an HDMI 2.0b card, meh
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23644979 - 09/14/16 03:26 PM (7 years, 4 months ago) |
|
|
I wonder if I should get a 1080...
I wonder how much that card's performance would be affected by the lesser PCIe slot...
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23646858 - 09/15/16 08:42 AM (7 years, 4 months ago) |
|
|
Woop tee do....
Just had a chat with the Nvidia tech guys and they said there aren't gonna be any performance issues between PCIe 2.0 and PCIe 3.0
Gonna get an HDMI 2.0b card now
I wish I knew this sooner.... Spent £110 on a GTX 750 with HDMI 1.4
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23652983 - 09/17/16 12:13 PM (7 years, 4 months ago) |
|
|
What i7?
You could get a 1080 if you want.. I don't suspect you'd see a bottleneck with an i7 unless it's one of the earliest first-gen ones, and if there is one it would probably be minimal.
You can always get a high-end GPU now and upgrade the CPU/motherboard later down the line if it's no longer enough.
Yeah, you're not gonna see any performance difference with a PCIe 3.0 card in a 2.0 slot. That may change in a few years as GPUs become more and more powerful, but right now it's not an issue. And yeah, for 4K gaming you definitely need a lot of horsepower in the GPU department, and lots of VRAM is ideal. Especially if you want to do surround gaming at some point.
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23657924 - 09/19/16 06:22 AM (7 years, 4 months ago) |
|
|
I believe my i7 is a 950
First or second gen; still pretty good
Thanks for help dude
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23657927 - 09/19/16 06:26 AM (7 years, 4 months ago) |
|
|
While we're on the subject, I noticed all the new-generation graphics cards have "DisplayPort"
A technology that's been around for years, apparently.
What do you know about it?
I hear it's used for multiple monitors, and its UHD refresh rate is 120 Hz compared to the 60 Hz of HDMI 2.0b
Also, the 4K TV I'm getting to use with my rig doesn't have a DisplayPort
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23658185 - 09/19/16 09:00 AM (7 years, 4 months ago) |
|
|
Ohh, and if I have to go the total-upgrade way I'd have to buy not just a new card but a new mobo, a new CPU AND new DDR4 memory (cos my current memory is DDR3)
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23667067 - 09/22/16 03:03 AM (7 years, 4 months ago) |
|
|
Correct, if you upgrade to a current-gen platform.
DisplayPort has been around for a while now, and it's what you want to use for a 4K monitor.
Don't get a TV for gaming.. in general.. they tend to have high input lag.
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23667504 - 09/22/16 09:25 AM (7 years, 4 months ago) |
|
|
Quote:
Shroomism said:
Don't get a TV for gaming.. in general.. they tend to have high input lag.
You think I haven't thought this through?
Lag is mostly common with LG TVs..... Expensive ones are fine
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23667509 - 09/22/16 09:26 AM (7 years, 4 months ago) |
|
|
Quote:
Shroomism said:
Displayport has been around for a while now, and is what you want to use for a 4k monitor
Are there 4K PC monitors around?? I thought 4K was only a TV standard
--------------------
Edited by desant (09/22/16 12:23 PM)
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23670003 - 09/23/16 03:38 AM (7 years, 4 months ago) |
|
|
There are no DisplayPorts on 4K TVs
So come on, a 4K computer monitor????????????? Dude, even 1080p on a PC monitor is plenty. Seriously, look at a 60-inch HD TV: it looks cool, and you do need a 60-inch screen to fully appreciate high definition. But 4K on, say, a 26-inch PC monitor???????? Dude, that's just wrong; you'd need a 60+ inch monitor to fully appreciate 4K
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23685480 - 09/28/16 05:46 AM (7 years, 4 months ago) |
|
|
HDMI or DisplayPort. Both will be on any good monitor. DisplayPort supports more data. You will only find it on expensive TVs.
Dude, do whatever you want. But I'm telling you, most TVs have a high level of input lag, which is not good for most PC gaming, and many have issues with ghosting/motion blur which you will not get with a monitor. A good monitor will be of higher quality than a good TV and have a higher pixel density. Response times, refresh rates, input lag, resolution options... all better on monitors. Also... G-Sync.
I don't think you're right that you need a 60-inch screen to appreciate high definition. I can appreciate high definition just fine on a huge 34" 4K monitor 2 feet away.
If you shop around you can find a TV with low input lag.. and while it may be satisfactory for most games, and a TV may be great for a nice huge screen, have fun reading text and little UI fragments in the corner of the screen, not to mention all the above. Also, many are capped at 60 Hz. All HDTVs have video processing which adds lag unless the set has a "Game mode" or "PC mode".
Unless you buy a really good TV with a low level of input lag. But even then, a "low" level of input lag on a TV in a best-case scenario is like ~25 ms from what I've seen (but much more commonly ~50-120+ ms), whereas on a gaming monitor it's more like 8 ms. 25 ms is "manageable" and may not even be noticeable for casual gaming.. but it will make a difference in really twitchy games.
A TV is fine for casual gaming, but for any serious gaming a monitor will always outperform it. And they suck for reading text IME. HDTVs HAVE come a long way in the past few years.. it used to be a LOT worse.. Not saying it's absolutely terrible, but I've tried it several times with several different TVs, and while it's novel at first, it wears off fast and the cons start to show their head. I will always prefer a good monitor for PC gaming.
I mean, do what you like; I'm just warning you of possible drawbacks for PC gaming. If you do get a TV, make sure you do your homework and get one with low input lag, which isn't always easy to find because TV manufacturers usually don't publish that data. And make sure it supports 4:4:4 chroma.
Also not sure where you get 26".. lots of 4K monitors are more like 34".
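To put those input-lag figures in perspective, here's a rough conversion to frames at 60 Hz (the millisecond numbers are the ballpark figures from this thread, not measurements):

```python
# Convert display input lag (ms) into frames at a given refresh rate.
# The ms values below are the rough figures quoted in this thread.

def lag_in_frames(lag_ms, refresh_hz=60):
    """Input lag expressed as a count of frames at the given refresh rate."""
    frame_time_ms = 1000 / refresh_hz
    return lag_ms / frame_time_ms

for label, ms in [("gaming monitor", 8), ("best-case TV", 25), ("typical TV", 75)]:
    print(f"{label}: {ms} ms ~ {lag_in_frames(ms):.2f} frames at 60 Hz")
```

So a best-case TV is already a frame and a half behind at 60 Hz, and a typical one is several frames behind, which is why it matters for twitchy games.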
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23685534 - 09/28/16 06:31 AM (7 years, 4 months ago) |
|
|
Just checked the specs of the TV I'm getting: game mode, 8 ms!
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23688680 - 09/28/16 10:05 PM (7 years, 3 months ago) |
|
|
Damn, that's pretty good. I didn't know they even made TVs with input lag that low now.. last I checked the BEST ones on the market were around 15-20 ms.. let me guess, Samsung or Sony or Panasonic?
There are other things to be aware of too... but input lag is going to be your biggest worry with an HDTV. If that's good and it has good reviews for gaming, then go for it I guess, if that's what you want.
I'd much sooner steer you towards a badass 4K gaming monitor like the Dell P2715Q or the BenQ BL3201PH IPS or the curved 34" Samsung S34E790C, which is fucken amazing.. but whatever floats your boat.
Also be aware that Adaptive Sync will NOT function on a TV. Like.... at all. Nor G-Sync... and if I were buying a top-end GPU and a 4K monitor right now I would certainly be looking for a monitor that supports G-Sync/FreeSync, since it's almost become a normal feature on a lot of 4K monitors now without paying a ton extra; the tech is out of its infancy so it's not uber expensive. You're gonna be missing out man.. just don't say nobody warned you.. a monitor is going to give you much more bang for your buck and a better overall experience IMO. Also, you are going to be limited to 60 Hz @ 4K with all TVs. Even if it says it's 120 Hz.. it's not a true 120 Hz.. it fakes the extra frames to give the illusion of it. And due to the limitations of HDMI, the max refresh rate you're going to get with a TV is 60 Hz. Want 120 FPS? Well, too bad. You can try overclocking your TV, but that's obviously at your own risk, and the most I've seen people get is ~100 Hz. Skipped frames are almost guaranteed.
TVs have come a long way for sure in the past few years.. but I still say when it comes to gaming, you can put the best TV vs the best monitor head to head and the monitor will win in pretty much every category except screen size. There's a reason pretty much every single professional PC gamer uses a monitor. If TVs were better, everyone would be using them. It's "fun" and "immersive" but there are real downsides.
If none of this bothers you in the slightest and you don't have a single fuck to give, then by all means.. charge forth! I'm just trying to help a brotha out. If you are still set on a TV then only get one of these - https://hardforum.com/threads/4k-60hz-4-4-4-hdmi-2-0-tv-database.1837209/
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23689363 - 09/29/16 06:11 AM (7 years, 3 months ago) |
|
|
Interesting vid
IMO, you don't NEED 4K resolution on a PC MONITOR!!!!! Come on, I've been playing Deus Ex and Far Cry 4 on a simple 26" HD iiyama monitor for years at max res, and I'm telling you I have no need for anything higher than HD (1920×1080) when it comes to gaming; the graphics are SUPERB as they are
--------------------
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23689371 - 09/29/16 06:15 AM (7 years, 3 months ago) |
|
|
4K TVs are quite different.
There's upscaling of Blu-rays, there's amazing full-blown soft porn on the Windows desktop. Organising bits and pieces is very easy. Doing work in my DAW is a breeze, and there's SOMETHING about proper UHD TVs that computer monitors lack - I was told the TV I'm getting is built to produce video the way film directors and producers "meant" it to look
--------------------
Edited by desant (09/29/16 06:23 AM)
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23689379 - 09/29/16 06:22 AM (7 years, 3 months ago) |
|
|
Also, Shroomism buddy, go to your good electronics retailer and have a look at their selection of TVs
About 50 different TVs in my store. Take a look at the "4K" TVs that cost under a grand....... They look good, but ONLY if you don't know better!!
Now look at the TVs that span $3000-4000....... The difference in quality is astonishing. The picture quality of those TVs makes you want to touch and lick the screen! Whilst making cheap budget TVs look like expensive calculators
--------------------
Edited by desant (09/29/16 07:02 AM)
|
Oggy
Stranger Danger


Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23694266 - 09/30/16 02:51 PM (7 years, 3 months ago) |
|
|
Not sure if your questions have been answered, but PCIe cards are backwards compatible; however, performance will suffer greatly. You will need to upgrade your motherboard to something that supports 3.0 or greater if you want to take full advantage of the hardware. The difference lies in the amount of data (bandwidth) that can be pushed to and from the card. A lot of the time this isn't an issue, because games and such don't utilize all of that potential bandwidth, usually because of bottlenecks like the CPU or RAM when pushing vertex buffers to the graphics card.
I believe the only place that will utilize it fully will be high-resolution monitors and multi-monitor setups.
[edit] Read a little bit of this thread and noticed your concerns over temperatures. 100°C on AMD hardware is nothing. Don't worry about it. If anything gets too hot, the system will shut down automagically.
--------------------
Edited by Oggy (09/30/16 02:59 PM)
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23696282 - 10/01/16 04:29 AM (7 years, 3 months ago) |
|
|
Quote:
Oggy said: Not sure if your questions have been answered, but PCIe cards are backwards compatible; however, performance will suffer greatly. You will need to upgrade your motherboard to something that supports 3.0 or greater if you want to take full advantage of the hardware. The difference lies in the amount of data (bandwidth) that can be pushed to and from the card. A lot of the time this isn't an issue, because games and such don't utilize all of that potential bandwidth, usually because of bottlenecks like the CPU or RAM when pushing vertex buffers to the graphics card.
I believe the only place that will utilize it fully will be high-resolution monitors and multi-monitor setups.
[edit] Read a little bit of this thread and noticed your concerns over temperatures. 100°C on AMD hardware is nothing. Don't worry about it. If anything gets too hot, the system will shut down automagically.
Dude, that is the exact opposite of what everyone said in this thread!
Shroomism here said it won't matter much, since cards don't utilise the full PCIe 3.0 bandwidth yet
The same thing was said to me on the Nvidia message board AND by an Nvidia technician in an online chat.
PLEASE don't spread incorrect information, this is very important
--------------------
|
Oggy
Stranger Danger


Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23697652 - 10/01/16 04:21 PM (7 years, 3 months ago) |
|
|
/me puts the gloves on
Alright, time for some learnin's.
In the modern graphics pipeline (both OpenGL and DirectX), the program pushes state changes to the GPU in the form of vertex object buffers (e.g. VBO/VAO). The obvious bottleneck here is the CPU and RAM unless the GPU is very old. Pushing state changes to an object buffer relies heavily on the CPU! Then you have the framebuffer, which contains the entire frame and possibly more, and which gets sent off to the display, where in turn it must be translated by special driver hardware in the TV or monitor.
Bandwidth is a combination of pushing the vertex buffers/state changes to the GPU, recalling the data by the program, and sending it to the monitor/TV for processing by the hardware driver. It's not really an issue at all for a GPU though, like I said. And here is why: a single 4K RGBA 8bpc frame is only around 32 MB!
Basically, PCIe 2.0 is absolutely fine for 4K streaming, even for games. It's all about future-proofing really.
It sounds like you are scolding me for saying the exact same thing everyone else has said? The part where I say performance will suffer greatly is absolutely true, because you cannot utilize the bandwidth PCIe 3.0 offers on a 2.0 board. Assuming you can even reach that bandwidth cap.
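The 32 MB figure checks out; a quick sketch of the arithmetic (framebuffer traffic only, ignoring the textures and geometry that actually dominate PCIe traffic):

```python
# Size of one uncompressed 4K RGBA frame (8 bits per channel) and the
# resulting per-second traffic at 60 fps, vs a PCIe 2.0 x16 link (~8 GB/s).

frame_bytes = 3840 * 2160 * 4             # RGBA, 1 byte per channel
frame_mib = frame_bytes / 2**20           # ~31.6 MiB
per_second_gb = frame_bytes * 60 / 1e9    # ~2.0 GB/s at 60 fps

print(f"one 4K RGBA frame: {frame_mib:.1f} MiB")
print(f"60 fps framebuffer traffic: {per_second_gb:.2f} GB/s of ~8 GB/s on PCIe 2.0 x16")
```

So even a steady 4K 60 fps framebuffer stream uses only a fraction of a PCIe 2.0 x16 link's theoretical bandwidth.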
--------------------
Edited by Oggy (10/01/16 04:29 PM)
|
desant
Pleiadian Revolutionary



Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23697894 - 10/01/16 05:59 PM (7 years, 3 months ago) |
|
|
Assuming you can reach the bandwidth cap....
But people say modern cards do not use the full bandwidth of PCIe 3.0!! So PCIe 2.0 is sufficient for now. So really it's about "future proofing" more than anything else!!
I wonder what Shroomism will say
--------------------
|
Shroomism
Space Travellin



Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23698246 - 10/01/16 07:57 PM (7 years, 3 months ago) |
|
|
Quote:
Oggy said: /me puts the gloves on
Alright, time for some learnin's.
In the modern graphics pipeline (both OpenGL and DirectX), the program pushes state changes to the GPU in the form of vertex object buffers (e.g. VBO/VAO). The obvious bottleneck here is the CPU and RAM unless the GPU is very old. Pushing state changes to an object buffer relies heavily on the CPU! Then you have the framebuffer, which contains the entire frame and possibly more, and which gets sent off to the display, where in turn it must be translated by special driver hardware in the TV or monitor.
Bandwidth is a combination of pushing the vertex buffers/state changes to the GPU, recalling the data by the program, and sending it to the monitor/TV for processing by the hardware driver. It's not really an issue at all for a GPU though, like I said. And here is why: a single 4K RGBA 8bpc frame is only around 32 MB!
Basically, PCIe 2.0 is absolutely fine for 4K streaming, even for games. It's all about future-proofing really.
It sounds like you are scolding me for saying the exact same thing everyone else has said? The part where I say performance will suffer greatly is absolutely true, because you cannot utilize the bandwidth PCIe 3.0 offers on a 2.0 board. Assuming you can even reach that bandwidth cap.
The point is that almost NOTHING right now uses up all the bandwidth on PCI-e 2.0 anyway, not games anyway. Maybe a handful of specialized programs.. Thus no bottleneck, since the data pushes through 2.0 just fine.. In all the tests I have seen of a PCI-e 3.0 GPU on a PCI-e 2.0 slot vs a 3.0 slot, the difference is like 0-4%, within the statistical margin of error and not even noticeable.. certainly not a major bottleneck by any stretch.
Yes it's more for future proofing but as it stands now there really isn't any noticeable loss if you use a 3.0 card in a 2.0 slot for 99% of applications. Will that likely change in the future as games and stuff evolve and become even more uber powerful? Almost without a doubt. But right now not really an issue.
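The headline numbers behind the "double the bandwidth" claim fall out of the per-lane signalling rate and the line-code overhead. A rough per-direction sketch (spec-sheet values; `lane_gb_per_s` is just a hypothetical helper):

```python
# Usable per-lane PCIe bandwidth in GB/s:
# transfers/s (GT/s) * encoding efficiency / 8 bits per byte.
def lane_gb_per_s(gigatransfers, encoding_efficiency):
    return gigatransfers * encoding_efficiency / 8

pcie2_x16 = 16 * lane_gb_per_s(5.0, 8 / 10)     # PCIe 2.0: 8b/10b line code
pcie3_x16 = 16 * lane_gb_per_s(8.0, 128 / 130)  # PCIe 3.0: leaner 128b/130b

print(f"PCIe 2.0 x16: {pcie2_x16:.2f} GB/s")  # 8.00 GB/s
print(f"PCIe 3.0 x16: {pcie3_x16:.2f} GB/s")  # 15.75 GB/s
```

The signalling rate only rises from 5 to 8 GT/s, but 128b/130b encoding wastes about 1.5% instead of 8b/10b's 20%, which is how the usable bandwidth roughly doubles.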
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23698258 - 10/01/16 08:02 PM (7 years, 3 months ago) |
|
|
I hope u right
--------------------
|
Oggy
Stranger Danger
Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23698559 - 10/01/16 10:10 PM (7 years, 3 months ago) |
|
|
That's exactly what I was saying. Though, bottlenecks are definitely real issues elsewhere in hardware and they can carry over to the GPU as I stated in my previous post, hence the term bottleneck.
https://en.wikipedia.org/wiki/Bottleneck_(engineering)
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23699449 - 10/02/16 08:28 AM (7 years, 3 months ago) |
|
|
Quote:
Oggy said: That's exactly what I was saying.
Ahh no, u said "performance will suffer greatly".
--------------------
|
Oggy
Stranger Danger
Registered: 12/05/14
Posts: 1,276
Loc: Planet Remulak
Last seen: 6 months, 29 days
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23700161 - 10/02/16 01:03 PM (7 years, 3 months ago) |
|
|
Wow
--------------------
|
Shroomism
Space Travellin
Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Oggy]
#23700431 - 10/02/16 02:18 PM (7 years, 3 months ago) |
|
|
Well... you did
Quote:
Oggy said: Not sure if your questions have been answered but PCIe cards are backwards compatible however performance will suffer greatly.
It's all semantics at this point lol.. I understand your point but I think you can see where the confusion came from. I think you maybe just didn't word it right.. I was confused by that statement too at first. In the future, performance probably will suffer greatly. Like using a GTX 1080 on a Pentium 4 or something, that CPU will be the bottleneck of bottlenecks. But right now, PCI-e 2.0 isn't really a bottleneck.
I worked for one of the major GPU companies for a while.. don't think I've ever seen a PCI-e slot be a major bottleneck... the CPU will generally be the bottleneck long before that. It was the same thing back when PCI-e 1.0 was the standard and 2.0 cards started becoming popular. We'd have people asking all the time if it would work.. yes it's backwards compatible.. no you probably won't see a major bottleneck, and if you do.. it's usually the CPU that can't keep up with the GPU. Now with modern systems it's usually the mechanical HDD that is the major bottleneck, if that's your primary drive.
However if I was building a system right now I would most certainly get a 3.0 mobo for future-proofing. Who knows how long before it starts to become a big issue. But games/software and hardware are becoming more and more powerful every day. So it's a good idea.
--------------------
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#23703954 - 10/03/16 02:50 PM (7 years, 3 months ago) |
|
|
Dunno how this thread is still alive, but everything Shroomism has said is true. If you're experiencing a bottleneck, your biggest issue from what I recall of your hardware is that you have a first gen i7...... The second gen Sandy Bridge was like an enormous 33% jump in performance.... Now Sandy is showing its age a little, but it's actually not all that bad, just no longer in the top tier... AMD doesn't even have anything close to an Ivy Bridge processor, and that's around the time PCIe 3.0 motherboards started hitting the market.... The reality is if you upgrade to a modern day processor with any decent board and a mid range nvidia card you will have PCIe 3.0.... Really the only future proofing as far as I'm concerned is being able to swap in a faster card if you own an i5 or i7 rig right now. Get a nice PSU that can handle a power hungry card, so in case you want to go for a dope ass graphics card some day down the line it's as easy as taking your old one out and popping the new one in.
Don't even think about 4k gaming, fuck, it takes a reasonable amount of juice just to get QHD (1440p) gaming running smooth as butter.... My GTX 680 can't do it on higher end games without usually knocking down a setting or two, but still plays most games at 1080p smooth as butter with all the knobs turned up....... That's just the reality of pushing roughly 78% more pixels to get to 1440p; pushing even more pixels to get to 4k is just silly imho..... unless you're a seriously rich bitch, but even then I'd rather just have smooth as butter performance.... I will upgrade when a game I want to devote endless hours to comes out... Right now I'm kind of looking at Obduction, made by the makers of Myst and Riven, looks like a pretty fun world to explore with lots of eye candy and some oldschool puzzle solving.
Oh, and the expensive TVs you are likely thinking of are OLED, those fuckers pop like nothing I've ever seen before. I'm sure there's a 4k OLED TV by Samsung by now, but meh, I'd love to get my hands on a 1080p OLED TV without breaking the bank.
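For scale, the raw pixel counts behind that comparison (a quick sketch; the jump from 1080p to 1440p works out to about 78% more pixels, and 4k to four times as many):

```python
# Pixels pushed per frame at common gaming resolutions.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} px ({count / pixels['1080p']:.2f}x 1080p)")
# 1080p: 2,073,600 px (1.00x 1080p)
# 1440p: 3,686,400 px (1.78x 1080p)
# 4K: 8,294,400 px (4.00x 1080p)
```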
-------------------- Everything is better than it was the last time. I'm good. If we could look into each others hearts, and understand the unique challenges each of us faces, I think we would treat each other much more gently, with more love, patience, tolerance, and care. It takes a lot of courage to go out there and radiate your essence. I know you scared, you should ask us if we scared too. If you was there, and we just knew you cared too.
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23705724 - 10/04/16 04:20 AM (7 years, 3 months ago) |
|
|
What do u mean "those fuckers pop like nothing"?
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23705831 - 10/04/16 06:28 AM (7 years, 3 months ago) |
|
|
This is the TV I like: http://www.trustedreviews.com/panasonic-tx-65dx902-review
http://www.panasonic.com/uk/consumer/viera-televisions/led/tx-65dx902b.html
As u can see it is not an OLED TV but something called "honeycomb" backlight technology ....
It might not be OLED but they say it's the best TV around
Of course I haven't seen it IRL yet but I'm going to drive to the nearest location to check it out.
--------------------
Edited by desant (10/04/16 06:35 AM)
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23705860 - 10/04/16 06:55 AM (7 years, 3 months ago) |
|
|
black quality, contrast, color accuracy, motion resolution (sharpness), off-axis performance, screen uniformity..... HDR/WCG performance, overall day (high ambient light), overall night (low ambient light)
I think LG probably makes the best TV on the market atm..... check out the LG OLED65G6P
pricey motherfucker...
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23705942 - 10/04/16 07:50 AM (7 years, 3 months ago) |
|
|
I had a 60" LG plasma TV 2 years ago. Out of all the market leaders (Sony, Panasonic, Samsung, LG), LG I hear has the worst input lag. Also there are unconfirmed reports of "cancer bots" installed in them, and they have a history of even messing with your WiFi and shit
--------------------
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23705971 - 10/04/16 08:05 AM (7 years, 3 months ago) |
|
|
I meant sheer picture quality, anyways every brand makes really high quality tvs and really poor tvs, it's all about the model..... some brands may have better customer support when it comes to warranties than others, that's something to factor in..... Input lag is going to be determined largely by the technology of the display, for example IPS displays generally have worse input lag than TN displays, but again the screens pop..... A competitive gamer should just get a TN display...... I'm sure LG has had a bad batch of IPS displays that maybe had some image persistence, perhaps that's what you're talking about with cancer blots..... that was fixed in a second revision or so.... It's always a good reason not to be on the bleeding edge of tech: you pay to beta test somebody else's shit....... Second or third gen and the problems are addressed. But in terms of screen quality, you have to factor in that LG & Samsung are making epic fuck tons of screens for laptops and monitors of all brands... It's not like Dell manufactures the screens in their Ultrasharps, they use LG panels... There's also just backlight bleed and IPS glow and shit to deal with that can look pretty bad, maybe that's what you meant by blot.... occasionally a cluster of dead pixels might get by. Sometimes you have to just complain and have them replace your display until you get a "good" one, it's really a luck of the draw kind of thing.
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23706182 - 10/04/16 09:38 AM (7 years, 3 months ago) |
|
|
LG might be making computer monitors but that doesn't mean their TVs are up to scratch
But I agree with ya that both brands have good TVs and bad ones
Me and my buddy here agree upon this:
Best TV brands
1. Panasonic 2. Sony* 3. Samsung 4. LG
Now like I mentioned, the dude at the store said Sony are the best BECAUSE all movie making is done on Sony hardware - cameras, sound, post processing and mastering.... it's all Sony, apparently, so logic says Sony TVs will fit right in and outperform the others
--------------------
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23706242 - 10/04/16 09:57 AM (7 years, 3 months ago) |
|
|
I'm just talking about pure picture quality here, for watching films.... None of the Smart TV features (I mean get a HTPC or something) or input lag for gaming was on my mind.... OLED TVs look fucking amazing, you just gotta go see one in person, especially if you can see the model I mentioned...... Anyways I pay attention to the AVS forums, and the Value Electronics TV Shootout people are some serious fucking TV snobs, so I really think it's worth paying attention to; for three consecutive years OLED has taken the crown....... I just mean look at one properly calibrated in person and you be the judge is all I'm saying. Panasonics were way better years ago when plasma brought the best contrast in particular, I mean they had nasty black levels that an LCD screen just couldn't compete with....
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23706253 - 10/04/16 10:01 AM (7 years, 3 months ago) |
|
|
Gonna go to the store tomorrow and see what OLED they got
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23715232 - 10/07/16 06:16 AM (7 years, 3 months ago) |
|
|
Ohhhh boy
Just got back from the store. The only OLED they got is an LG for £4,000
The staff at the store told me that even tho they don't have my Panasonic flagship model in stock, they have a smaller Panasonic which THEY say is identical to mine.... and they say it ain't that great, but they're not right, mine got better technology - HDR and honeycomb lighting which they don't know anything about, they're just talking trash.
So yea, I can't find my Panasonic in any of the shops here. Which means either I go in blind and buy it online, or listen to the shop keepers and get a Sony which they call "the best"
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 *DELETED* [Re: desant]
#23715269 - 10/07/16 06:37 AM (7 years, 3 months ago) |
|
|
Post deleted by desant. Reason for deletion: K
--------------------
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23715282 - 10/07/16 06:44 AM (7 years, 3 months ago) |
|
|
I don't think best buy really calibrates their TVs, but what did you think of how the OLED tv actually looked? Either way it's more money than I'd want to spend on a TV atm... Anyways I'm not a twitch console gamer or anything so I can't make recommendations on good looking screens with low input lag, that may be contradictory.... That may not actually be that bad for a big screen TV. Obviously not a gaming monitor though....
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23715312 - 10/07/16 06:54 AM (7 years, 3 months ago) |
|
|
Quote:
CosmicJoke said: what did you think of how the OLED tv actually looked? Either way it's more money than I'd want to spend on a TV
Yea it looks alright but like u said it ain't worth that much money
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23715357 - 10/07/16 07:10 AM (7 years, 3 months ago) |
|
|
I think I'm gonna go off the deep end and get my Panasonic
Guys in the store say it's shit but they don't know anything about it and don't even have it in stock!
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23715385 - 10/07/16 07:16 AM (7 years, 3 months ago) |
|
|
The guy at the store also said something about Panasonic and Firefox.... I did a search and apparently Firefox is history for TVs, dunno what it means if I buy a TV with Firefox
--------------------
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23715391 - 10/07/16 07:17 AM (7 years, 3 months ago) |
|
|
Last Panasonic I've seen in the Value Electronics TV Shootout
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23715427 - 10/07/16 07:32 AM (7 years, 3 months ago) |
|
|
That's the thing! There are many different models! They got a Panasonic in the store! But it ain't the one I'm after!
The one in store looks a bit shit, I must admit, I wouldn't buy it, but the one I'm after is top of the range!!! Ppl honestly say it's the best image quality u can get atm
--------------------
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23715551 - 10/07/16 08:17 AM (7 years, 3 months ago) |
|
|
Dunno about gaming, but am pretty sure all those TVs I just listed blow the normal TV out of the water in terms of watching films. The Panny is only a 6.8 because these people are serious TV snobs... Anyways, rather than asking the shroomery you can get way more info off of www.avsforum.com, and if you do podcasts Scott Wilkinson knows his shit, the show is called Home Theater Geeks.
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23715605 - 10/07/16 08:39 AM (7 years, 3 months ago) |
|
|
Cool, I'll do that
--------------------
|
CosmicJoke
happy mutant
Registered: 04/05/00
Posts: 10,848
Loc: Portland, OR
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#23715618 - 10/07/16 08:47 AM (7 years, 3 months ago) |
|
|
Yah I'd suspect there's gonna be a little trade off between the input lag you want and as vivid a picture as you want, probably have to find a middle ground, but it will still be pretty sexy...... If you have a couple thousand dollars or so to throw around on a TV and have done some homework, you're probably not gonna regret it..... And one thing you got right, best buy never knows what they're talking about.
--------------------
|
Shroomism
Space Travellin
Registered: 02/13/00
Posts: 66,015
Loc: 9th Dimension
|
Re: graphics card pcie 3.0 vs 2.0 [Re: CosmicJoke]
#23718906 - 10/08/16 10:03 AM (7 years, 3 months ago) |
|
|
Quote:
CosmicJoke said: best buy never knows what they're talking about.
Truer words have never been spoken
--------------------
|
desant
Pleiadian Revolutionary
Registered: 03/31/09
Posts: 7,038
Loc: Aether
Last seen: 6 years, 8 months
|
Re: graphics card pcie 3.0 vs 2.0 [Re: Shroomism]
#24072061 - 02/07/17 10:19 AM (6 years, 11 months ago) |
|
|
..
Edited by desant (02/08/17 03:19 AM)
|
Byrain
Registered: 01/07/10
Posts: 9,664
|
Re: graphics card pcie 3.0 vs 2.0 [Re: desant]
#24072261 - 02/07/17 11:58 AM (6 years, 11 months ago) |
|
|
Your GPU is like a blunt instrument; it won't do precise (e.g. double-precision) calculations nearly as well as a good CPU.
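One way to picture that (an illustrative CPU-side sketch, not GPU code): consumer GPUs do most graphics work in 32-bit floats, which carry only a 24-bit significand, while CPUs routinely work in 64-bit doubles. Emulating the 32-bit rounding with Python's `struct`:

```python
import struct

def to_f32(x: float) -> float:
    """Round a 64-bit Python float to the nearest IEEE 754 32-bit float."""
    return struct.unpack("f", struct.pack("f", x))[0]

# 2**24 is the last point where every integer fits in a 32-bit float:
print(to_f32(16_777_216.0))  # 16777216.0  (2**24, exact)
print(to_f32(16_777_217.0))  # 16777216.0  (2**24 + 1 rounds back down)
print(to_f32(0.1))           # 0.10000000149011612 (vs the 64-bit 0.1)
```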
|
|