Raytracing? RTX/2080/deep learning support?!?
" what is source? in the link you posted, lol. thats where i got that quote from. also just watch some legitimate reviews and stuff that isn't sponsored by nvidia and you'll get your answer as to what these cards are. its all over youtube about how much of a sham these things are. https://www.youtube.com/channel/UC4Z8mPYjn6Dhr6n531YDh0Q/videos theres a good number of people talking about it other than this, but i don't have time to link everything about it. just get watching. jaytwocents just released a video explaining why ppl only get 30 fps in tomb raider with RTX on. he says the card is just fine, but its because RTX is on that the fps gets tanked. it takes a lot for that feature etc. but this is exactly the issue. why would you pay $1200 for a card that is featuring RTX and its completely built around it, when you would end up just turning the feature off for higher frames and/or higher resolutions? it is literal shit. i'd definitely never preorder these cards, and really wait to see how the market reacts to it and what benchmarks really end up doing. see if RTX ends up being developed, see what the true benchmarks are and in the meantime the price will drop etc. as to the article you're quoting, keep in mind a few things. #1 it says 50-57 fps (this is probably an average). this is still quite low because it was 1080p its being played in with ray tracing enabled. it doesn't say what the max or lowest frame rates were, but other sources do, to which its stated 30-70 fps....again, in 1080p. now the article goes on to say it gets 60 fps in 4k on other games. but keep in mind this is WITHOUT ray tracing on, its just the cards raw power. is this enough improvement over 1080 ti/titans etc to justify its price point? definitely not for me! lol. |
" In that article is also stated from developers: "The Nvidia Ray Tracing technology currently being shown in Shadow of the Tomb Raider is an early work in progress version. As a result, different areas of the game have received different levels of polish while we work toward complete implementation of this new technology." Those 50-57 fps is not said if 4k or 1080p and its how its situated in article is assuming its 4k, nevertheless even if its in 1080 its still fantastic number. Ray tracing, with speed of how technology is growing, would be here no earlier than in ten years. New Tomb Rider games are technology hungry and even on 1080 gpu on 1080p is 50 fps fairly normal. In 4k on 1080ti 50 fps is average. Lets see how it will be in the end with Tomb Rider, but I have no doubt this technology is a future. Those who like technology and have pc gaming as hobby, money are not really issue. Its same with every hobby .). ps: btw you have spent a lot of money on this game, you have money no doubt, so why so salty about new technology and price tag creators asking for it? Dernière édition par Rexeos#3429, le 23 août 2018 à 14:34:33
Honestly, paying a premium to beta test mostly unused hardware is never good value. If the tech is good enough, the second generation will refine it for cheaper and there will be more products that actually use it that you might be interested in... and if it isn't and the tech dries up, you dodged a bullet.
The idea behind ray tracing and the tech demos they showed is promising, but time will tell.
Wait for reviews.
That said, the only important aspects of a PoE machine I've found are a highly clocked Intel CPU and an SSD. Nothing else really matters. The game engine can pump out some insane fps numbers, but the server cannot keep up, so there's almost no point in pushing for the very high-end GPUs. A single 1080 Ti can sometimes get over 900 fps at 4K resolution (DX11, multithreading, everything maxed), and sometimes slows down to sub-20 fps in intense situations. When that happens, neither CPU nor GPU is at 100% load; rather, the server cannot keep up, so it throttles the game. At least that's what I'm seeing: 8700K with a 1080 Ti and a 4K screen, 32 ms average ping. If I cap my fps at 60, neither CPU nor GPU is stressed that much, and the game still dips below 60 now and then. So: a 2080 or 2080 Ti is probably going to be lots of fun elsewhere, but not much use for PoE.
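The fps-cap idea above can be sketched as a simple frame limiter: finish the frame early, then sleep off the rest of the frame budget, which is why neither the CPU nor the GPU is stressed when capped. This is a generic illustration, not PoE's actual engine code; `run_frame_limited` and its parameters are made up for the example:

```python
import time

def run_frame_limited(render_frame, target_fps=60.0, num_frames=10):
    """Run render_frame in a loop, sleeping so we never exceed target_fps.

    Capping like this leaves the CPU/GPU idle for the rest of each
    frame budget instead of racing ahead of what the server can deliver.
    Returns the total wall-clock time spent.
    """
    frame_budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render_frame()                      # hypothetical per-frame work
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:          # finished early: sleep off the rest
            time.sleep(frame_budget - elapsed)
    return time.perf_counter() - start

# A trivial "frame" that takes almost no time; with a 60 fps cap,
# 10 frames should take roughly 10/60 of a second rather than finishing
# instantly, because the limiter sleeps away the unused budget.
total = run_frame_limited(lambda: None, target_fps=60.0, num_frames=10)
```

A real engine would render in the loop body and usually rely on v-sync or a driver-level limiter instead of `time.sleep`, but the budget-and-sleep structure is the same idea.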
" ya in that article specifically it doesn't state if the 50-57 fps is 4k or not, but its assumed it is 1080p based off every other review. this is what is fishy about the whole damn cycle about it. its like they're lying by omission. they want you to assume its 4k 50-57 fps, when, again, by other reviews its clear it CANNOT be. i actually highly doubt RTX is the future. the differences are fairly minor, and as soon as anyone dips below 60 fps the first thing you do is turn off shadows because its always the most demanding feature, and RTX essentially just makes it so your shadows and similar things are slightly more pretty. " i've spent the amount of money i have on this game over the course of 5 years, not in one lump sum. so thats factor #1. it isn't that i have money, its that i make money. i don't mind spending it on PoE because its always one of the one-two games im ever playing at one time. i don't have time or money to buy every new AAA title and beat it, etc. the last newest game i've purchased was dark souls 3 and i still only have i think two playthroughs on it. im not salty about the price tag. im not buying the card nor would i if it was only $200. but for what it offers to what its priced at is pretty ridiculous right now. but who am i to judge, they can price it at whatever they want and as long as people buy it i guess it isn't ridiculous. it honestly doesn't matter to me, but i'd prefer people not get sucked in and suckered by Nvidia's shady claims, dealings, and price points right now. it seems like im not alone in that as many others are warning about it and almost everything i've watched/read says for noone to preorder these things. yet they're still sold out everywhere. it kinda seems like nvidia could market a dead cat able to run your 900' 14k TV at 400 fps, charge $50,000 and slap your mother and people would still jump at the chance. Dernière édition par xMustard#3403, le 23 août 2018 à 16:29:44
Summing it up so far:
- PoE's engine is old; ray tracing is not likely a priority.
- Shady benchmarks: what resolution? What frame rate? Which drivers? Is the game optimized? Etc.
- High early-adoption fee.
- Wildcard: will this AI rendering be a flop, or will it be astounding? There is no way to predict the effect of releasing a million AIs working in unison (or not?) to improve game rendering across the world.
- 1080 Ti FTW thus far.
RTX and DLSS are both proprietary Nvidia technologies.
Not only are they new and unproven, you also need to spend $$$ porting your game engine to DirectX 12 / DXR (for ray tracing) and to whatever Nvidia's code path is for DLSS. That also means AMD/Intel/non-RTX card users (the majority) won't benefit from the new technology you just spent $$$ on. Considering GGG just spent $$$ porting their engine to DX11 and consoles and are actively updating it with new graphical features, I don't see them porting it to DXR any time soon just so the few "enthusiasts" who bought into Nvidia's proprietary, exclusive tech this quickly can see better shadows.

P.S. A 2080 might be 50% faster than a 1080 in rasterization, but a 2080 won't be more than 15% faster than a 1080 Ti.
Last edited by Jeffdee#3037 on 24 Aug 2018 at 03:35:46
Jeez, people believing all that Nvidia PR bullshit makes me facepalm so hard. Yeah, sure, 100% or even 200% raw power gain :D going from a 16nm to a 12nm process — that makes sense, right? I also wonder how a game like PoE would even benefit from all that new "cool" stuff. It's meant for fake-ass FPS shooters, with no proven benefit yet (if any).
https://youtu.be/NeCPvQcKr5Q
GGG Patch Notes: "Fixed an unwanted interaction where players had fun playing the game"
Last edited by coyd#6278 on 24 Aug 2018 at 06:08:25
This would mean spending resources on 1% of the game's population. Not worth it.
GGG banning all political discussion shortly after getting acquired by China is a weird coincidence.
Blows my mind that anyone would even think they'd need such a thing just to run this outdated game.