RivalGT

Member
Dec 13, 2017
6,480
The rumor the last few days was that the 4070 was priced at $750 and only had 8GB of VRAM. That honestly never sounded realistic to me.

If the performance is there, and it has great power usage like the other 4000-series cards, then this sounds great, if you can actually get it at $600.

It won't be amazing value, but it looks like nothing this generation will offer that.
 

Spoit

Member
Oct 28, 2017
4,122
Well, not every game supports DLSS and/or FSR, and the quality of them can vary by game. In some games they look better than native, and other times worse, with issues like blurriness, flickering and such. I'm sure they also rely more on the CPU as well. I would say Quality mode is the only one really worth using for image quality and performance. Balanced can be okay in certain situations but not always worth it. The performance modes just look...ugh.

Those upscaling methods won't be a standard until nearly every game from now on supports both!
But you were talking about RT games? IIRC the only one that has RT without a reconstruction option (even if some of them are AMD-sponsored and are FSR2-only) is Elden Ring? Which has its own problems, even with a 4090.

I really hope Nvidia sees the writing on the wall and with 5000 series they just focus on efficiency and bringing cost down. Like even if next gen isn't a massive leap in performance, I think it would be much more successful if they just focus on achieving around the same performance as 4000 series at a cheaper price.
I swear, people just refuse to let that 600W rumor go. Lovelace is way more power efficient than RDNA 3, and drastically more power efficient than Ampere (which itself also chose a dumb point on the power curve for its default, and could be moved a good 50-100W down while barely affecting performance).

From that TPU article linked earlier:
watt-per-frame.png
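For anyone unfamiliar with how that TPU chart is derived: watt-per-frame is just average board power divided by average frame rate, lower being better. A quick sketch with made-up numbers (not TPU's measured data):

```python
# Watt-per-frame = average board power / average FPS (lower is better).
# Illustrative numbers only, not TPU's measured data.
cards = {
    "efficient card": {"power_w": 200, "fps": 100},
    "hungry card": {"power_w": 320, "fps": 128},
}
for name, c in cards.items():
    print(f"{name}: {c['power_w'] / c['fps']:.2f} W per frame")  # 2.00 and 2.50
```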
 

Timu

Member
Oct 25, 2017
15,850
But you were talking about RT games? IIRC the only one that has RT without a reconstruction option (even if some of them are AMD-sponsored and are FSR2-only) is Elden Ring? Which has its own problems, even with a 4090.
Both RT and non-RT games. COD MW2 (2022) has DLSS and FSR but no RT, for example. But yeah, RT games normally have DLSS and FSR.
 

Deleted member 93062

Account closed at user request
Banned
Mar 4, 2021
24,767
But you were talking about RT games? IIRC the only one that has RT without a reconstruction option (even if some of them are AMD-sponsored and are FSR2-only) is Elden Ring? Which has its own problems, even with a 4090.


I swear, people just refuse to let that 600W rumor go. Lovelace is way more power efficient than RDNA 3, and drastically more power efficient than Ampere (which itself also chose a dumb point on the power curve for its default, and could be moved a good 50-100W down while barely affecting performance).

From that TPU article linked earlier:
watt-per-frame.png
I wasn't making a comment on the efficiency of Ampere. I'm saying next generation instead of pushing to the limits, I hope they just focus on cutting power and achieving the same performance as 40 series while keeping prices lower.
 
Nov 8, 2017
13,373
Why did they lower the bus from 256-bit to 192-bit?

Because a smaller bus is cheaper, and if they did 256-bit they would need to deploy either 16GB or 8GB of VRAM. The 4070 Ti has the same 192-bit bus + 12GB VRAM.

In terms of real-world performance, the dramatic cache increase between Ampere and Lovelace GPUs has meant performance saw good improvements beyond what the raw memory bandwidth figures suggest, particularly at lower resolutions.

The price:perf ratio based on empirical benchmarks is the true test, don't get too hung up on the paper specs.
 

Deleted member 7148

Oct 25, 2017
6,827
I was going to grab one of these to upgrade my 3070ti in order to bump up the VRAM but I'm going to wait for benchmarks first. I see no reason to buy a 4070ti after this.
 

dgrdsv

Member
Oct 25, 2017
12,143
Why did they lower the bus from 256-bit to 192-bit?
Because they've upped the cache size about 10 times.
And 4070 will get 21Gbps G6X instead of 14Gbps G6 giving it more external VRAM bandwidth than 3070 had with its 256 bit bus in addition to a 10X increase in cache size.
People seem to constantly miss these parts when talking about bus widths.
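To put numbers on this (per-pin rates and bus widths from the posts above; the calculation itself is just the standard peak-bandwidth formula):

```python
# Peak external VRAM bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 14))  # RTX 3070: 256-bit G6 @ 14Gbps  -> 448.0 GB/s
print(bandwidth_gb_s(192, 21))  # RTX 4070: 192-bit G6X @ 21Gbps -> 504.0 GB/s
```

So even before counting the much larger L2 cache, the narrower bus ends up with more raw external bandwidth than the 3070 had.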
 

Skyzar

Banned
Oct 27, 2017
1,539
Still not interested. Tired of these GPU as a service, upgrade every year trash configs from nvidia.

2.5x tflops of a 4090, 25x RT performance (where those 400+ RT cores at?? it's been years and we still aren't seeing the decent triple digit RT cores or anywhere near that and the RT performance shows in these super mild RT implementation games), 32GB+ VRAM with ultrafast and high bandwidth memory for $900 and I'll bite on a GPU that will last more than a year before struggling with the non-crossgen UE5 games with Lumen coming out at a half-decent resolution and framerate.

They're barely keeping up with the cross-gen games with shite RT now. And shove your DLSS. Imagine the pain when the real next-gen titles start to drop after going 20xx->30xx->40xx.

If there's a hardware defect, I'm signing up for the class action.

They can get lowballed on their power hungry poorly configured mountainous stockpile of these chips by a Chinese AI firm. Get fucked until they put something out proper and consumer friendly at these prices for anything with a longer shelf-life than an opened can of tuna.
 

RobbRivers

Member
Jan 3, 2018
2,045
Because a smaller bus is cheaper, and if they did 256-bit they would need to deploy either 16GB or 8GB of VRAM. The 4070 Ti has the same 192-bit bus + 12GB VRAM.

In terms of real-world performance, the dramatic cache increase between Ampere and Lovelace GPUs has meant performance saw good improvements beyond what the raw memory bandwidth figures suggest, particularly at lower resolutions.

The price:perf ratio based on empirical benchmarks is the true test, don't get too hung up on the paper specs.

Because they've upped the cache size about 10 times.
And 4070 will get 21Gbps G6X instead of 14Gbps G6 giving it more external VRAM bandwidth than 3070 had with its 256 bit bus in addition to a 10X increase in cache size.
People seem to constantly miss these parts when talking about bus widths.

Thanks, very interesting explanation. I was trying to convince myself that my "recently" purchased laptop with a 3070 was not that bad in comparison to the new 4070 equivalent 😅
 

Pharaoh

Unshakable Resolve
Member
Oct 27, 2017
2,692
What an age we live in when a 1,600-buck card looks like a great deal.
 

AgentStrange

Member
Oct 25, 2017
3,874
Sounding like it'd be better to get a 3080 for the extra VRAM. Was hoping my 3070 would remain relevant for two or so more years.
 

Ganondolf

Member
Jan 5, 2018
1,088
Is the specs graph in the OP the most up-to-date rumor? I thought the 4070 was going to be between 200-220 watts.
 

vixolus

Prophet of Truth
Banned
Sep 22, 2020
56,820
Dang I haven't built a computer in yeaarss. Wouldn't even know how to do it now. What websites do people order their parts on? I should upgrade...
pcpartpicker like above is a fantastic resource for scoping out prices, compatibility, etc. I typically buy my components on Amazon > Best Buy > Newegg.

There's a Microcenter in Houston that's an hour or so away from my parents' that I will visit on occasion, but the deal has to be really worth it to deal with the drive + toll road + insane traffic at that store, or they have to have EVERYTHING in stock.
 

Sparks

Senior Games Artist
Verified
Dec 10, 2018
2,898
Los Angeles
pcpartpicker like above is a fantastic resource for scoping out prices, compatibility, etc. I typically buy my components on Amazon > Best Buy > Newegg.

There's a Microcenter in Houston that's an hour or so away from my parents' that I will visit on occasion, but the deal has to be really worth it to deal with the drive + toll road + insane traffic at that store, or they have to have EVERYTHING in stock.
PC Part Picker ( https://pcpartpicker.com/ ) is a great resource for building computers. It will track the parts list, check part compatibility, estimate the power usage, and tell you which retailer has the cheapest price.
Sweet, so it is still how it was. Thanks for the resources, time to start this process!
 

dgrdsv

Member
Oct 25, 2017
12,143
Legit how I feel lol... all the cards that have come out after the 4090, esp the 4080, have made me wonder why I don't just swallow the hard pill and get a 4090.
4090 may look like a great deal but it is also the card which is the most likely to be followed up with a successor with the same performance costing about half of what 4090 does.
 

a916

Member
Oct 25, 2017
8,938
4090 may look like a great deal but it is also the card which is the most likely to be followed up with a successor with the same performance costing about half of what 4090 does.

Like a 4090ti or you mean the 5000 series?

That amount of VRAM and RT would be pretty nice for using in UE
 

a916

Member
Oct 25, 2017
8,938
5000 series
2080 Ti -> 3070
3090 -> 4070 Ti

A 4090 Ti would be a late gen uber expensive halo card like the 3090 Ti was

Yeah I think that's the math I'm trying to run in my head right now... I don't like upgrading my GPU often... and even though a 4090 is top end, it'll eventually be replaced...

Basically I need to figure out the value of buying a 4090 now; I think I'd be set for at least 4-5 years. But I've noticed a lot of people who buy these top-end cards also end up flipping them for the next top-end ones.
 

Conf

Member
Dec 4, 2021
545
Basically I need to figure out the value of buying a 4090 now; I think I'd be set for at least 4-5 years. But I've noticed a lot of people who buy these top-end cards also end up flipping them for the next top-end ones.

That's who these cards really are for, enthusiasts who get the best every gen.

You're usually better off spending less now and upgrading next gen (+selling your old card) than buying the highest end card with the intention to keep it 5 years.
(and by spending less I don't mean getting a $1200+ 4080, that's a bad deal)
 

Zeliard

Member
Jun 21, 2019
11,108
Sweet, so it is still how it was. Thanks for the resources, time to start this process!

Have fun. :)

It's very satisfying putting together a new PC for yourself (some current pricing notwithstanding…)

This thread is also a great resource so feel free to go ask any questions there:

www.resetera.com

The PC Builders Thread ("I Need a New PC") v3 PC - Tech - OT

 

Theswweet

RPG Site
Verified
Oct 25, 2017
6,498
California
Like a 5070 at $600 for example. Probably won't have 24GBs of VRAM but will still perform similarly.

Aren't the 4070's specs slating it to be ~20% slower than the 4070 Ti, which already is anywhere between a 3080 12GB and a 3090 Ti in performance? The 4070 itself seems far more likely to be a little slower than a 3080 (though not by much).

Going by that I don't know if I would expect the hypothetical 5070 to match a 4090.
 

dgrdsv

Member
Oct 25, 2017
12,143
Aren't the 4070's specs slating it to be ~20% slower than the 4070 Ti, which already is anywhere between a 3080 12GB and a 3090 Ti in performance? The 4070 itself seems far more likely to be a little slower than a 3080 (though not by much).

Going by that I don't know if I would expect the hypothetical 5070 to match a 4090.
3080s and 3090s are pretty close to each other anyway.
 

Azai

Member
Jun 10, 2020
4,048
So seeing that pricing of the cards is based on performance, is there even a chance of the 5000 series getting cheaper again?
Like, for this to be the case Nvidia would need to get rid of all its stock before releasing the 5000 series, and I don't see that happening at all.
 
Nov 8, 2017
13,373
So seeing that pricing of the cards is based on performance, is there even a chance of the 5000 series getting cheaper again?
Like, for this to be the case Nvidia would need to get rid of all its stock before releasing the 5000 series, and I don't see that happening at all.

The pricing of future lines of GPUs will depend on:
  • How much they cost to manufacture
  • How much demand for them they expect
  • The price/perf ratio of the competition on the market (and in general, whether those cards are appealing to consumers).

Demand being lower today than in 2020/2021 is a good sign, but AMD has been doing quite poorly at selling their GPU products lately, and cost per silicon area has been climbing. Yes, it's aberrant historically for GPUs to be priced as badly as they have been since ~2020, but it's also aberrant that consoles increased in cost rather than decreasing.

I don't expect major cost reductions. There will be some wiggle room for prices to go down a little to boost sales over time (since demand has been impacted by high pricing), but I do not expect a $329 GTX 970 ever again from Nvidia. Even if AMD decided to get more aggressive on their margins, they couldn't afford to sell that class of GPU at such low prices.

Intel is the most desperate to gain marketshare with their lineup, and that's the most likely place to find bargains, but ofc they have their own problems with software/drivers, playing major catchup with the incumbents. They are outright loss leading (or "loss 3rd placing" perhaps might be a more accurate term) on the A750/770 to reach those prices.
 

JahIthBer

Member
Jan 27, 2018
10,430
So seeing that pricing of the cards is based on performance, is there even a chance of the 5000 series getting cheaper again?
Like, for this to be the case Nvidia would need to get rid of all its stock before releasing the 5000 series, and I don't see that happening at all.
They will be cheaper. AMD is already cutting prices, slowly enough that even diehard Nvidia users like myself maybe should have gotten a 7900 XTX instead of a 4070 Ti.
These mid-range GPUs being so overpriced just isn't selling well, and AMD is admitting it with their actions; Nvidia is going to be more stubborn though.
 

dgrdsv

Member
Oct 25, 2017
12,143
videocardz.com

NVIDIA GeForce RTX 4070 specifications confirmed by GPU-Z validation - VideoCardz.com

GeForce RTX 4070 confirmed with 5888 CUDA cores, 12GB GDDR6X memory The first RTX 4070 has now been validated with GPU-Z software. While the specs of RTX 4070 non-Ti are no secret to anyone at this point, we now have a proof of an actual custom RTX 4070 out in the wild being tested. As […]

videocardz.com

Gigabyte, Palit, Zotac and MSI GeForce RTX 4070 GPUs have been pictured - VideoCardz.com

It’s raining RTX 4070 More custom GeForce RTX 4070 graphics cards have now been spotted. We have several updates to our RTX 4070 coverage, finally showing more designs based on NVIDIA’s slowest Ada GPU yet. Pictures of Gigabyte’s full lineup have made their way to the public eye with help from...
 

Maple

Member
Oct 27, 2017
11,890
So....will the 4070 be about on par with a 3080 in terms of rasterization performance? Or slightly better?
 

brain_stew

Member
Oct 30, 2017
4,846
Starting to wonder if it is only the single 8 pin cards that will be 599.

Single 8-pin should be fine? 150W from the 8-pin + 75W from the PCIe slot. That still leaves room for a +10% power limit increase if they want to include it.

This is a really low power draw card; it's good to see there'll be plenty of dual-fan designs available at launch. It's about time we got some more sensible GPU designs.
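The connector budget works out like this (the 200W board power is the rumored figure from earlier in the thread, not a confirmed spec):

```python
# Power delivery limits per the PCIe/PEG specs:
PCIE_SLOT_W = 75     # PCIe x16 slot
EIGHT_PIN_W = 150    # one 8-pin PEG connector

budget_w = PCIE_SLOT_W + EIGHT_PIN_W   # 225 W deliverable in total
board_power_w = 200                    # rumored 4070 board power (assumption, not confirmed)

# A +10% power limit slider would still fit within the connectors' budget:
print(budget_w)                           # 225
print(board_power_w * 1.10 <= budget_w)   # True
```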
 

MrBob

Member
Oct 25, 2017
6,674
So....will the 4070 be about on par with a 3080 in terms of rasterization performance? Or slightly better?
My guess is it will be around the 3080 or slightly better in most games.
Single 8-pin should be fine? 150W from the 8-pin + 75W from the PCIe slot. That still leaves room for a +10% power limit increase if they want to include it.

This is a really low power draw card; it's good to see there'll be plenty of dual-fan designs available at launch. It's about time we got some more sensible GPU designs.

Yeah I'm happy about the dual fan designs too. If I can get a dual fan 4070ti or 4080 at lower prices, I would rather have that. Preferably with regular 8 pin connectors too.

For the power, I was thinking about boosting above the default boost rate. I thought the 4070 Ti could boost near 3000MHz, but I could be wrong. Since the 4070 is a cut down of the same chip, I thought it might be power limited with one pin when trying to boost higher.
 

brain_stew

Member
Oct 30, 2017
4,846
videocardz.com

NVIDIA GeForce RTX 4070 Founders Edition GPU has been pictured - VideoCardz.com

The third RTX 40 Founders Edition For the new launch, NVIDIA is indeed making their own design. The RTX 4070 Ti did not get a Founders Edition, in an effort to push more custom designs to the market, but it seems the non-Ti SKU will see a different approach. NVIDIA has sent out a Founders […]

I'm liking this launch more and more, a nice compact 2 slot founders edition card at MSRP is going to be a great option in the current market.