
tapedeck

Member
Oct 28, 2017
7,974
There is plenty stopping them. They haven't designed the system around higher clocks. Their power and thermal control is all done around their set clock speeds. They can't just raise those speeds on a whim. The entire reason Sony is using this unusual variable frequency method and expensive cooling is to allow such speeds to work safely.
You do realize MS upped the CPU and GPU clocks a bit 'last minute' before the Xbox One launch, right?
 

ShapeGSX

Member
Nov 13, 2017
5,206
There is plenty stopping them. They haven't designed the system around higher clocks. Their power and thermal control is all done around their set clock speeds. They can't just raise those speeds on a whim. The entire reason Sony is using this unusual variable frequency method and expensive cooling is to allow such speeds to work safely.

They certainly have thermal headroom. MS is using an expensive cooling system too.

They raised the Xbox One's frequency a few months before launch, and that console had a standard heat sink.
 

T0kenAussie

Member
Jan 15, 2020
5,075
Let's time-travel back to 2013.

MS announces the Xbox One, the multitasking console to rule the living room. Price? 500 dollars, bundled with a Kinect (that nobody wants). Features: no used games, always online, and if offline for more than 24h it becomes a brick. "We have a product for people who aren't able to get some form of connectivity; it's called Xbox 360."

Is it powerful?
- It has secret sauce; it has DDR3 memory because GDDR5 was "uncomfortable"; "have you seen Titanfall?"; crushed blacks + upscaling; "You realize you will see every game in 1080p as your output, right?"; "We created DirectX"; the Power of the Cloud.

Sony announces the PS4: it just plays games. Price: 400 dollars + nothing. Features: 100 dollars less than the Xbox One.

If you ignore context, then of course we have a bunch of hypocrites here, haven't we?
But if you think just a little and remember all the other things surrounding the Xbox One launch, it might make sense that people talked about the 40% difference in TF.
You can't make a product that is more expensive with less power and expect people not to complain about it.
Right, so now we're at the point where Xbox is being open about its base clocks and power levels, Sony is projecting boost speeds and an SSD, and people are saying that now it's all about SSDs, baby! Who cares about tflops/power/BC, bah! All I want is those transfer rates, daddy Cerny!

You could see how some people are a little miffed at that pivot, especially when they were browbeaten about power for 4 years before the One X. Even PS4 execs were saying power makes you a better gamer, so don't come in with a holier-than-thou attitude to console wars now that the pendulum is swinging the other way.
 

mordecaii83

Avenger
Oct 28, 2017
6,852
They certainly have thermal headroom. MS is using an expensive cooling system too.

They raised the Xbox One's frequency a few months before launch, and that console had a standard heat sink.
We have no idea if they have thermal headroom; no one but MS knows. I'd be shocked if they upped their clocks at this point. They already have the higher numbers, and all raising the clocks would do is reduce chip yields and increase fan noise on the console.
 

Railgun

Member
Oct 27, 2017
3,148
Australia
I wish the X had variable clocks as well, just because every game should be using dynamic resolution anyway; resolution could scale with the clocks, meaning we could push even a little further at times. In games with fixed resolutions it would be terrible, but I hope we never see fixed resolutions again.
 

bsigg

Member
Oct 25, 2017
22,536
Yeah, this is what I keep thinking... like there's really not much to stop MS from upping the clocks a bit before launch, considering how 'wide and slow' the architecture appears to be. Not saying they will... but it's certainly plausible if they feel like it.

The biggest open question is how well the box currently handles thermals at the current clocks, and whether, say, bumping the GPU to 2 GHz and the CPU to 4.0 GHz/3.8 GHz would exceed the console's ability to handle the additional heat.

I wish the X had variable clocks as well, just because every game should be using dynamic resolution anyway; resolution could scale with the clocks, meaning we could push even a little further at times. In games with fixed resolutions it would be terrible, but I hope we never see fixed resolutions again.

Having variable clocks means devs need to develop around where they want the system sending power (CPU vs GPU) and try to maximize that.

With a locked, sustained clock, devs have access to the highest raw output the box has without having to juggle the power shared between the CPU and GPU.

The question that will come up is where devs decide to cut back on CPU power on the PS5 to push more image quality, or where we see image quality drop because power is being pulled away from the GPU and sent to the CPU to handle more load there.
 

LiquidSolid

Member
Oct 26, 2017
4,731
User warned: modwhining
Legit surprised we haven't seen a guideline or even a staff post yet, tbh.
Are they even trying to rein it in? GAF had its problems, particularly the shithead who owned it, but FUD like the article in the OP was shut down pretty quickly. It's embarrassing this thread has lasted over 9 hours and 11 pages on Era.
 

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
You do realize MS upped the CPU and GPU clocks a bit 'last minute' before the Xbox One launch, right?

They did, but only by a very small amount; the GPU clock was increased by only 6%, for example. In comparison, the PS5's GPU clock is 22% faster than the XSX's. So whilst they may be able to change it late in the game, I'm not sure by how much they could.
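To put rough numbers on that, here's a quick sketch using the commonly reported clocks (treat the figures as approximate):

```python
# Rough clock comparisons (publicly reported figures, approximate).
xb1_original, xb1_final = 800, 853   # Xbox One GPU clock bump before launch (MHz)
ps5_gpu, xsx_gpu = 2230, 1825        # PS5 peak vs XSX fixed GPU clocks (MHz)

print(f"Xbox One late bump: +{xb1_final / xb1_original - 1:.1%}")  # ~+6.6%
print(f"PS5 over XSX clock: +{ps5_gpu / xsx_gpu - 1:.1%}")         # ~+22.2%
```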

Side note: I'm definitely curious to see the PS5's cooling system. Cerny straight up acknowledged they've not always been good with those things, and then also stated he thinks people will be really pleased with what their engineers have come up with this time, which, coupled with the super high clocks, makes me even more interested to see what they have designed. I just hope it isn't too massive lol.
 

convo

Member
Oct 25, 2017
7,364
It's going to be all cross-gen games, anyway. Neither console should have any issues running current-gen games at 4K60.
And even if it's about third-party concerns, they would develop for the lowest specs, as many here say, which could be the regular Xbone for a year or two. So are people saying third parties will base their games on the lowest common denominator, or on the higher specs? PS5 got like double the I/O, and Xbox got ~15% in whatever TF is going to do. I'd want a poll on what people think.
 
Dec 4, 2017
11,481
Brazil
Right, so now we're at the point where Xbox is being open about its base clocks and power levels, Sony is projecting boost speeds and an SSD, and people are saying that now it's all about SSDs, baby! Who cares about tflops/power/BC, bah! All I want is those transfer rates, daddy Cerny!

You could see how some people are a little miffed at that pivot, especially when they were browbeaten about power for 4 years before the One X. Even PS4 execs were saying power makes you a better gamer, so don't come in with a holier-than-thou attitude to console wars now that the pendulum is swinging the other way.
If it bothers you so much, I would recommend putting these people on the ignore list.
 

Nameless

Member
Oct 25, 2017
15,326
Right, so now we're at the point where Xbox is being open about its base clocks and power levels, Sony is projecting boost speeds and an SSD, and people are saying that now it's all about SSDs, baby! Who cares about tflops/power/BC, bah! All I want is those transfer rates, daddy Cerny!

You could see how some people are a little miffed at that pivot, especially when they were browbeaten about power for 4 years before the One X. Even PS4 execs were saying power makes you a better gamer, so don't come in with a holier-than-thou attitude to console wars now that the pendulum is swinging the other way.

MS has had the TF lead for the last 2+ years and no one was doing this. Why? Because the TF difference was significant, just like it was in favor of the PS4 at launch. A ~16% gap, however, is not only considerably less significant, it will be even less impactful due to diminishing returns. So why exactly should we pretend the difference in teraflops holds the same weight when the math (and common sense) says it should not?
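For anyone wondering where that number comes from, quick napkin math on the peak TF figures both companies have quoted:

```python
# Peak TF figures as quoted by MS and Sony.
xsx_tf, ps5_tf = 12.155, 10.28

print(f"XSX over PS5:  +{xsx_tf / ps5_tf - 1:.1%}")  # ~+18.2%
print(f"PS5 under XSX: -{1 - ps5_tf / xsx_tf:.1%}")  # ~-15.4%, hence "~16%"
```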
 

VinFTW

Member
Oct 25, 2017
6,470
NXGamer and Digital Foundry have already covered this in far more detail.

Essentially, it's only misleading if you believe Cerny is lying about the system spending the majority of its time at max potential speeds. NXGamer does a good job of explaining why what Cerny said makes complete sense and appears most probable, hence the reason it really is a 10.28 Tflop machine. Not to mention that only a very tiny clock drop (2%) would be needed to claw back a lot of power in the rare worst-case power-load scenarios.

I suggest you watch the video, as it shows that while running games the CPU/GPU already downclock based on usage on a frame-by-frame basis, and hence rarely both max out per se anyway.


So why call it variable if it's really a 10.28 Tflop machine?
 

HeavenlyE

Member
Oct 27, 2017
3,800
Right, so now we're at the point where Xbox is being open about its base clocks and power levels, Sony is projecting boost speeds and an SSD, and people are saying that now it's all about SSDs, baby! Who cares about tflops/power/BC, bah! All I want is those transfer rates, daddy Cerny!

You could see how some people are a little miffed at that pivot, especially when they were browbeaten about power for 4 years before the One X. Even PS4 execs were saying power makes you a better gamer, so don't come in with a holier-than-thou attitude to console wars now that the pendulum is swinging the other way.
It looks like a lot of people really want this to be a reverse of 2013, even though it bears no resemblance, just so they can have their revenge for how their favorite console was treated.
 

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
So why call it variable if it's really a 10.28 Tflop machine?

Probably to cover their asses lol. At the end of the day, there are going to be those rare times when it isn't a 10.28 Tflop GPU and might be 10.1 Tflops or whatever instead. However brief, that'd still be false advertising if they didn't mention it, so this way they're at least covered and being honest with consumers.

Both Microsoft and Sony have been pretty careful with verbiage lately so as to limit their legal liability, evident even with this whole BC stuff. They're using terms like "almost all", "majority of", etc. They're not taking chances.
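For context, the Tflop figure is just arithmetic on the clock, so you can see exactly what a small dip costs. A sketch using the standard AMD peak-FLOPS formula (CUs x 64 shaders x 2 ops per clock):

```python
# Peak TFLOPS = CUs x 64 shaders x 2 ops/clock (FMA) x clock (GHz) / 1000.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"{tflops(36, 2.23):.2f}")         # ~10.28 at the PS5's peak clock
print(f"{tflops(36, 2.23 * 0.98):.2f}")  # ~10.07 after a 2% downclock
```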
 

AllChan7

Tries to be a positive role model
Member
Apr 30, 2019
3,670
The crazy thing about all this is no one is saying that the SX is inferior. Yes, it's the superior console in most areas. MS has done a great job marketing it and showing it off.

However, the hyperbole and downplaying of the PS5's specs has been exhausting asf since Tuesday. You have devs, DF and NXGamer all praising the console for being smartly designed, yet all it takes is one article from people who aren't devs to convince some of you that Mark Cerny has lied or is being purposely vague.

I'd take the opinion of devs over this article any day. The sooner this forum can accept that next gen will be incredible either way on both consoles the better.
 

Railgun

Member
Oct 27, 2017
3,148
Australia
The biggest open question is how well the box currently handles thermals at the current clocks, and whether, say, bumping the GPU to 2 GHz and the CPU to 4.0 GHz/3.8 GHz would exceed the console's ability to handle the additional heat.



Having variable clocks means devs need to develop around where they want the system sending power (CPU vs GPU) and try to maximize that.

With a locked, sustained clock, devs have access to the highest raw output the box has without having to juggle the power shared between the CPU and GPU.

The question that will come up is where devs decide to cut back on CPU power on the PS5 to push more image quality, or where we see image quality drop because power is being pulled away from the GPU and sent to the CPU to handle more load there.
I was under the impression the variable clocks were more like turbo on Intel CPUs. Is that not the case?
 

ShapeGSX

Member
Nov 13, 2017
5,206
They did, but only by a very small amount; the GPU clock was increased by only 6%, for example. In comparison, the PS5's GPU clock is 22% faster than the XSX's. So whilst they may be able to change it late in the game, I'm not sure by how much they could.

Since the Xbox Series X has far more CUs than the PS5, they get a lot more bang for the buck from a small frequency increase.
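The arithmetic bears that out; each extra MHz is worth more peak throughput the more CUs you have:

```python
# Peak GFLOPS gained per extra MHz of GPU clock = CUs x 64 shaders x 2 ops / 1000.
def gflops_per_mhz(cus: int) -> float:
    return cus * 64 * 2 / 1000

print(gflops_per_mhz(52))  # XSX: ~6.7 GFLOPS of peak per added MHz
print(gflops_per_mhz(36))  # PS5: ~4.6 GFLOPS of peak per added MHz
```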
 

zombiejames

Member
Oct 25, 2017
11,912
I was under the impression the variable clocks were more like turbo on Intel CPUs. Is that not the case?
No, Cerny was very specific in pointing out that their variable frequencies work nothing like they do on PCs and smartphones (where frequencies drop when things get too hot). He was very clear that that approach wouldn't work, because then you'd run into situations where a game might perform worse just because someone was playing in a hot room.
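A toy sketch of the difference as I understand it (all numbers and the P ~ f^3 power model here are my own assumptions, purely to illustrate the idea):

```python
def thermal_throttle(freq_mhz: float, die_temp_c: float) -> float:
    """PC/smartphone style: back off when the silicon gets hot, so the same
    game can run slower just because the room is hot."""
    return freq_mhz if die_temp_c < 95 else freq_mhz * 0.9

def activity_based(freq_mhz: float, modeled_power_w: float, budget_w: float) -> float:
    """PS5 style, per Cerny: estimate power from workload activity counters and
    scale the clock to stay inside a fixed power budget. Room temperature never
    enters the equation, so the result is deterministic and repeatable."""
    if modeled_power_w <= budget_w:
        return freq_mhz
    return freq_mhz * (budget_w / modeled_power_w) ** (1 / 3)  # assumes P ~ f^3

# Same workload, different rooms:
print(thermal_throttle(2230, die_temp_c=70))  # 2230 in a cool room
print(thermal_throttle(2230, die_temp_c=98))  # 2007.0 in a hot room
print(activity_based(2230, modeled_power_w=200, budget_w=180))  # ~2153 in any room
```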
 

Gay Bowser

Member
Oct 30, 2017
17,567
Right, so now we're at the point where Xbox is being open about its base clocks and power levels, Sony is projecting boost speeds and an SSD, and people are saying that now it's all about SSDs, baby! Who cares about tflops/power/BC, bah! All I want is those transfer rates, daddy Cerny!

is this the new, slightly subtler version of "fanbois sucking ____'s dick?"

let's not.
 

Tabs2002

Member
Feb 1, 2018
1,514
Sony has so much mindshare. I don't think anybody would be claiming the Series X was well thought out and smartly designed if the roles were reversed.
 

Cloud-Strife

Alt-Account
Banned
Sep 27, 2019
3,140
The crazy thing about all this is no one is saying that the SX is inferior. Yes, it's the superior console in most areas. MS has done a great job marketing it and showing it off.

However, the hyperbole and downplaying of the PS5's specs has been exhausting asf since Tuesday. You have devs, DF and NXGamer all praising the console for being smartly designed, yet all it takes is one article from people who aren't devs to convince some of you that Mark Cerny has lied or is being purposely vague.

I'd take the opinion of devs over this article any day. The sooner this forum can accept that next gen will be incredible either way on both consoles the better.

/thread
 

ShapeGSX

Member
Nov 13, 2017
5,206
No, Cerny was very specific in pointing out that their variable frequencies work nothing like they do on PCs and smartphones (where frequencies drop when things get too hot). He was very clear that that approach wouldn't work, because then you'd run into situations where a game might perform worse just because someone was playing in a hot room.

He's splitting hairs. As power increases, so does the temperature. So if they lower a frequency to stay within a power envelope, they're also lowering frequency as temperatures increase.

Modern CPUs and GPUs have thermal sensing diodes all over them, hundreds in some cases, and they won't be ignored. You can bet that they are all being monitored. It's just that Sony feels their cooling system has enough headroom to run without the extra heat affecting the programmed frequency curves for a given power load. But I'm certain you could abuse the chip enough with a hot environment to trigger a panic signal to slow it down. They'd be stupid not to.
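Something like this, in rough pseudocode terms (purely illustrative; the trip point is made up):

```python
def next_gpu_frequency(model_freq_mhz: float, die_temp_c: float) -> float:
    """Deterministic activity-based clocking for normal operation, with
    temperature kept as a last-resort failsafe rather than the primary input."""
    PANIC_TEMP_C = 105.0  # made-up trip point; every sane design has one
    if die_temp_c >= PANIC_TEMP_C:
        return model_freq_mhz * 0.5  # emergency throttle, hopefully never hit
    return model_freq_mhz  # whatever the deterministic power model asked for
```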
 

Crayon

Member
Oct 26, 2017
15,580
I can hardly understand this power management thing, and I still know more than this author.
 

Dekim

Member
Oct 28, 2017
4,297
MS made it a point that the XSX will run quiet. Why would they up the clocks now and risk higher fan noise when they've already beaten Sony in the numbers game? MS should be feeling happy with where the XSX sits technically against the PS5. They've already won the PR war over which console is the most powerful, given the reactions since Wednesday.
 

gremlinz1982

Member
Aug 11, 2018
5,331
In my post I specifically compare a 5700 that is overclocked to 2,194 MHz, which is even closer to the PS5's clock and shows a corresponding improvement in performance.

It just looks like the 5700 responds much better to overclocking than the 5700 XT does; perhaps that's due to the XT hitting power limitations, but since in your original post you only mentioned the 5700, that's the one I looked into.

But that is interesting nonetheless. I guess now it's a case of seeing how RDNA2 responds to OC'ing, but presumably Sony is happy enough with the results that they did it.
The only way of knowing how a GPU performs is by actually running games on it. This has always been the case.
 

DukeBlueBall

Banned
Oct 27, 2017
9,059
Seattle, WA
On the subject of a small XSX upclock: I don't think it's happening.

It happened for Xb1 for a few reasons. As I recall, Albert Penello said it involved some luck.

A. They were extremely conservative with thermals after the RROD, so they made the cooling more robust than needed.
B. They desperately needed to close the gap against the PS4 as much as they could.
C. The ESRAM was on-chip and also benefited from an overclock, so the bandwidth of the entire system, especially for graphics-intensive work, increased too.

The 12.155 TF figure is probably already the max that 560 GB/s can comfortably support. Any upclock would be useless, and they've already won the PR flop battle.

Conversely, I think the PS5's GPU at 2.23 GHz is very bandwidth-starved, especially on the render backend. Moving to 14 Gbps chips from the 16 Gbps chips shown in the GitHub tests was a big sacrifice for price.
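Napkin math on the bandwidth point (peak numbers only; real behavior depends on caches and access patterns, so treat this as a sketch):

```python
# GDDR6 peak bandwidth = bus width (bits) x per-pin rate (Gbps) / 8 bits per byte.
def gddr6_bw_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(gddr6_bw_gbs(256, 14))  # 448.0 GB/s: PS5 as announced
print(gddr6_bw_gbs(256, 16))  # 512.0 GB/s: if the 16 Gbps chips from the GitHub leak had shipped

print(f"XSX: {560 / 12.155:.1f} GB/s per TF")  # ~46.1 (fast 10 GB pool)
print(f"PS5: {448 / 10.28:.1f} GB/s per TF")   # ~43.6
```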
 

2PiR

alt account
Banned
Aug 28, 2019
978
Man, why can't we all just be happy with both systems? Why are you all getting so worked up over this thing? The PS5 with all its glory will not play Halo, and the SX with its beastly TF won't play ND games. Let's just hope both systems are no more than $500.
 

Musouka

Member
Dec 31, 2017
505
It seems that it's hot-take season. Everything from "the truth about the variable clocks" to "the SSD is a gimmick" is all over my feed.

Look, I have my doubts, as anyone should, but I am equally excited to see new things being tried out. I will reserve my judgement until we see the systems in action. Otherwise, people are using their prior experiences with PCs to jump to conclusions about this or that. I don't think that gives a clear picture, since the design philosophies are different with the PS5.

I just wish Sony had followed up the technical presentation with a video teardown of the console for us normal people. I was quite excited when Cerny mentioned that we will be thrilled to see the cooling solution when the engineers reveal it in their teardown, and I thought we would get to see that directly after. I wonder if they haven't yet finalized the design of the console or if they just want to spread out their announcements.
 

mordecaii83

Avenger
Oct 28, 2017
6,852
Conversely, I think the PS5's GPU at 2.23 GHz is very bandwidth-starved, especially on the render backend. Moving to 14 Gbps chips from the 16 Gbps chips shown in the GitHub tests was a big sacrifice for price.
I think this is conjecture until we see how third-party games run on both. The PS5's GPU cache scrubbers and coherency engines, combined with the custom I/O and any potential RDNA2 architecture improvements, may paint a different picture.
 

Jaypah

Member
Oct 27, 2017
2,866
Man, why can't we all just be happy with both systems? Why are you all getting so worked up over this thing? The PS5 with all its glory will not play Halo, and the SX with its beastly TF won't play ND games. Let's just hope both systems are no more than $500.

Because fanboys? So you get "concern" about aspects of the consoles from people who don't really care, outside of being able to shout that their plastic box is better in some way. To the point where you'd think the PS5 doesn't even have a GPU and the XSX doesn't have an SSD. It's stupid and tiring, but unfortunately expected.
 

zombiejames

Member
Oct 25, 2017
11,912
He's splitting hairs. As power increases, so does the temperature. So if they lower a frequency to stay within a power envelope, they're also lowering frequency as temperatures increase.

Modern CPUs and GPUs have thermal sensing diodes all over them, hundreds in some cases, and they won't be ignored. You can bet that they are all being monitored. It's just that Sony feels their cooling system has enough headroom to run without the extra heat affecting the programmed frequency curves for a given power load. But I'm certain you could abuse the chip enough with a hot environment to trigger a panic signal to slow it down. They'd be stupid not to.

Here's a transcript of what Cerny said:

"The simplest approach would be to look at the actual temperature of the silicone die and throttle the frequency on that basis, but that won't work. It fails to create a consistent Playstation 5 experience. It wouldn't do to run a console slower simply because it was in a hot room. So rather than look at the actual temperature of the silicone die, we look at the activities that the GPU and CPU are performing and set the frequencies on that basis, which makes everything deterministic and repeatable."
 

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
Why would you ignore the CU count? It's not like it's insignificant by any stretch.

My post was in response to whether Microsoft could increase their clock frequency late in the design stages, as they've done in the past, to try and get closer to the PS5's clock speed. Hence CUs were never mentioned.
 

ShapeGSX

Member
Nov 13, 2017
5,206
Here's a transcript of what Cerny said:

"The simplest approach would be to look at the actual temperature of the silicone die and throttle the frequency on that basis, but that won't work. It fails to create a consistent Playstation 5 experience. It wouldn't do to run a console slower simply because it was in a hot room. So rather than look at the actual temperature of the silicone die, we look at the activities that the GPU and CPU are performing and set the frequencies on that basis, which makes everything deterministic and repeatable."

I know the theory. I understand what they are doing, and that's the only solution they could arrive at to get a semi-predictable system. I even have an inkling of how they do it using existing systems, based on my experience. I also stand by what I said: they are monitoring temperatures for unexpected situations. :-)
 
Jun 18, 2018
1,100
Ah yes, WCCFtech, the bullshit rumour site speaking as if they have any semblance of reality.

Titles running at 60fps aren't pushing things to a sustained 16.67 ms every frame. They're usually running around 10-15% lower, so that any jumps are still within the frame budget when spikes do occur.

The problem with the PS5 specs is that it's not clear whether the given CPU and GPU clocks are jointly sustainable, or whether one has to cut back when the other has been at maximum clock for a while.

It's been said before, but Sony's presentation was lacking in detail, so we have to wait for real-world performance comparisons on cross-platform titles before we can form any conclusion on the general performance delta between platforms.
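Spelling out the frame-budget point (illustrative numbers):

```python
# A 60fps target gives 1000/60 ~= 16.67 ms per frame; engines typically aim lower.
budget_ms = 1000 / 60
for headroom in (0.10, 0.15):  # the 10-15% margin mentioned above
    print(f"{headroom:.0%} headroom -> target ~{budget_ms * (1 - headroom):.1f} ms/frame")
# 10% -> ~15.0 ms, 15% -> ~14.2 ms: a spike can land and still make 60fps.
```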
 

ShapeGSX

Member
Nov 13, 2017
5,206
My post was in response to whether Microsoft could increase their clock frequency late in the design stages, as they've done in the past, to try and get closer to the PS5's clock speed. Hence CUs were never mentioned.

There's very little reason for them to attempt getting close to the PS5's clock speed with as many CUs as they have. But a small clock increase? I suspect they have that headroom.
 

Dizastah

Member
Oct 25, 2017
6,124
I'm not a game programmer, but I do develop GPU-accelerated scientific models, so I have a few years' experience optimizing for different GPU architectures to accelerate computation. So I have a sense of what makes something fast or slow, as I've seen it play out on various hardware.

The notion of a 'sustained TF rate' doesn't really make sense. The TF number is a peak theoretical number that will in all likelihood never actually be hit on either console (possibly occasionally for a few milliseconds at a time, but generally they'll not be computing at that rate). It's not a benchmark; it's just a way of counting components and clocks within a single number. It's actually a lot like the way a business counts the number of 'man hours' it'll take to perform a task. The actual computational throughput will be determined by things like thread occupancy and how much shared and local memory each thread requires. Since both consoles use the same architecture, this in theory affects both equally, but it's not nonsense to suggest the higher clock rate gives the PS5 a bit of help here, as it's able to utilize local data (the data sitting in the GPU cache) and shuffle it out for the next piece of data it needs more readily. It is an effective bandwidth increase on the ram->cache->computation pipeline.

More technical version:
The biggest factor in speeding up a GPU is how much you can saturate the compute units (those CUs that we keep hearing about). If you can get a concurrent thread on each ALU within each CU, with minimal or no need to reach back to VRAM within a kernel call to swap data in and out of the CU cache, then you can get pretty close to your peak throughput. This is, in practice, not common, as the available local storage within the CU is tiny, a few tens of KB shared between all the ALUs. For example, in the Nvidia Volta architecture (which I'm most familiar with), there is a single 256KB block of memory (arranged in 32-bit registers) for every thread running on that SM to use for data exclusive to that thread. In a perfect world, every one of the 64 CUDA cores in a single Volta SM would have its own thread, meaning each one gets ~4KB per thread to store useful data. (There is 96KB of shared memory as well, but I'll ignore this for the moment. It's extremely useful but somewhat immaterial for this explanation.) This is not generally practical, in my experience, so you're left with two options, not mutually exclusive: you can reduce the number of concurrent threads, or you can periodically swap data in and out of registers by calling back to VRAM. The former is what Cerny was alluding to when he said it's harder to fill more CUs than fewer, although I somewhat disagree with his characterization in the case where all CUs on both platforms have access to the same relative register and shared memory. I'm not familiar with RDNA2, though, so I don't want to comment on that too much. In the latter case, which is almost always necessary to some degree, you can think of an analogy to screen tearing as to what happens under the hood.

Several threads are going about their business making computations; thread A says "oh, I need something from VRAM". Thread A then gets paused, and thread B gets moved into its place to keep computing while the data for thread A is fetched. Meanwhile, thread B finishes its work, and thread A either is ready to keep going or isn't; this is determined by the latency of access to VRAM. If thread A isn't ready, most likely another thread gets moved into thread B's place and keeps going. If thread A is ready, then it'll get shuffled back in to pick up the computation where it left off with its new data. The analogy to screen tearing is this: if you have ever played without v-sync on a 60Hz monitor, and then on a 144Hz monitor, you've probably noticed that screen tearing is far less noticeable on the faster refresh. This is because the gap between getting data and being able to use it is smaller. A similar analogy holds with clock speeds in a GPU: a faster clock speed will generally lead to less 'down time' in any given ALU, as it is more likely to be ready to go sooner when the requisite data is available.

What I want to point out is that NONE of this shows up in a TF metric. The underlying reality of swapping data in and out, the various bottlenecks, tradeoffs, etc., all of that is presumed to essentially not exist when discussing TF. However, this is one of the biggest considerations when doing optimization, as you have to take into account these facts of life about threads 'stalling', so to speak.

Will this make the PS5 faster in computations than the XSX? In a few cases, possibly, but in general, no. However, it does mean that the story of what that gap is isn't as simple as many here are claiming. I expect that the PS5 may generally run at a slightly lower resolution (some quick calculations put 3504x1971 at a hair more than a 16% reduction in pixel count versus native 4K), but in many cases I think the gap will be closer than you'd expect from a raw TF count, because the higher clock speed does help 'in the real world' in a somewhat non-linear fashion compared to raw TF numbers, in that it makes the penalty of moving data in and out of local storage smaller. It's not a HUGE difference (at least not in most cases), but it's not nothing.
Really good post. Thanks for this.
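To make the register-pressure point concrete, here's a toy version of that math (the Volta-style numbers come from the post above; everything else is illustrative):

```python
REGISTER_FILE_BYTES = 256 * 1024  # 256 KB of 32-bit registers per Volta SM
CORES_PER_SM = 64                 # FP32 CUDA cores per Volta SM

def concurrent_threads(bytes_per_thread: int) -> int:
    """How many threads the register file can hold at a given per-thread footprint."""
    return REGISTER_FILE_BYTES // bytes_per_thread

print(concurrent_threads(4 * 1024))   # 64: one thread per core (CORES_PER_SM), the ideal case
print(concurrent_threads(16 * 1024))  # 16: heavier threads cut occupancy, so
                                      # VRAM stalls become much harder to hide

# And the resolution aside: 3504x1971 vs native 4K.
print(f"{1 - (3504 * 1971) / (3840 * 2160):.1%} fewer pixels")  # ~16.7%
```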
 

zswordsman

Member
Nov 5, 2017
1,771
6 more months of these articles, folks. Hang in there. I assume neither console will really disappoint, and as always, people will go where the games they want to play are, regardless of power differences realized or perceived.
6 months? Aren't we optimistic.
I feel like we're gonna get this BS argument for the entire gen, even if PS5 games end up being amazing or on par with SX exclusives. It'll never end, which is pretty sad.
 
Oct 26, 2017
6,151
United Kingdom
The part of a GPU that consumes the most power is the shader array. Very few games are actually shader-bound, and, counter-intuitively, densely packed scenes with lots of sub-pixel-sized polygons actually saturate the shader array less than scenes with much larger geometry and a lot less going on; so in those cases a slight downclock will have minimal impact on overall GPU performance.
 