I have previously done tests of spell crit reduction for poisons and reported them here. I have observed poison crits with as low as 2.13% crit. The conclusion I drew in that post is based on the assumption that spell crit depression was a fixed amount. However, if spell crit reduction is lower at lower levels of crit, it is possible that it starts off at around 2.1% and levels off at 3% at higher levels of crit.
In separate testing I did almost 34k swings; however, since I was testing Mongoose and Berserking PPM, I separated the results into subsets with and without Mongoose up. The sample with Mongoose had 8163 observations and 3563 crits, a 43.6% crit rate, while my theoretical crit with Mongoose up was supposed to be 49.1%. Thus in those 8163 observations, crit depression was 5.46% (with a 95% confidence interval of 4.96% to 5.95%). While Mongoose was not up, crit depression was 4.92% on average, with a (4.66%, 5.18%) confidence interval. I am inclined to think that instead of a fixed rate of crit depression, there is an increasing rate of crit depression, such as a fixed depression plus a percentage of the excess crit rate. For instance, for physical crits, the formula could be 3% base depression + 5% of (crit rate - 3%). For spell crit depression, it could be 2% + 5%*(crit rate - 2%).
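For reference, here is a minimal sketch of how the point estimate and a simple normal-approximation (Wald) confidence interval for the Mongoose-up subsample can be computed. Note that this crude approximation gives a somewhat wider interval (roughly +/- 1.1%) than the one quoted above, which may have been computed with a different method or pooled sample.

```python
import math

# Mongoose-up subsample figures from the post above.
n = 8163          # swings observed with Mongoose up
crits = 3563      # crits observed in that subsample
p_theory = 0.491  # theoretical crit chance with Mongoose up

p_hat = crits / n                         # observed crit rate, ~43.65%
depression = p_theory - p_hat             # point estimate of crit depression
se = math.sqrt(p_hat * (1 - p_hat) / n)   # standard error of p_hat
half_width = 1.96 * se                    # 95% normal-approximation half-width

print(f"observed crit: {p_hat:.2%}")
print(f"depression: {depression:.2%} +/- {half_width:.2%}")
```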
By your data, it seems that crit rating is affected by diminishing returns much like dodge or parry, and that the amount shown on the character panel is the value before the diminishing returns are applied.
Could it be that the innate 5% hit, which can never be taken away on white hits, can never crit?
Like a no-crit automatic hit which sits above the PvE combat table?
Sorry if I am stating something obvious, but until now I did not realize the implications for the white crit cap. There is a big difference in the effectiveness of crit rating and agility depending on whether you view this as crit depression or as hit inflation/conversion. Consider the following notation: M = miss rate, D = dodge rate, G = glance rate, C = crit rate, H = hit rate.
Before this finding, we normally calculated the white crit cap as 100-D-M-G+4.8%, so it was worthwhile to increase your crit rate up to at least this number. The idea was that if your crit rate equaled 100-D-M-G, it was then reduced when fighting a boss to 100-D-M-G-4.8% (old crit depression theory), so you observed H = 100-(100-D-M-G-4.8%)-D-M-G = 4.8% hits. Increasing crit by another 4.8%, to 100-D-M-G+4.8%, we thought would push the hits out of the table. I think Hellord's finding basically disproves that theory.
Instead I think there is a "reserved" 5% hits converted from crits (it's the extra hit Hellord observed, which is also the same as what we thought to be crit depression). If that conversion theory is true, then the white crit cap is actually much lower. Now D+M+G+5% of the hit table is "reserved" for dodges, misses, glances and converted hits. So what's left for crit is 100-D-M-G-5%, going beyond that does nothing for your autoattack crits. That basically means that the new crit cap is about 10% (!) lower than previously thought.
What am I missing then, I mean which 5% am I double counting?
- Was the crit cap not 100-D-M-G+5% under the crit depression hypothesis, before Hellord's tests existed? Maybe we need numbers. Let's say D = 5%, M = 15%; we know G = 24%, so 56% is left for crit and hit. If my crit rate with all buffs and debuffs is 61%, only 56% is used against bosses under the crit depression hypothesis. So I am at the maximum of my crit, i.e. the crit cap is now 61%. Adding anything else is a waste for autoattacks.
- Is the crit cap not 100-D-M-G-5% now that we know 5% of crits are converted into hits? Using the same numbers, 5% are dodges, 15% are misses, 24% are glances, and 5% are forced hits. Thus we have 100%-5%-15%-24%-5% = 51% left for crit and hit. So now adding any crit beyond 51% is not adding anything to my autoattack crits, so the new crit cap is 51%. Thus it seems to me there is a 10% difference between what we believed before and what we believe now.
I think when you say the difference in crit cap is only 5%, you are assuming there is both crit-to-hit conversion AND there is still a crit depression against bosses. While that may be true, I am advancing an alternative hypothesis that there never was a crit depression against bosses in the first place, but instead there was always a conversion of 5% of crits into hits when fighting bosses. When we tested crit depression before, we simply observed 5% fewer crits (and 5% more hits). That observation is consistent with both crit depression and hit conversion.
So in short, if the difference is not 10%, there should be some flawed logic in my calculation of crit caps before and after. Which one is it?
Before, the computation was the solution to the equation
(Crit-4.8) + Dodge + Miss + Glance = 100, so crit cap was 100-D-M-G+4.8, in your notation.
Under this theory, the equation is (Crit-4.8) + Dodge + Miss + Glance = 100 - 4.8 - that is, our crit is still reduced by 4.8, and we can't go up to 100% crit, only to 100-4.8, due to the forced hits. Rearranging to your notation, we get that the crit cap is simply 100-D-M-G, for a difference of 4.8. To look at it another way: when your "tooltip" crit, prior to crit depression, gets to 100-D-M-G, that indicates that all attacks, neglecting crit depression, should be crits, dodges, misses, or glances; because of crit depression, 4.8% of attacks would instead be hits, but crit is just as capped.
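To make the two readings concrete, here is a small sketch (with the same illustrative D/M/G values used earlier in the thread) comparing the cap under the old depression-only equation with the cap under the equation above; the gap works out to exactly the 4.8% reduction, not 9.6%:

```python
# Illustrative rates (%) against a boss: dodge, miss, glance.
D, M, G = 5.0, 15.0, 24.0
REDUCTION = 4.8  # crit reduction vs. a boss-level mob (%)

# Old reading: solve (Crit - 4.8) + D + M + G = 100 for Crit.
old_cap = 100 - D - M - G + REDUCTION   # 60.8

# Conversion reading: solve (Crit - 4.8) + D + M + G = 100 - 4.8 for Crit.
new_cap = 100 - D - M - G               # 56.0

print(old_cap, new_cap, old_cap - new_cap)
```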
That said: I'd hasten to point out that this is merely a theory right now - I don't think any comprehensive testing has been done to demonstrate that that's actually how it works. I just don't feel like we understand the problem well enough to comment definitively on what's going on.
So, per my previous post, I was not wholly confident in this theory, largely because warrior testing has indicated some manner of funkiness (particularly at low crit levels), which makes the mechanics there less than wholly understood. Hence, I opted to do some testing on a rogue, given that we have a decent handle on how rogue crit reduction seems to work - Vulajin's initial testing makes it pretty clear that it's a straight 4.8% reduction, even at low levels of crit. So, I hopped on PTR tonight (just in case there are any changes, and to save the cash I spent on un-speccing) and did some testing.
My gear setup had 20 expertise and 0 hit, and I specced out of all hit and expertise talents; thus, while dual-wielding, I have an expected miss rate of 27%, an expected glance rate of 24%, an expected dodge rate of 5.89%, and an expected parry rate of 13.39% (assuming the accepted values of 6.5% base dodge and 14% base parry). Thus, when attacking from the front, the available space for crit and miss is 29.8%. Under the old theory of crit reduction, I'd need at least 4.8% more than this to be capped, or 34.6%. And there is some uncertainty in the glance/parry rate numbers, so to be safe it would be good to have tooltip crit several percent above this. Well, in the test gear, my tooltip crit rate is 43.77%, a good 9% over the amount needed to cap even by the old theory. As such, if I exhibit any regular hits at all while attacking from the front, we know that there's some minimum "hit" rate - and indeed, in 6250 swings, I got 308 hits. Hence, the theory that there's a minimum hit rate seems pretty plausible.
However, there are multiple possible ways that such a minimum could be implemented - the next step is to test the conjecture that the minimum hit rate is precisely the 4.8% crit of crit depression we suffer; if this new theory is correct, we expect our hit table to be as follows:
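Concretely, the expected table under the new theory can be computed from the rates quoted above, with crit filling whatever the miss/glance/parry/dodge/forced-hit entries leave over (a quick sketch; the 24.92% crit is the 29.72% of available space minus the 4.8% conversion):

```python
SWINGS = 6250  # total swings in this test

# Expected outcome rates (%) for this gear setup, attacking from the
# front while dual-wielding, under the forced-conversion theory.
expected = {
    "miss":   27.0,
    "glance": 24.0,
    "parry":  13.39,
    "dodge":  5.89,
    "hit":    4.8,   # the conjectured forced crit-to-hit conversion
    "crit":   100 - 27.0 - 24.0 - 13.39 - 5.89 - 4.8,  # remainder, 24.92
}

for outcome, pct in expected.items():
    print(f"{outcome:6s} {pct:6.2f}%  expected ~{SWINGS * pct / 100:7.1f} swings")
```

Note the expected hit count of 4.8% of 6250, i.e. 300 swings, against the 308 observed.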
Using these, we can make a table of observed and expected attack counts for each result:
Clearly the agreement is pretty good - but *how* good? Well, fortunately there's an easy statistical test for this - we have 5 degrees of freedom, and a Chi-squared value of 4.263, which works out to a tail probability of 51.22%. Or, in English: assuming our theory is correct, we will get data with at least this much variance just over half the time, and less variance just under half the time. This is what's known in the business as "a stupidly good result". Basically, we couldn't ask for a better match to our theory than what we have. As such, while we certainly can't definitively prove our theory, it's definitely looking pretty good.
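The quoted tail probability is easy to check. For odd degrees of freedom the chi-squared survival function has a closed form, so the standard library suffices (a sketch; `scipy.stats.chi2.sf(4.263, 5)` gives the same value):

```python
import math

def chi2_sf_5df(x):
    """Survival function P(X > x) for a chi-squared variable with
    5 degrees of freedom (closed form available for odd df)."""
    return (math.erfc(math.sqrt(x / 2))
            + math.exp(-x / 2) * math.sqrt(2 * x / math.pi) * (1 + x / 3))

p = chi2_sf_5df(4.263)
print(f"{p:.2%}")  # ~51.2%: data at least this noisy about half the time
```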
As such, the funkiness that warriors have been seeing notwithstanding, it's reasonably probable that rogues, at least, are having crit reduction applied as a forced conversion of 4.8% of our combat table from crits to hits, and that these 4.8% hits cannot be removed through any means - that is, the proposed theory is looking pretty good.
I do think there are still open questions in terms of what happens to warriors at low levels of crit (and similar effects) - but I'm at least feeling a bit more comfortable on the whole about this theory.
My gear setup had 20 expertise and 0 hit, and I specced out of all hit and expertise talents; thus, while dual-wielding, I have an expected miss rate of 27%, an expected glance rate of 24%, an expected dodge rate of 5.89%, and an expected parry rate of 13.39% (assuming the accepted values of 6.5% base dodge and 14% base parry). Thus, when attacking from the front, the available space for crit and miss is 29.8%.
Don't you mean "available space for crit and hit is 29.8%"?
Ignore the italicized part, see edit. The question is, I think, whether those 4.8% forced hits come from our crits or simply exist in the table the same way glancing blows do. In other words, if your crit rate had been exactly 24.92%, would you have observed 4.8% hits or 9.6%? This does matter because it would mean the crit cap in your example is effectively 24.92%, not 29.72% (or as Mavanas put it, 9.6% lower than previously thought).
If your testing answered that, I apologize, because I'm not seeing it.
EDIT: Having re-read Vulajin's testing, the conclusion was that he was not getting any crits with 4.8% crit chance on the paperdoll, so those forced hits do indeed come from our crits. Which means the crit cap is 4.8% lower than previously thought, not 9.6%. In your previous example, the crit cap would be 29.72%.
Last edited by ShadowEric : 11/30/09 at 12:45 PM.
Reason: Re-read old testing
Okay, to clarify: yes, I meant crit and hit, not crit and miss. I'm not entirely sure what you're trying to say in the second part, so let me just review the logic to be clear what we think is happening.
Based on accepted values and testing, we have 27% miss, 24% glance, 13.39% parry, and 5.89% dodge. The total amount of hit table taken up by these options is 70.28%. Thus, there is 29.72% remaining for the last two outcomes, crit and hit. Thus, if no crit reduction existed, and there was no minimum on the number of hits we get, we'd expect all 29.72% of the table to be crits. Since tooltip crit rate is 43.77%, even if crit reduction were as high as 14.05%, we'd still expect to see no plain hits. Since we *do* see hits, we are forced to conclude that we *are* crit-capped, but are still seeing some hits anyway.
The obvious followup question is "how many" - well, the data (which matches hellord's warrior testing) is that there's around 5%. And we *know* that rogues experience an across-the-board crit reduction of 4.8% against boss level mobs. And it seems mighty suspicious that the number of hits we're seeing is very very close to our crit reduction.
Thus, the conclusion we draw is that crit reduction is implemented by a forced conversion of 4.8% crits to hits, which cannot be removed. Or, phrased alternately: the crit reduction is applied *after* crit capping is considered - which is perhaps the simplest way of seeing what the numbers should be. The logic then looks as follows: we've used up 70.28% on glance, miss, dodge, and parry. Thus, our crit is capped at 29.72%, and we have enough crit to hit that. And then, after our crit has been reduced from 43.77 to 29.72 due to crit cap, then and only then does crit reduction kick in and convert 4.8% crits to 4.8% hits, leaving 24.92% crit and 4.8% hit - which gives us exactly the hit table listed above.
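The order of operations described above can be sketched as a tiny function, using the rates from this test; the key point is that capping happens before the 4.8% conversion:

```python
REDUCTION = 4.8  # forced crit-to-hit conversion vs. a boss (%)

def final_table(tooltip_crit, miss, glance, parry, dodge):
    """Apply crit capping first, then the crit-to-hit conversion."""
    cap = 100 - miss - glance - parry - dodge
    crit = min(tooltip_crit, cap)   # step 1: cap against the remaining table
    crit -= REDUCTION               # step 2: convert 4.8% of crit to hit
    hit = cap - crit                # whatever crit doesn't fill becomes hit
    return crit, hit

crit, hit = final_table(43.77, miss=27.0, glance=24.0, parry=13.39, dodge=5.89)
print(round(crit, 2), round(hit, 2))  # 24.92 4.8
```

Feeding in a 24.92% tooltip crit instead gives 20.12% crit and 9.6% hit, which matches the gearing discussion in the thread.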
I was trying to figure out if we should gear towards those 29.72% to hit the cap, or simply 24.92% since 4.8% won't crit anyway. It turns out that those 4.8% forced hits are converted from our crits; they're not simply always there on the attack table, so we do need to gear all the way to 29.72%. If we only geared to 24.92%, we'd see 4.8% hit plus another 4.8% from the forced conversion, for a total of 9.6% hits.
This, I think, is Mavanas' misunderstanding, above, which I was trying to clarify.
Correct. If we geared to 24.92, we wouldn't be reduced at all by the crit cap and would still be at 24.92 - and *then* the 4.8% reduction kicks in and drops our actual crit to 20.12.
In short: the crit cap would now appear to occur when your "tooltip" crit is 100-24-dodge-miss. I say "tooltip" because it does need to include crit-increasing debuffs on the boss, so it won't be the number actually in your tooltip - the point is that it's the value without worrying about any crit reduction. Once we get to that point, we do start losing crit to the crit-capping check, which is performed "before" the 4.8% crit reduction. So, say, if our miss is 10% and we're at the dodge cap for expertise, when our "tooltip" crit reaches 100-10-24 = 66%, we are crit capped, even though this only works out to 61.2% *actual* crit due to crit reduction.
It's fairly concrete that crit reduction is due to the difference in levels between the attacker and the boss-level mob. It seems that the absolute test between crit depression and an inherent combat table issue would be to test on level 80/81/82 targets. Ferals cannot get to the crit required to overcome the lack of glances on the 80 dummy (with Live gear). Can rogues?
Do players glance on level 81 mobs, or just bosses? Is there a good place to find only level 81 mobs to test against them?
I think I recall glances on equal level mobs. If I am remembering correctly, I did a little testing a while back and decided that it was probably 6% vs 80, 12% vs 81, and 18% vs 82. I didn't spend a lot of time on it, but that appeared to be the trend.
Assuming one can hit the crit cap on a level 80 target dummy with only 6% glance, 5% parry, and 5% dodge, it would determine whether this 5% hit is ever-present or based on boss level.
Such testing would almost certainly need to involve a dual-wielding class, as the extra 19% miss rate is going to be pretty important for getting there; even so, even assuming no hit or expertise, it's going to take around 60% tooltip crit to get there, which I suspect isn't really feasible for anyone unbuffed; however, with raid buffs, such a thing might be possible to put together.
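The "around 60%" figure follows directly from the assumed table entries against an equal-level target (a sketch; the 5% base miss, 6% glance, and 5% dodge/parry values vs. a level 80 mob are the assumptions being discussed, not established results):

```python
# Rough attack-table budget vs. a level-80 target dummy, dual-wielding,
# no hit or expertise, attacking from the front.
miss   = 5.0 + 19.0  # assumed 5% base miss + 19% dual-wield penalty
glance = 6.0         # assumed glance rate vs. equal-level mobs
dodge  = 5.0         # assumed dodge rate
parry  = 5.0         # assumed parry rate

tooltip_crit_needed = 100 - miss - glance - dodge - parry
print(tooltip_crit_needed)  # 60.0
```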