In the example below, I'm trying to explain why you can't multiply the difference of X/Y between two sets by the difference of Y to get back the difference of X. (I've been loosely calling these differences "variances", but they're just Set 1 minus Set 2.) I know it doesn't work, but I can't explain why. Thanks!
| | Set 1 | Set 2 | Difference |
| --- | --- | --- | --- |
| X | 135 | 75 | 60 |
| Y | 15 | 10 | 5 |
| X/Y | 9.0 | 7.5 | 1.5 |
| Y × (X/Y) | 135 | 75 | 7.5 (= 5 × 1.5, not 135 − 75) |
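A quick numeric check (plain Python, using the numbers from the table) makes the failure concrete: the difference of a product expands by the product rule, with cross terms, so it is not the product of the two differences.

```python
# Using the table's values: r = X/Y is the ratio for each set.
y1, y2 = 15, 10
r1, r2 = 9.0, 7.5

dy = y1 - y2   # 5
dr = r1 - r2   # 1.5

# Naive shortcut: multiply the two differences.
print(dy * dr)            # 7.5  -- does NOT recover the difference of X

# Correct: take the difference of the products.
print(y1 * r1 - y2 * r2)  # 60.0 -- the actual difference of X

# Why they disagree: y1*r1 - y2*r2 = y1*dr + r2*dy (product rule),
# so the cross terms y1*dr and r2*dy carry the full set values,
# not just the differences.
print(y1 * dr + r2 * dy)  # 22.5 + 37.5 = 60.0
```

In other words, the shortcut would only be valid if both cross terms vanished, which happens only when one of the sets is all zeros.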