^ yeah I think that works.
Express the Beta distribution in terms of standard normal variables:

If Z1, ..., Zn are iid N(0,1), show that B = (Z1 + Z2 + ... + Zn)^2 / (n(Z1^2 + Z2^2 + ... + Zn^2)) ~ Beta(1/2, (n-1)/2).

Check Johnston and Johnston.

OP here. Thank you very much for your proposed solution!
I am trying now (with books, and with friends who are better at matrix algebra than me) to understand what you are proposing...
I think what you have in mind is the so-called Helmert transformation; see, e.g., Rao (1973), Linear Statistical Inference and Its Applications, p. 183.
^ I am aware of the relationship you are mentioning, and obviously I read the Wikipedia article before I asked the question.
Independence is the missing piece in this whole story, that is: if Chi2(a) and Chi2(b) are INDEPENDENT, then B = Chi2(a)/(Chi2(a) + Chi2(b)) ~ Beta(a/2, b/2).
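That Chi2-to-Beta fact is easy to check by simulation; here's a minimal Monte Carlo sketch in numpy (the degrees of freedom a = 3, b = 5 are arbitrary example values, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 3, 5          # example degrees of freedom (arbitrary choice)
N = 200_000          # Monte Carlo sample size

X = rng.chisquare(a, N)   # Chi2(a)
Y = rng.chisquare(b, N)   # Chi2(b), independent of X
B = X / (X + Y)           # should be Beta(a/2, b/2)

# Beta(p, q) has mean p/(p+q) and variance pq/((p+q)^2 (p+q+1)).
p, q = a / 2, b / 2
print(B.mean())  # close to p/(p+q) = 0.375
print(B.var())   # close to p*q/((p+q)**2*(p+q+1)) = 0.046875
```

With independent chi-squared inputs the empirical moments match the Beta(3/2, 5/2) moments; the posts below explain why independence is exactly what fails in the naive decomposition.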
Now in the expression above it is easy to see that, upon moving the n in the denominator onto the numerator sum, (Z1 + Z2 + ... + Zn)^2/n ~ Chi2(1).
Then in the denominator, Z1^2 ~ Chi2(1), and Z2^2 + ... + Zn^2 ~ Chi2(n-1).
So we sort of get an expression that looks like a Beta variable, Chi2(1)/(Chi2(1) + Chi2(n-1)); however, there are two problems:
1. The Chi2(1) in the numerator is not the same as the Chi2(1) in the denominator.
2. The Chi2(1) in the numerator is NOT independent of the Chi2(n-1) in the denominator.

^ You can represent that as

Chi2(1)/(Chi2(1) + Chi2(n-1))

by noticing that for an orthonormal matrix Q, QZ is also Normal(0, identity). This rotation doesn't affect the sum of squares, so the denominator is unchanged, but you can select the first row of your orthonormal matrix to be

(1/sqrt(n))*ones

so that the first coordinate of QZ is (Z1 + ... + Zn)/sqrt(n), whose square is independent of the sum of squares of the remaining n-1 coordinates.
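A sketch of that rotation in numpy (completing the fixed first row to an orthonormal matrix via QR is my own choice of construction, not something specified in the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6  # example dimension

# Build an orthonormal Q whose first row is (1/sqrt(n)) * ones:
# put ones/sqrt(n) as the first column of a random matrix,
# orthonormalize the columns with QR, then transpose.
A = rng.normal(size=(n, n))
A[:, 0] = 1.0 / np.sqrt(n)   # unit vector proportional to ones
Q, _ = np.linalg.qr(A)       # columns of Q are orthonormal
Q = Q.T                      # now the ROWS are orthonormal
if Q[0, 0] < 0:              # QR may flip the sign of that column...
    Q = -Q                   # ...and -Q is still orthonormal

Z = rng.normal(size=n)
W = Q @ Z

print(np.allclose(Q @ Q.T, np.eye(n)))          # True: Q is orthonormal
print(np.allclose(W @ W, Z @ Z))                # True: rotation preserves the sum of squares
print(np.allclose(W[0], Z.sum() / np.sqrt(n)))  # True: first coordinate is the scaled sum
```

The last two checks are exactly the two facts used in the argument: the denominator is unchanged under Q, and the numerator becomes the square of a single coordinate of QZ.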
^ It's pretty straightforward, you don't need to look up references or anything. You can find an orthonormal matrix Q such that the first coordinate of Q(X) is (X_1 + ... + X_n)/sqrt(n); then your expression is just Q(X)_1^2/|X|^2 = Q(X)_1^2/|Q(X)|^2. Or if you set Y = Q(X), you have Y_1^2/|Y|^2 with Y = (Y_1, ..., Y_n) and all coordinates iid standard normal.
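Putting it all together, the original claim B = (Z1 + ... + Zn)^2 / (n(Z1^2 + ... + Zn^2)) ~ Beta(1/2, (n-1)/2) can be checked end to end by simulation (a numpy sketch; n = 5 is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 5, 200_000  # dimension and Monte Carlo sample size (example values)

Z = rng.normal(size=(N, n))
# B = (Z1 + ... + Zn)^2 / (n * (Z1^2 + ... + Zn^2)), one value per row
B = Z.sum(axis=1) ** 2 / (n * (Z ** 2).sum(axis=1))

# Beta(1/2, (n-1)/2) has mean 1/n and variance 2(n-1)/(n^2 (n+2)).
print(B.mean())  # close to 1/n = 0.2
print(B.var())   # close to 2*(n-1)/(n**2*(n+2)) = 8/175
```

The empirical moments match the Beta(1/2, 2) moments for n = 5, consistent with the rotation argument above.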

OP literally retarded
Normal people have nothing to do with betas

And you are gay.