I think you're pissing against the wind here. All sorts of things could happen depending on the functional forms of f and g. Remember that nonlinear functions include trig functions and all other sorts of weirdness.

(1) If x1 is a function of the instrument z1, then x1 is correlated with x2 by construction, since instrument relevance means z1 is correlated with x2, and you're going to have multicollinearity issues in small samples. OLS is still unbiased and consistent in the presence of multicollinearity (if nothing else is wrong with the regression; in your case we still have the endogeneity problem with x2), but the variances of the estimates will be inflated. I don't think anything else bad happens.
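A minimal numpy simulation of point (1), abstracting from the endogeneity in x2 (here everything is exogenous, so the multicollinearity effect is isolated). All the data-generating numbers are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ols(x1, x2, y):
    """OLS of y on a constant, x1, and x2; returns the coefficient vector."""
    X = np.column_stack([np.ones_like(x1), x1, x2])
    return np.linalg.lstsq(X, y, rcond=None)[0]

n, reps = 200, 2000
b1_collinear, b1_clean = [], []
for _ in range(reps):
    z1 = rng.normal(size=n)
    x2 = z1 + 0.3 * rng.normal(size=n)   # relevance: z1 drives x2
    x1 = z1 + 0.2 * rng.normal(size=n)   # x1 is (nearly) a function of z1
    y = 1 + 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)
    b1_collinear.append(fit_ols(x1, x2, y)[1])

    x1_ind = rng.normal(size=n)          # comparison: x1 unrelated to x2
    y_ind = 1 + 0.5 * x1_ind + 0.5 * x2 + rng.normal(size=n)
    b1_clean.append(fit_ols(x1_ind, x2, y_ind)[1])

# Both designs center on the true coefficient 0.5; the collinear design
# has a much larger sampling standard deviation across replications.
print(np.mean(b1_collinear), np.std(b1_collinear))
print(np.mean(b1_clean), np.std(b1_clean))
```

In both designs the estimate of the coefficient on x1 averages out to the truth; the only damage from the collinearity is the much wider spread across replications.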

(2) This is a case of model misspecification. Remember that the conditional expectation function (the regression equation) is exactly linear only in the special case where you have saturated the regression; otherwise, you're imposing a linear approximation on a function that may or may not be truly linear. You've stumbled on a key point in the reduced-form versus structural econometrics debate: reduced-form regressions aren't "atheoretical," since you're imposing a linear functional form. If you estimate y = b0 + b1*x1 when the true model is y = b0 + b1*(x1^2), the former regression just gives the best linear approximation to the effect of x1 on y. Also recall that regression coefficients give the mean effect of x on y (if you want effects at the median or other quantiles, use quantile regression). To summarize, I don't think anything "bad" happens in case (2), since a regression only ever estimates a linear approximation of the average effect of x on y. But note that b1 in the true model and b1 in the estimated regression in my example are not comparable, I think.
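To see point (2) concretely, here is a minimal numpy sketch of the quadratic example (the uniform design and the coefficient values are my own assumptions). When the truth is y = 1 + x1^2 + e, OLS on a linear specification recovers the best-linear-approximation slope Cov(x1, x1^2)/Var(x1), not the structural coefficient:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# True model: y = 1 + x1^2 + e, with x1 ~ Uniform(0, 2).
x1 = rng.uniform(0, 2, size=n)
y = 1 + x1**2 + rng.normal(size=n)

# Misspecified fit: y = b0 + b1*x1.
X = np.column_stack([np.ones(n), x1])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

# For x1 ~ Uniform(0, 2): Cov(x1, x1^2)/Var(x1) = (2/3)/(1/3) = 2,
# so b1 lands near 2, not near the structural coefficient of 1.
print(b0, b1)
```

The fitted slope of roughly 2 is a perfectly good answer to "what is the average linear effect of x1 on y here," which is exactly why the two b1's are not comparable.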

That's the best I could do. Good luck.

- Stata Monkey