I saw this problem and it immediately took me back to my undergrad days, when I took an optimal control class where one of the topics studied is minimizing expressions of this kind by abstracting the problem into a functional ([0], [1]) and then applying various solution methods; here, a combination of Lagrange multipliers and the Euler-Lagrange equation [2].<p>In this particular example, we take the expression to be minimized and add the constraints scaled by multipliers λ1 and λ2 (the constant parts of the constraints don't affect the minimizer, so I drop them). Writing y = f(x), it looks like this: ∫ y^2 + λ1·y + λ2·x·y dx, integrated from 0 to 1. Then we take the integrand and apply the Euler-Lagrange equation: L(x,y) = y^2 + λ1·y + λ2·x·y ⇒ ∂L/∂y = 2y + λ1 + λ2·x = 0. So now we know f(x) is linear (which is why the method in the post works; if the minimizer had been a polynomial of higher degree, the post would have needed to subtract a polynomial in x other than ax+b). Solving for y = -(λ1 + λ2·x)/2 and substituting into the constraints, we only need to integrate linear and quadratic functions (easy) to arrive at a linear system for λ1 and λ2. Solving it gives (λ1, λ2) = (4, -12), and then f(x) = 6x - 2 is the minimizer of the functional. Integrating f(x)^2 gives 4, so for any f(x) satisfying those two conditions, ∫ f(x)^2 dx is greater than or equal to 4, and that lower bound is achieved for f(x) = 6x - 2. (A quick sympy check is sketched after the references.)<p>I didn't mention convexity, which is what lets us conclude that this stationary point is a minimum, i.e. that the inequality goes in that direction, but I leave that to the reader.<p>[0] <a href="https://en.wikipedia.org/wiki/Functional_(mathematics)" rel="nofollow">https://en.wikipedia.org/wiki/Functional_(mathematics)</a><p>[1] <a href="https://en.wikipedia.org/wiki/Functional_derivative" rel="nofollow">https://en.wikipedia.org/wiki/Functional_derivative</a><p>[2] <a href="https://en.wikipedia.org/wiki/Euler%E2%80%93Lagrange_equation" rel="nofollow">https://en.wikipedia.org/wiki/Euler%E2%80%93Lagrange_equation</a>
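<p>For anyone who wants to double-check the arithmetic, here is a minimal sympy sketch. I'm assuming (from the numbers above) that the two conditions are ∫ f(x) dx = 1 and ∫ x·f(x) dx = 1 over [0, 1]; if the post's conditions are different, the two constraint lines below change accordingly:

    # Minimal check of the Euler-Lagrange computation above.
    # Assumed constraints: integral of f = 1 and integral of x*f = 1 on [0, 1].
    import sympy as sp

    x, l1, l2 = sp.symbols('x lambda1 lambda2')

    # Stationarity of y^2 + l1*y + l2*x*y in y gives 2y + l1 + l2*x = 0
    y = -(l1 + l2*x) / 2

    # Impose the two integral constraints and solve the resulting linear system
    eq1 = sp.Eq(sp.integrate(y, (x, 0, 1)), 1)
    eq2 = sp.Eq(sp.integrate(x * y, (x, 0, 1)), 1)
    sol = sp.solve([eq1, eq2], [l1, l2])
    print(sol)                            # {lambda1: 4, lambda2: -12}

    f = sp.expand(y.subs(sol))
    print(f)                              # 6*x - 2
    print(sp.integrate(f**2, (x, 0, 1)))  # 4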