Technically, I am working on an optimal control problem. However, through some trickery, I managed to eliminate the dynamics. What I am left with is the following minimization problem:
$$ \min_{g \in L^1} \int_0^\infty \big[\alpha(sm) - g \left( s \cdot \|g\|_1 \right) \big]^2 {\rm d} s $$
Here, $\|g\|_1 := \int_0^\infty |g(s)| \, {\rm d}s$, and $m$ is simply a constant. Further, $\alpha$ is a function that can be assumed to be arbitrarily smooth, though at minimum $\alpha \in L^1 := \{f : \|f\|_1 < \infty \}$. I have the following questions:
How do you solve this problem?
Does this problem fall into a larger class of problems that has a name? (Quite frankly, I have never seen this kind of implicit dependence before.)
Is there literature that discusses these kinds of problems?
How hard is this problem to solve numerically? Just a ballpark, something between "takes a minute on your laptop" and "ask your university for more compute".
Naturally, one would hope that the equation $$ \alpha(sm) = g\left(s \cdot \|g\|_1\right) $$ is solvable for $g$. In my mind this cannot be done for general $\alpha \in L^1$. However, if somebody can prove me wrong here, that would also be useful, since it would mean that I did something significantly wrong in my derivation.
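For what it's worth, here is the self-consistency check behind my doubt (assuming for simplicity that $g, \alpha \ge 0$ and $m > 0$; please double-check me). Suppose $g$ solves the equation and set $c := \|g\|_1 > 0$. Substituting $u = sc$ gives $g(u) = \alpha(um/c)$ for all $u \ge 0$, and hence
$$ c = \int_0^\infty g(u) \, {\rm d}u = \int_0^\infty \alpha\!\left(\frac{um}{c}\right) {\rm d}u = \frac{c}{m} \, \|\alpha\|_1, $$
which forces $\|\alpha\|_1 = m$. Conversely, if $\|\alpha\|_1 = m$, then $g(u) := \alpha(um/c)$ satisfies the equation for every $c > 0$. So, unless I slipped up, the pointwise equation is solvable exactly when $\|\alpha\|_1 = m$, i.e. not for general $\alpha \in L^1$.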
By "these kinds of problems" I mean minimization problems over function spaces that also feature the implicit relation $g(s \cdot \|g\|_1)$.
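Regarding the numerical question, here is a minimal discretization sketch in Python (all specifics below — the exponential $\alpha$, $m = 1$, the truncation point $S$, and the grid size — are my own illustrative assumptions, not part of the problem): truncate $[0, \infty)$ at $S$, represent $g$ by its values on a grid, approximate $\|g\|_1$ by quadrature, evaluate $g(s \cdot \|g\|_1)$ by interpolation, and hand the squared residual to a generic optimizer.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.interpolate import interp1d
from scipy.optimize import minimize

# Illustrative choices (assumptions): with ||alpha||_1 = 1 = m an exact
# solution of alpha(sm) = g(s ||g||_1) exists, namely g = alpha itself.
m = 1.0
alpha = lambda x: np.exp(-x)

S, n = 20.0, 200                 # truncate [0, infinity) at S, n grid points
s = np.linspace(0.0, S, n)

def objective(gv):
    # ||g||_1 via the trapezoidal rule on the truncated domain
    norm1 = trapezoid(np.abs(gv), s)
    # evaluate g at the warped points s * ||g||_1 (zero beyond the grid)
    g_at = interp1d(s, gv, bounds_error=False, fill_value=0.0)
    resid = alpha(s * m) - g_at(s * norm1)
    # discretized objective: integral of the squared residual
    return trapezoid(resid**2, s)

g0 = np.exp(-s)                  # warm start near the known solution
res = minimize(objective, g0, method="L-BFGS-B")
print("final objective:", res.fun)
```

At this scale it should run in seconds on a laptop; the cost grows with the grid size, mainly because the optimizer only sees finite-difference gradients, but it stays firmly in "local laptop" territory. Whether such a naive scheme converges to anything meaningful for a general $\alpha$ is exactly part of my question.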