**I DON'T HAVE THE SOLUTION**

I've got this far:

For f(g(x)) to exist: the range of g must be a subset of the domain of f.

For g(f(x)) to exist: the range of f must be a subset of the domain of g.
I've stated the obvious... Not sure where to go from here.

I did a series of sketches and came up with an answer (sketch not shown), which does not look correct at all.

Well, you actually did the right thing in your first couple of steps. You just need confidence in yourself.

f(x) = a - x, Dom = [2, infinity), Range = (-infinity, a - 2]

g(x) = x^2 + a. Dom = (-infinity, 1], Range = [a, infinity)
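If you want to convince yourself of those ranges, here is a quick numeric check in Python (my own addition, not part of the original working; a = 5 is just an arbitrary illustrative value):

```python
# Quick numeric sanity check of the stated domains and ranges.
# The value a = 5 is an arbitrary choice for illustration only.
a = 5

def f(x):
    # f(x) = a - x, defined here on [2, infinity)
    return a - x

def g(x):
    # g(x) = x^2 + a, defined here on (-infinity, 1]
    return x**2 + a

# Sample finite slices of each (infinite) domain.
f_vals = [f(2 + k / 100) for k in range(1000)]   # x in [2, 12)
g_vals = [g(1 - k / 100) for k in range(1000)]   # x in (-9, 1]

print(max(f_vals))  # f is decreasing, so its max is f(2) = a - 2
print(min(g_vals))  # x^2 is smallest at x = 0, which lies in (-infinity, 1], so min is a
```

The sampled maximum of f matches a - 2 and the sampled minimum of g matches a, which agrees with the ranges written above.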

For f(g(x)) to exist, the range of g(x) must be a subset of the domain of f(x). Hence, [a, infinity) must be contained in [2, infinity). How is that possible? a must be greater than or equal to 2! In mathematical notation,

a ≥ 2.

Similarly, for g(f(x)) to exist, the range of f(x) must be a subset of the domain of g(x). Hence, (-infinity, a - 2] must be contained in (-infinity, 1]. How is that possible? a - 2 must be less than or equal to 1! In maths notation,

a - 2 ≤ 1. Or, in other words,

a ≤ 3.
So now you just find when both of them hold true, i.e. their 'intersection', which will be when

2 ≤ a ≤ 3.
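To double-check that interval, here is a small brute-force sketch in Python (entirely my own addition, not part of the working above): for a given a, it samples each domain and tests whether the inner function's outputs stay inside the outer function's domain.

```python
def compositions_exist(a, samples=None):
    """Check (by sampling) whether f(g(x)) and g(f(x)) are defined
    on the full domains of g and f respectively."""
    if samples is None:
        samples = [k / 10 for k in range(-100, 101)]  # grid on [-10, 10]
    f = lambda x: a - x       # domain [2, infinity)
    g = lambda x: x**2 + a    # domain (-infinity, 1]
    g_dom = [x for x in samples if x <= 1]  # sampled domain of g
    f_dom = [x for x in samples if x >= 2]  # sampled domain of f
    fg_ok = all(g(x) >= 2 for x in g_dom)   # g's outputs must land in f's domain
    gf_ok = all(f(x) <= 1 for x in f_dom)   # f's outputs must land in g's domain
    return fg_ok and gf_ok

for a in [1.5, 2, 2.5, 3, 3.5]:
    print(a, compositions_exist(a))
```

Running this shows the check passing exactly for the values of a between 2 and 3 inclusive, and failing just outside that interval.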

Hope I helped.