Let \(f(x) = \frac{1}{2}\left[g(x) + g(2-x)\right]\). If \(g'\left(\frac{1}{2}\right) = g'\left(\frac{3}{2}\right)\) and \(f'\left(\frac{1}{2}\right) = f'\left(\frac{3}{2}\right)\), then
(1) \(f''(x) = 0\) has exactly one root in (0, 1)
(2) \(f''(x) = 0\) has no root in (0, 1)
(3) \(f''(x) = 0\) has at least two roots in (0, 2)
(4) \(f''(x) = 0\) has 3 roots in (0, 2)
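A sketch of the standard argument (assuming \(g\) is twice differentiable on \((0, 2)\), which the question leaves implicit): differentiating \(f\) and evaluating at \(x = \frac{1}{2}, 1, \frac{3}{2}\) produces three zeros of \(f'\), and Rolle's theorem applied to \(f'\) on the two resulting subintervals yields option (3).

```latex
\begin{align*}
f(x) &= \tfrac{1}{2}\left[g(x) + g(2-x)\right]
  &&\Rightarrow\quad f'(x) = \tfrac{1}{2}\left[g'(x) - g'(2-x)\right].\\[4pt]
f'(1) &= \tfrac{1}{2}\left[g'(1) - g'(1)\right] = 0
  && \text{(always, by symmetry about } x = 1\text{)},\\
f'\!\left(\tfrac{1}{2}\right) &= \tfrac{1}{2}\left[g'\!\left(\tfrac{1}{2}\right) - g'\!\left(\tfrac{3}{2}\right)\right] = 0
  && \text{(by the hypothesis } g'\!\left(\tfrac{1}{2}\right) = g'\!\left(\tfrac{3}{2}\right)\text{)},\\
f'\!\left(\tfrac{3}{2}\right) &= \tfrac{1}{2}\left[g'\!\left(\tfrac{3}{2}\right) - g'\!\left(\tfrac{1}{2}\right)\right] = 0.
\end{align*}
% f' vanishes at x = 1/2, 1, 3/2. Rolle's theorem on [1/2, 1] and on [1, 3/2]
% gives a root of f'' in (1/2, 1) and another in (1, 3/2):
% hence f''(x) = 0 has at least two roots in (0, 2), i.e. option (3).
```

Note that options (1), (2), and (4) cannot be guaranteed: without further information about \(g\), nothing pins the count of roots of \(f''\) to exactly one, zero, or three, so "at least two" is the only claim that must hold.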