The problem states that a particle moves in the XY plane, starting from the origin and reaching the point \((a \sin \theta, a \cos \theta)\). We need to find the distance moved by the particle.
Since no path is specified, the distance moved is the straight-line distance between the start and end points. For two points \((x_1, y_1)\) and \((x_2, y_2)\) in the plane, this is given by the distance formula:
\[ d = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2} \]
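As a quick sanity check, the distance formula can be written directly in code (a minimal sketch; the function name and test points are illustrative):

```python
import math

def distance(p1, p2):
    """Straight-line distance between two points (x1, y1) and (x2, y2)."""
    x1, y1 = p1
    x2, y2 = p2
    return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

# A familiar 3-4-5 right triangle confirms the formula:
print(distance((0.0, 0.0), (3.0, 4.0)))  # 5.0
```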
Since the particle starts from the origin \((0, 0)\) and reaches the point \((a \sin \theta, a \cos \theta)\), we substitute these coordinates into the distance formula:
\[ d = \sqrt{(a \sin \theta - 0)^2 + (a \cos \theta - 0)^2} \]
This simplifies to:
\[ d = \sqrt{(a \sin \theta)^2 + (a \cos \theta)^2} \]
Using the Pythagorean identity \((\sin \theta)^2 + (\cos \theta)^2 = 1\):
\[ d = \sqrt{a^2 (\sin^2 \theta + \cos^2 \theta)} \]
\[ d = \sqrt{a^2 \cdot 1} \]
\[ d = \sqrt{a^2} = |a| \]
Assuming \(a > 0\), as is standard for a length parameter:
\[ d = a \]
Therefore, the distance moved by the particle is \(a\).
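The result can be verified numerically: for any angle \(\theta\), the point \((a \sin \theta, a \cos \theta)\) should lie at distance \(a\) from the origin. A short check (the value of `a` and the test angles are arbitrary choices):

```python
import math

a = 2.5  # example positive value of a
for theta in (0.0, math.pi / 6, 1.0, 2.0, math.pi):
    x, y = a * math.sin(theta), a * math.cos(theta)
    d = math.hypot(x, y)  # distance from the origin (0, 0)
    assert math.isclose(d, a), (theta, d)
print("distance equals a for all tested angles")
```

This reflects the geometric fact that \((a \sin \theta, a \cos \theta)\) always lies on a circle of radius \(a\) centered at the origin.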