Raycasts are extremely versatile. With a raycast you can check whether something is under the mouse, whether anything lies in the path of a projectile, or just about anything else involving one thing being in the path of another. They are also among the most computationally expensive queries you can make. But how expensive are they, really? Let's put that cost into perspective by looking at the basics of raycasting and how often it already happens in a typical scene.
What exactly is a raycast, how does it work, and why is it so expensive? A raycast has an origin and a direction, and is literally cast, or thrown if you will, into the scene. The process essentially involves walking through every object and testing every vertex, edge, and face for an intersection. Even with rigorous optimization this can mean checking against hundreds of elements, and that is what makes it computationally expensive.
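To make the per-element cost concrete, here is a minimal sketch of the kind of test a raycaster runs against every triangle in its path. This uses the standard Möller–Trumbore ray-triangle algorithm; the function name and tuple-based vectors are illustrative, not from any particular engine.

```python
def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the distance t along the ray to the triangle, or None on a miss."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, edge2)
    det = dot(edge1, h)
    if abs(det) < eps:               # ray is parallel to the triangle's plane
        return None
    inv_det = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, h) * inv_det          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, edge1)
    v = dot(direction, q) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(edge2, q) * inv_det      # distance along the ray
    return t if t > eps else None

# A ray fired down the z-axis hits a triangle sitting in the z = 5 plane:
hit = ray_triangle_intersect((0, 0, 0), (0, 0, 1),
                             (-1, -1, 5), (1, -1, 5), (0, 1, 5))  # t = 5.0
```

Multiply those few dozen arithmetic operations by hundreds of candidate triangles per cast and the cost of a single raycast becomes clear.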
How it's Used
While a single raycast is relatively expensive, it is also exactly how the scene is rendered: one ray per pixel. On my desktop that means 1680 x 1050 = 1,764,000 raycasts per frame just to display what is going on. On my tablet it's 1920 x 1200 = 2,304,000 raycasts per frame, and 960 x 540 = 518,400 on my phone. So while it is an expensive calculation, remember that it already happens half a million to two million times per frame just to render; doing it another 30 or so times costs no more than a 6 x 5 grid of extra pixels.
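The arithmetic above is easy to verify, and it makes the point sharply. A quick sketch, using the same resolutions quoted in the text:

```python
# Rays per frame at one ray per pixel, for the resolutions quoted above.
resolutions = {"desktop": (1680, 1050), "tablet": (1920, 1200), "phone": (960, 540)}
rays_per_frame = {name: w * h for name, (w, h) in resolutions.items()}
# {'desktop': 1764000, 'tablet': 2304000, 'phone': 518400}

# 30 extra gameplay raycasts, as a fraction of the desktop rendering work:
overhead = 30 / rays_per_frame["desktop"]  # about 0.0017% of the frame's rays
```

In other words, a handful of gameplay raycasts disappears into the rounding error of what rendering already does.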