Hello!
I'm making a 2D pixel art game, and I like the effect created by a low follow speed on the camera, where the player stops moving and the camera slowly comes to rest too. The problem is that the very last stretch of the stopping motion introduces a lot of jitter that is very apparent when you are using pixel art.
Correct me if I'm wrong, but it appears that the camera is programmed to move by increasingly smaller distances the closer it gets to its resting point (with those distances determined by the follow speed). For instance, if the camera is 6 units away from its destination, it might move at 3 units/s; when it's 3 units away, at 1.5 units/s; when it's 1.5 units away, at 0.75 units/s; and so on. Most of this movement looks good! The problem happens when the camera is very close to its destination but still moving by very, very small distances: the camera already looks still, but Unity doesn't quite know where to render each individual pixel, because the pixels are larger than the movement itself.
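To illustrate what I mean: a speed proportional to the remaining distance is an exponential decay, so the camera technically never arrives. Here's a minimal sketch of that behaviour (plain Python just for the math; the names are mine, not Adventure Creator's actual code):

```python
# Sketch of the behaviour described above: moving at a speed proportional
# to the remaining distance means each frame covers a fixed fraction of
# what's left, so the camera never quite reaches its destination.

def step(position, target, follow_speed, dt):
    """Advance one frame, moving at a speed proportional to remaining distance."""
    return position + (target - position) * follow_speed * dt

dt = 1.0 / 60.0          # 60 FPS timestep
pixel_size = 1.0 / 16.0  # world-space size of one pixel at 16 pixels-per-unit
pos, target = 6.0, 0.0

for frame in range(600):  # simulate 10 seconds
    prev = pos
    pos = step(pos, target, follow_speed=0.5, dt=dt)

# After 10 seconds the camera is well under a pixel from its destination,
# yet still inching along in tiny sub-pixel steps: the source of the jitter.
```

After the simulated 10 seconds, the remaining gap is smaller than a pixel but still nonzero, and the per-frame movement is a small fraction of a pixel, which is exactly the regime where the rendering looks jittery.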
I was wondering whether there might be an easy solution for this? For instance, a "snapping distance" value in the editor. The camera would slowly move toward its destination (say, x=0, y=0) like it always has, but once it came within a user-defined distance (say, 0.1 units on each axis), it would snap to the destination instantly instead of crawling slowly between those two points.
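As a sketch of that idea (again plain Python for the math; in Unity this logic would sit in the camera's movement update, and `snap_distance` is a name I made up):

```python
def move_camera(position, target, follow_speed, dt, snap_distance=0.1):
    """Ease toward the target as usual, but snap the final stretch.

    Once the remaining distance falls within snap_distance, jump straight
    to the target instead of making the sub-pixel moves that cause jitter.
    """
    if abs(target - position) <= snap_distance:
        return target
    return position + (target - position) * follow_speed * dt
```

So a camera 0.05 units from its destination would land exactly on it in one frame, while a camera 6 units away would ease in exactly as it does now.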
Comments
The issue you're describing is common in operations that use Unity's Lerp function. For some operations, such as character movement, AC uses a built-in lerp process that aims to reduce this behaviour. I'll see if it can be similarly applied to 2D cameras.
Thanks for the "stepped" suggestion. I'll consider it.
To answer your question, my camera is Orthographic.