Measuring distance between object and touch on screen
Hi there, I'm using the Pythagorean theorem to calculate the distance between where a player touches the screen and an object on the screen, i.e. a² + b² = c². This works fine on a scene that is the same size as the screen.
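In expression terms, my rule looks something like this (a rough sketch of my setup, using GameSalad's sqrt and pow functions with the attributes I describe below):

sqrt(pow(self.position.x - game.mouse.position.x, 2) + pow(self.position.y - game.mouse.position.y, 2))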
However, on a scene larger than the screen, things go wrong. Let's use the x co-ordinate to illustrate the problem. I'm using self.position.x to get the location of the object - this keeps increasing as it moves across the scene. And I'm using game.mouse.position.x to get the location of the touch - this stays in screen co-ordinates even while the scene scrolls. Therefore, as the object moves right, the calculated distance between these two points grows even when the touch is right on top of it. For example, an object at scene x = 900 under a camera scrolled 600 pixels appears at screen x = 300, so a touch directly on it still reports a distance of 600 instead of 0.
I think I need to change one of these two to mirror the other. But I'm not sure how. Any suggestions?
Thanks in advance,
James
Comments
There's a function to do exactly what you're doing - magnitude. It'll tell you the distance between two points.
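For example, with the same attributes as above (magnitude(a, b) just returns sqrt(a² + b²)):

magnitude(self.position.x - game.mouse.position.x, self.position.y - game.mouse.position.y)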
You need to include the camera offset as well if you're moving around the scene.
camera.origin.x + touches.touch1.x
etc.
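Putting the two together, something like this should give the true scene-space distance (a sketch assuming an unzoomed camera - substitute whichever touch you're actually tracking for touch1):

magnitude(self.position.x - (camera.origin.x + game.touches.touch1.x), self.position.y - (camera.origin.y + game.touches.touch1.y))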
Brilliant, thanks Armelline!
One further question... how do I calculate the camera offset? I can't figure that out.
Very crude demo.
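To answer the offset question: you don't calculate it, you read it. The scene's Camera Origin attribute holds how far the camera's bottom-left corner has scrolled from the scene's origin, so (assuming a stock scene - and note that an actor instance usually has to be unlocked in the scene editor before it can see scene attributes) the x offset is simply:

scene.Camera.Origin.X

That's the same value camera.origin.x refers to in the expression above.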