Unity basics: single-touch and multi-touch input

A summary of touch events.

The simplest mouse-click events also work as single-touch input on Android and iOS:

OnMouseDown: triggered the moment the mouse button is pressed on the object

OnMouseDrag: triggered every frame while the mouse button is held down on the object

OnMouseEnter: triggered when the mouse first moves over the object

OnMouseUp: triggered when the mouse button is released after a press

and so on. Note that click events triggered on UI elements may be swallowed by the click or drag events that the UI provides itself. The earlier post on app sliding pages and nested sliding lists (https://blog.csdn.net/weixin_45081191/article/details/128456026?spm=1001.2014.3001.5502) shows a concrete implementation. A minimal sketch of these callbacks follows.
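Below is a minimal sketch of the mouse callbacks listed above, assuming the script is attached to a GameObject that has a Collider (the OnMouse* messages only fire on colliders); the class name and log text are illustrative, not from the original post.

```csharp
using UnityEngine;

// Minimal sketch: attach to a GameObject with a Collider.
// Each OnMouse* callback also fires for single-finger touches on Android/iOS.
public class MouseTouchDemo : MonoBehaviour
{
    void OnMouseDown()
    {
        // Fired the moment the pointer is pressed on this object.
        Debug.Log("OnMouseDown at " + Input.mousePosition);
    }

    void OnMouseDrag()
    {
        // Fired every frame while the pointer stays pressed on this object.
        Debug.Log("OnMouseDrag at " + Input.mousePosition);
    }

    void OnMouseEnter()
    {
        // Fired once when the pointer first moves over this object.
        Debug.Log("OnMouseEnter");
    }

    void OnMouseUp()
    {
        // Fired when the pointer is released after pressing on this object.
        Debug.Log("OnMouseUp");
    }
}
```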

Multi-touch interactions, however, such as two-finger pinch zoom, two-finger shortcuts, and other extended gestures, require Unity's Input class, which exposes an array of all the points the player is currently touching.

All current touch points can be obtained through the Input.touches array.
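For instance, a minimal sketch that loops over Input.touches each frame and logs every active finger (the class name and log format are just examples):

```csharp
using UnityEngine;

// Minimal sketch: logs the position of every active touch each frame.
public class TouchPositionLogger : MonoBehaviour
{
    void Update()
    {
        // Input.touches holds one Touch struct per finger on the screen.
        foreach (Touch touch in Input.touches)
        {
            // touch.fingerId stays constant for the lifetime of that
            // finger's contact, so it can be used to track a gesture.
            Debug.Log("Finger " + touch.fingerId + " at " + touch.position);
        }
    }
}
```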

Touch.position gives each touch point's screen position; by tracking how those positions change, you can recognize the player's gestures.
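As a sketch of such a position-based gesture, here is the two-finger pinch zoom mentioned earlier, assuming an orthographic camera; the PinchZoom class, the zoomSpeed value, and the 0.1 size floor are hypothetical choices, not from the original post.

```csharp
using UnityEngine;

// Sketch of a two-finger pinch zoom for an orthographic camera.
public class PinchZoom : MonoBehaviour
{
    public Camera targetCamera;      // assign in the Inspector
    public float zoomSpeed = 0.01f;  // hypothetical tuning value

    void Update()
    {
        if (Input.touchCount != 2)
            return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);

        // Reconstruct last frame's positions from the per-frame deltas.
        Vector2 prev0 = t0.position - t0.deltaPosition;
        Vector2 prev1 = t1.position - t1.deltaPosition;

        // If the fingers moved apart the gap grows (zoom in);
        // if they moved together it shrinks (zoom out).
        float prevGap = (prev0 - prev1).magnitude;
        float currGap = (t0.position - t1.position).magnitude;
        float delta = currGap - prevGap;

        targetCamera.orthographicSize =
            Mathf.Max(0.1f, targetCamera.orthographicSize - delta * zoomSpeed);
    }
}
```

For a perspective camera you would adjust fieldOfView, or move the camera along its forward axis, instead of orthographicSize.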

The values of the TouchPhase enum are interpreted as follows:

Began: triggered the moment a finger first touches the screen

Moved: triggered while a finger moves across the screen

Stationary: triggered while a finger is touching the screen but has not moved

Ended: triggered when a finger is lifted from the screen

Canceled: the system cancelled tracking of the touch, for example when the device is raised to the user's face or more than five simultaneous touches occur
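A minimal sketch that branches on each of these phases; the class name and log messages are illustrative:

```csharp
using UnityEngine;

// Minimal sketch: reacts to each TouchPhase value described above.
public class TouchPhaseDemo : MonoBehaviour
{
    void Update()
    {
        foreach (Touch touch in Input.touches)
        {
            switch (touch.phase)
            {
                case TouchPhase.Began:
                    Debug.Log("Finger down at " + touch.position);
                    break;
                case TouchPhase.Moved:
                    Debug.Log("Finger moved by " + touch.deltaPosition);
                    break;
                case TouchPhase.Stationary:
                    // Finger is touching but has not moved this frame.
                    break;
                case TouchPhase.Ended:
                    Debug.Log("Finger lifted at " + touch.position);
                    break;
                case TouchPhase.Canceled:
                    Debug.Log("Touch tracking was cancelled by the system");
                    break;
            }
        }
    }
}
```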
