It would be nice to be able to get X/Y coordinates reported when you touch-hold a component and move your finger/stylus (e.g. for dragging objects, drawing, 2D sliders/joypads ...)
The display sports a "sendxy" command which could be used, but it would be nicer if this could be handled without an MCU and confined to the area of a component.
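For reference, the MCU-side round trip the sendxy route requires today looks roughly like the sketch below. This is only a minimal C sketch, assuming the 0x67/0x68 touch-coordinate frame documented in the Instruction Set (frame start, X high/low byte, Y high/low byte, press/release flag, 0xFF 0xFF 0xFF terminator); the exact framing should be double-checked before relying on it:

```c
/* Minimal sketch: MCU-side decoding of the sendxy touch-coordinate return.
 * Assumes the documented frame: 0x67 (or 0x68 while asleep), X-high, X-low,
 * Y-high, Y-low, press/release flag, then the 0xFF 0xFF 0xFF terminator. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint16_t x, y;
    uint8_t  pressed;   /* 0x01 = press, 0x00 = release */
} TouchReport;

/* Feed received bytes one at a time; returns 1 when a full report is decoded. */
static int feed_byte(uint8_t b, TouchReport *out)
{
    static uint8_t buf[9];
    static int     n = 0;

    if (n == 0 && b != 0x67 && b != 0x68)   /* wait for a frame start */
        return 0;

    buf[n++] = b;
    if (n < 9)
        return 0;
    n = 0;

    /* Require the 0xFF 0xFF 0xFF terminator before trusting the payload. */
    if (buf[6] != 0xFF || buf[7] != 0xFF || buf[8] != 0xFF)
        return 0;

    out->x       = (uint16_t)(buf[1] << 8) | buf[2];
    out->y       = (uint16_t)(buf[3] << 8) | buf[4];
    out->pressed = buf[5];
    return 1;
}

int main(void)
{
    /* Example frame from the Instruction Set: touch press at X=122, Y=30. */
    const uint8_t sample[] = {0x67, 0x00, 0x7A, 0x00, 0x1E, 0x01, 0xFF, 0xFF, 0xFF};
    TouchReport r;

    for (size_t i = 0; i < sizeof sample; i++)
        if (feed_byte(sample[i], &r))
            printf("touch x=%u y=%u pressed=%u\n", r.x, r.y, r.pressed);
    return 0;
}
```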
I could imagine some reporting modes for this:
1. Report X/Y (relative to first touch - must allow negative readings)
a. as quickly as possible
b. only every n milliseconds
c. only every n pixels travelled
2. Report X/Y (relative to a given point in the component, default 0/0 - must allow negative readings)
a., b., c. as above
Mode 1 could be managed via internal code in the Touch Press Event by setting anchorX/anchorY, if the touch coordinates could easily be retrieved via internal code.
It's a pain in the ass to send x/y out over serial and then send it back to the Nextion just so internal code can process the touch position.
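To illustrate what that external workaround means for mode 1c, here is a minimal C sketch built on top of the decoded sendxy reports: coordinates relative to the first touch, reported only every n pixels travelled. nxt_send() and MIN_TRAVEL are hypothetical placeholders; in real code nxt_send() would write the values back to a Nextion variable or component over serial.

```c
/* Minimal sketch of reporting mode 1c done externally on the MCU:
 * coordinates relative to the first touch, emitted only after the finger
 * has travelled at least MIN_TRAVEL pixels since the last report. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define MIN_TRAVEL 5   /* "every n pixels travelled" - pick n to taste */

static int16_t anchor_x, anchor_y;     /* first-touch position          */
static int16_t last_dx, last_dy;       /* last reported relative values */
static int     tracking;

/* Placeholder: in real code this would push the values back to the
 * Nextion over serial; here it just prints them. */
static void nxt_send(int16_t dx, int16_t dy)
{
    printf("dx=%d dy=%d\n", dx, dy);
}

/* Call this with every decoded sendxy report (x, y, pressed). */
void on_touch(uint16_t x, uint16_t y, uint8_t pressed)
{
    if (!pressed) { tracking = 0; return; }        /* release ends the drag   */

    if (!tracking) {                               /* first touch: set anchor */
        anchor_x = (int16_t)x;
        anchor_y = (int16_t)y;
        last_dx = last_dy = 0;
        tracking = 1;
        nxt_send(0, 0);
        return;
    }

    int16_t dx = (int16_t)x - anchor_x;            /* may go negative */
    int16_t dy = (int16_t)y - anchor_y;

    /* Only report when we have moved at least MIN_TRAVEL pixels on either
     * axis since the last report (keeps the math integer-only). */
    if (abs(dx - last_dx) >= MIN_TRAVEL || abs(dy - last_dy) >= MIN_TRAVEL) {
        last_dx = dx;
        last_dy = dy;
        nxt_send(dx, dy);
    }
}

int main(void)
{
    /* Simulated drag: press at (100,100), move right, release. */
    on_touch(100, 100, 1);
    on_touch(102, 100, 1);   /* below threshold: no report     */
    on_touch(108, 101, 1);   /* >= 5 px travelled: reports 8,1 */
    on_touch(108, 101, 0);   /* release                        */
    return 0;
}
```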
I am now reviewing all of the Feature Requests; this will take some time, so patience please.
I think perhaps making xc, yc system variables would do it - users could then code as desired.
concept - carried forward