The mouse chapter of ncurses programming HOWTO might be useful reading.
The very first subchapter introduces all the mouse event masks: there is mouse position reporting with x, y and z coordinates, one of which can stand in for a single scroll wheel; there are events for up to four buttons, with double- and triple-clicks for each of them; and there are events for using Shift, Control and Alt as modifiers for click events. That is all the information you can get through the ncurses mouse API.
But that document is more than 10 years old - perhaps it's not up to date?
Nope. Searching for the word "touch" in /usr/include/ncurses.h produces several hits, but cross-referencing them with the corresponding man pages shows that all of them (touchwin() and friends) are about deciding whether on-screen content needs refreshing. Nothing touchscreen-specific is available.
Even the latest version, 6.1, adds no touchscreen support.
If you don't find what you're looking for after checking the sources I mentioned above, you can pretty conclusively determine that what you're seeking does not exist.
This is just my opinion, but it seems to me that multitouch and gestures would often require some graphics capability that allows zooming and rotating on-screen elements fairly freely, just to make it apparent that the gestures are doing something. A character-based display could be a poor fit for such an interface, which is probably why nobody has made the effort to add specific touchscreen support yet.
But it sounds like you might have some novel ideas regarding touch user interfaces on text-based displays. You could perhaps prototype your ideas by having your demo application interface more directly with the actual touchscreen driver.
Once you have some idea of what kind of touchscreen UI elements actually work with text-based displays (maybe recruit some university students to do a basic usability research project for you?), the next question will be: what kind of abstraction of touchscreen events would be most useful for implementing those?
It may turn out that there is no added value in passing the touchscreen events through ncurses itself: instead, you might find it appropriate to create a separate touch UI element library that connects to ncurses for display and the touchscreen driver for input. Who says you must use just one programming API at a time?