I have a Lenovo Yoga 2 11" convertible laptop, and it has a touch screen that xinput tells me is:

> xinput list
⎡ Virtual core pointer                      id=2    [master pointer  (3)]
⎜   ↳ Virtual core XTEST pointer                id=4    [slave  pointer  (2)]
⎜   ↳ ETPS/2 Elantech Touchpad                  id=13   [slave  pointer  (2)]
⎜   ↳ Atmel Atmel maXTouch Digitizer            id=10   [slave  pointer  (2)]
⎣ Virtual core keyboard                     id=3    [master keyboard (2)]
    ↳ Virtual core XTEST keyboard               id=5    [slave  keyboard (3)]
    ↳ Power Button                              id=6    [slave  keyboard (3)]
    ↳ Video Bus                                 id=7    [slave  keyboard (3)]
    ↳ Power Button                              id=8    [slave  keyboard (3)]
    ↳ Lenovo EasyCamera                         id=9    [slave  keyboard (3)]
    ↳ Ideapad extra buttons                     id=11   [slave  keyboard (3)]
    ↳ AT Translated Set 2 keyboard              id=12   [slave  keyboard (3)]

(It's the "Atmel Atmel maXTouch Digitizer"). It works fine in Ubuntu and Fedora, but in Arch Linux, the touch screen acts like a mouse. It has a pointer, and multi-touch isn't recognized. How can I get it to work?
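I suspect X needs to be told explicitly that this device is a touchscreen. A sketch of the kind of InputClass rule I mean, adapted from the usual X.Org input configuration conventions (the file name and the `MatchProduct` string are my guesses, and I haven't confirmed this fixes it on this machine):

```
# /etc/X11/xorg.conf.d/99-touchscreen.conf  (file name is a guess)
Section "InputClass"
    Identifier "Atmel maXTouch"
    MatchProduct "maXTouch Digitizer"
    MatchIsTouchscreen "on"
    Driver "evdev"
EndSection
```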

Jonathan
  • You have the answer [here](http://unix.stackexchange.com/questions/129339/touchscreen-and-mouse-as-separate-inputs). I also use Arch :) – Alko Feb 15 '15 at 19:41
  • Afterwards you will also probably want [Touchegg](https://wiki.archlinux.org/index.php/Touchegg). And can you notify me if you manage to get it working in KDE? Thanks! – Alko Feb 15 '15 at 19:43
  • @Alko, that question is about how to make a touchscreen a separate input from a mouse, so that you can use the touchscreen and mouse at the same time. It doesn't solve my problems of 1) having a mouse-like pointer when I use the touch screen, and 2) not having multi-touch working. Touchegg is great on Ubuntu, but doesn't recognize my touchscreen as a touchscreen in Arch, because of the problem I describe here. – Jonathan Feb 15 '15 at 21:35
  • But the post I sent you talks about separating the touchscreen from the mouse, so they get recognized as separate inputs. Isn't that what you're looking for? And for multitouch support you have Touchegg. – Alko Feb 16 '15 at 00:01
  • @Alko: No. I don't want to be able to use my mouse and touchscreen at the same time, as separate inputs. I want to be able to use my touch screen as a touch screen. The problem is, it's recognized as a mouse. Touch screens shouldn't have pointers, and they should have multitouch enabled. Touchegg isn't a driver, and can't magically enable multitouch unless there is driver support. What I need is some sort of xinput or driver solution that will help get my touchscreen recognized as a touchscreen. – Jonathan Feb 16 '15 at 22:39
  • You have to separate them before you can do that... and I think it's already recognized as a multitouch input. You can test it with cat in /dev. – Alko Feb 17 '15 at 09:44
  • Where in dev can I find that? I tried catting `/dev/input/by-id/usb-Atmel_Atmel_maXTouch_Digitizer-event-if00`, but its output isn't human-readable. If it is already multitouch, touchegg doesn't seem to recognize it as such. – Jonathan Feb 19 '15 at 02:47
  • I know it's not, but at least you see that you get something different if you put more fingers on the screen. I didn't get the time to set up Touchegg yet. – Alko Feb 19 '15 at 09:38
  • Here's why I don't think multitouch works: I've visited a few multitouch testing web apps, like [this one](http://scripty2.com/demos/touch/), and they can only detect mouse clicks, and no touch events. Touchegg doesn't respond at all to gestures, either. I do see extra garbage if I put more fingers on the screen, but I don't think that difference means mt works, since if it were, multitouch-capable applications would recognize those actions as gestures, right? I can separate my devices, and have two pointers that are recognized as two mice, but still can't get my touch screen to work as such. – Jonathan Feb 27 '15 at 14:47
  • No, they would not. Touching the screen with a finger is "the same" as clicking it with a mouse; it's like a large touchpad. That is why your system handles your touchscreen as if it were a mouse. You have to explicitly tell the system that this is not a mouse and that it should use a different backend for it (that's why you have to first separate those inputs, so you can handle them differently from the mouse). Once you separate them, you have to set them up. But unfortunately I didn't have the time to do that yet. – Alko Feb 27 '15 at 17:36
  • @Jon I have a Dell XPS 13 developer with Fedora and get precisely the same behaviour of the touchscreen (touch = mouse click, long touch = right click, no multitouch, mouse pointer follows touches) – guido Mar 05 '15 at 11:23
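To make sense of the raw bytes from `/dev/input` that came up in the comments: each event is a fixed-size `struct input_event` (24 bytes on 64-bit Linux: two 8-byte timestamp words, a 16-bit type, a 16-bit code, and a 32-bit value), which is why `cat` output looks like garbage. A sketch of decoding one event with od — the bytes below are synthetic, standing in for a single EV_ABS/ABS_MT_POSITION_X event (type 3, code 0x35, values taken from `linux/input-event-codes.h`); on a real system you would read from the device node instead of printf:

```shell
# One synthetic little-endian input_event: 16 zero timestamp bytes,
# then type=3 (EV_ABS), code=0x35 (ABS_MT_POSITION_X), value=400.
ev=$(printf '\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x35\x00\x90\x01\x00\x00' |
  od -An -j16 -N4 -td2 |       # skip the timestamps, decode type and code as int16
  awk '{print $1 "/" $2}')
echo "$ev"   # 3/53 → EV_ABS / ABS_MT_POSITION_X, i.e. a multitouch axis
```

If evtest is installed, `sudo evtest /dev/input/by-id/usb-Atmel_Atmel_maXTouch_Digitizer-event-if00` does this decoding for you; seeing ABS_MT_* axes reported there would mean the kernel side of multitouch is fine and the problem is in X.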

0 Answers