
I am trying to use the HDMI output on a PC (HP ZBook) with Debian (stretch). I have configured Bumblebee, and it works well (glxinfo and optirun glxinfo report the expected information, and I have tested complicated GLSL shaders that also work as expected).

Now I would like to be able to plug a video projector into the HDMI port. I have read here [1] that intel-virtual-output can be used to configure it when the HDMI port is wired to the NVidia board (using a VIRTUAL output that can be manipulated by xrandr). However, intel-virtual-output says:

 no VIRTUAL outputs on ":0"

When I run xrandr -q, there is no VIRTUAL output listed; I only have:

Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
eDP-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 345mm x 194mm
   1920x1080     60.02*+  59.93  
   1680x1050     59.95    59.88  
   1600x1024     60.17  
   ... other video modes ...
   400x300       60.32    56.34  
   320x240       60.05  
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-2 disconnected (normal left inverted right x axis y axis)
HDMI-2 disconnected (normal left inverted right x axis y axis)

My installed version of xserver-xorg-video-intel is: xserver-xorg-video-intel_2.99.917+git20160706-1_amd64.deb

Update (Sat. Dec. 09 2016) I have updated Debian, and now X crashes when the second monitor is active and I start some applications (for instance xemacs). Sat. Dec. 17 2016: Yes, found out! (updated the answer).

Update (Wed Sep 27 2017) The method works in 99% of cases, but last week I tried a beamer that only accepts 50 Hz modes, and I could not get anything other than 60 Hz (so it did not work). Does anybody know how to force 50 Hz modes?
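For reference, the usual xrandr way to force a refresh rate that an output does not advertise is to register a mode by hand from a CVT modeline. This is only a sketch I have not tested with such a beamer; it assumes the external screen is on VIRTUAL1 and that the beamer accepts a CVT 50 Hz timing:

```shell
# cvt prints a line of the form: Modeline "1024x768_50.00" <timings...>
# Strip the keyword and the quotes so the name and timings can be passed
# straight to xrandr --newmode.
modeline=$(cvt 1024 768 50 | grep Modeline | sed 's/^Modeline //;s/"//g')
xrandr --newmode $modeline                   # registers "1024x768_50.00"
xrandr --addmode VIRTUAL1 "1024x768_50.00"   # attach it to the output
xrandr --output VIRTUAL1 --mode "1024x768_50.00"
```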

Update (Tue 01 Oct 2019) Argh! Broken again: after updating X and the NVidia driver, optirun now crashes (/var/log/Xorg.8.log reports a crash in Xorg, at OsLookupColor+0x139). Update (07 Oct 2019) Found a temporary fix (updated the answer).

[1] https://github.com/Bumblebee-Project/Bumblebee/wiki/Multi-monitor-setup

BrunoLevy
  • Not an answer because this does not use the Nvidia GPU, but as an FYI: with an nvidia 960m optimus laptop (Dec 2016 Dell Inspiron 15) running Fedora 25, I can use the external HDMI monitor with no nvidia or bumblebee drivers installed. With this setup, the external HDMI monitor doesn't get detected if I plug it in after boot. To get it working, plug in the HDMI monitor before boot, at the grub menu use the Fn-F8 monitor switcher to get a mirrored monitor, then log in to X with wayland to get an extended monitor. – carlsborg Mar 13 '17 at 00:29

1 Answer


Yes, I found out! To activate the VIRTUAL output of the intel driver, you need to create a 20-intel.conf file in the Xorg configuration directory (/usr/share/X11/xorg.conf.d under Debian stretch; I found the path by reading /var/log/Xorg.0.log):

Section "Device"
    Identifier "intelgpu0"
    Driver "intel"
    Option "VirtualHeads" "2"
EndSection

My /etc/bumblebee/xorg.conf.nvidia is as follows:

Section "ServerLayout"
    Identifier  "Layout0"
    Option      "AutoAddDevices" "true"
    Option      "AutoAddGPU" "false"
EndSection

Section "Device"
    Identifier  "DiscreteNvidia"
    Driver      "nvidia"
    VendorName  "NVIDIA Corporation"
    Option "ProbeAllGpus" "false"
    Option "NoLogo" "true"
    Option "AllowEmptyInitialConfiguration"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "DiscreteNvidia"
EndSection

Some explanations: the "Screen" section is needed, otherwise it tries to use the Intel device declared in 20-intel.conf (the one we just added, oh my...). "AllowEmptyInitialConfiguration" is also needed so that optirun can still start when no external monitor is attached.

With this configuration and starting intel-virtual-output, I was able to access my HDMI port. Yeehaa !!!

Troubleshooting: if optirun or intel-virtual-output do not work, take a look at /var/log/Xorg.8.log (Bumblebee creates an X server on display :8 for internal use).
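When skimming that log, it helps to know that Xorg tags errors with (EE) and warnings with (WW), so a quick filter (just a convenience, not part of the original procedure) is:

```shell
# Show only error and warning lines from the Bumblebee X server log:
grep -E '\(EE\)|\(WW\)' /var/log/Xorg.8.log
```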

Notes I read in several places that KeepUnusedXServer should be set to true and PMMethod to none in /etc/bumblebee/bumblebee.conf. I did not do that and it works fine. If I do it, it also works, but then the discrete GPU stays on even after exiting an optirun-ed application or killing intel-virtual-output, which I did not want.

More notes Something else that made me bang my head on the wall: deactivating Nouveau and starting the Intel X server need to be done via flags passed to the kernel, specified in the GRUB parameters. In /etc/default/grub, I have the following line:

GRUB_CMDLINE_LINUX_DEFAULT="quiet blacklist.nouveau=1 i915.modeset=1 gfxpayload=640x480 acpi_backlight=vendor acpi_osi=! acpi_osi=\"Windows 2009\""

(beware the quotes and escaped quotes).

Some explanations: it avoids loading nouveau (which is incompatible with the Nvidia X server), and tells the Intel driver to go to graphics mode right at boot time. If you do not do that, the Intel X server cannot start, and it falls back to a plain old VESA server with CPU-side 3D rendering. The acpi_xxx flags are required on this specific machine to work around a BIOS bug that makes it crash when going into graphics mode with the discrete GPU off. Note that this is specific to this particular notebook (HP ZBook portable workstation); it may be unnecessary or different on other laptops.

Update (Dec 6 2017) With the latest Debian distro (Buster), "i915.modeset=1 gfxpayload=640x480" is unnecessary. To remove nouveau, I also needed to create a nouveau.conf file in /etc/modprobe.d with "blacklist nouveau" in it, then recreate the ramdisk with "update-initramfs -u". Reboot and make sure "nouveau" is not loaded anymore with "lsmod | grep nouveau".
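The blacklisting steps above can be sketched as shell commands (run as root; paths exactly as in the text):

```shell
# Prevent the nouveau module from being loaded:
echo "blacklist nouveau" > /etc/modprobe.d/nouveau.conf
# Rebuild the initial ramdisk so the blacklist also applies early at boot:
update-initramfs -u
# After a reboot, this should print nothing:
lsmod | grep nouveau
```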

Update (Dec 17 2016) With the latest xorg-server (1.19), there seems to be a problem in a RandR function that manages gamma when used with intel-virtual-output. Here is the procedure to patch the X server and get it to work:

sudo apt-get build-dep xserver-xorg-core
apt-get source xorg-server

edit hw/xfree86/modes/xf86RandR12.c: at line 1260, insert "return;" (so that the function xf86RandR12CrtcComputeGamma() does nothing)

dpkg-buildpackage -rfakeroot -us -uc
cd ..
sudo dpkg -i xserver-xorg-core_n.nn.n-n_amd64.deb

(replace the n.nn.n-n with the correct version), reboot and Yeehaa!! It works again! (but it is a quick and dirty fix)
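For reference, the one-line edit can be scripted; the line number 1260 is specific to that xorg-server 1.19 source tree, so check the surrounding code before patching (a sketch, not part of the original procedure):

```shell
cd xorg-server-*/
# Insert a bare "return;" at line 1260 so that xf86RandR12CrtcComputeGamma()
# becomes a no-op (GNU sed in-place insertion):
sed -i '1260i\    return;' hw/xfree86/modes/xf86RandR12.c
```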

Update: filed a bug report (it was already known, and was just fixed): https://bugs.freedesktop.org/show_bug.cgi?id=99129

How I figured it out: installed xserver-xorg-core-dbg and ran gdb /usr/lib/xorg/Xorg <xorg pid> from another machine through ssh.

Update (Jan 11 17) Seems that the bug is now fixed in the latest Debian packages.

Update (Jan 24 18) When you want to plug in a beamer for a presentation and need to configure everything right before starting (intel-virtual-output + xrandr), it can be stressful. Here is a little script that does the job (disclaimer: lots of room for improvement, regarding style etc.):

#!/bin/sh
# beamer.sh: sets Linux display for doing a presentation,
#  for bumblebee configured on a laptop that has the HDMI
#  plugged on the NVidia board.
#
# Bruno Levy, Wed Jan 24 08:45:45 CET 2018
#
# Usage: 
#    beamer.sh widthxheight
#    (default is 1024x768)


# Note: output1 and output2 are hardcoded below,
#  change according to your configuration.
output1=eDP1
output2=VIRTUAL1

# Note: I think that the following command should have done
# the job, but it does not work. 
#    xrandr --output eDP1 --size 1024x768 --output VIRTUAL1 --size 1024x768 --same-as eDP1
# My guess: --size is not implemented with VIRTUAL devices.
# Thus I try to find a --mode that fits my needs in the list of supported modes.

wxh=$1

if [ -z "$wxh" ]; then
  wxh=1024x768
fi

# Test whether intel-virtual-output is already running; start it if not.
# (the [i] in the pattern prevents grep from matching its own process)
ivo_process=`ps ax | grep '[i]ntel-virtual-output'`
if [ -z "$ivo_process" ]; then
   intel-virtual-output
   sleep 3
fi

# Mode names on the primary output are simply wxh (at least on
#  my configuration...)
output1_mode=$wxh

echo Using mode for $output1: $output1_mode

# Mode names on the virtual output look like: VIRTUAL1.ID-wxh
# Try to find one in the list that matches what we want.
output2_mode=`xrandr | grep "$output2\." | grep "$wxh" | awk '{print $1}'`
# There can be several matching modes; take the first one.
output2_mode=`echo $output2_mode | awk '{print $1}'`

echo Using mode for $output2: $output2_mode

# Showtime !
xrandr --output $output1 --mode $output1_mode --output $output2 --mode $output2_mode --same-as $output1
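The mode-matching step of the script can be checked in isolation by feeding it canned xrandr output (the mode names below are made up for illustration):

```shell
# pick_mode OUTPUT WxH: reads xrandr output on stdin and prints the first
# mode name of the form "OUTPUT.ID-WxH".
pick_mode() {
  grep "$1\." | grep "$2" | awk '{print $1; exit}'
}

# Two fake VIRTUAL1 mode lines, as xrandr would list them:
sample='   VIRTUAL1.558-1024x768   60.00
   VIRTUAL1.559-800x600    60.00'

echo "$sample" | pick_mode VIRTUAL1 1024x768   # prints VIRTUAL1.558-1024x768
```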

Update (Oct 07 2019)

A "fix" for the new crash: put what follows in a script (call it bumblebee-startx.sh for instance):

optirun ls # to load kernel driver
/usr/lib/xorg/Xorg :8 -config /etc/bumblebee/xorg.conf.nvidia \
 -configdir /etc/bumblebee/xorg.conf.d -sharevts \
 -nolisten tcp -verbose 3 -isolateDevice PCI:01:00:0 \
 -modulepath /usr/lib/nvidia/nvidia,/usr/lib/xorg/modules/

(replace PCI:01:00:0 with the address of your NVidia card, obtained with lspci)
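To find that address, something like this should do (the device name in the example output is hypothetical; beware that lspci prints bus:device.function in hexadecimal, while the Xorg command line expects decimal PCI:bus:device:function — they only look identical for small bus numbers like 01):

```shell
# List NVidia devices and their PCI addresses:
lspci | grep -i nvidia
# e.g.: 01:00.0 3D controller: NVIDIA Corporation GM107GLM [Quadro M1000M]
#       -> use PCI:01:00:0 on the Xorg command line
```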

Run this script from a terminal window as root (sudo bumblebee-startx.sh) and keep the terminal open; then optirun and intel-virtual-output work as expected (note: sometimes I need to run xrandr as well to make the screen/videoprojector detected). I do not understand why the very same command crashes when started from bumblebee, so many mysteries here... (but at least it gives a temporary fix).

How I figured it out: I wrote a 'wrapper' script to start the X server, declared it as XorgBinary in bumblebee.conf, made it save the command line ($*) to a file, and tried some stuff involving LD_PRELOADing a patch to the X server to fix the crash in OsLookupColor (did not work). But when I launched the same command line by hand, it worked, and it kept working without my patch (I still do not understand why).

Update (Nov 15 2019) After updating, I experienced a lot of flickering, making the system unusable. Fixed by adding the kernel parameter i915.enable_psr=0 (in /etc/default/grub, then sudo update-grub). If you want to know, PSR means 'Panel Self Refresh', a power-saving feature of Intel GPUs (that can cause screen flickering).
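After rebooting, you can check whether PSR is really off by reading the i915 module parameter (a sketch; the sysfs path assumes the i915 driver is loaded):

```shell
# 0 means Panel Self Refresh is disabled:
cat /sys/module/i915/parameters/enable_psr
```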

BrunoLevy
  • You, I like you! :D I'm on Arch Linux and wasn't able to start `intel-virtual-output`; it was exiting with exit code `111`, and I was even trying to figure out what it was doing with `strace`. Found your post, created that `/etc/X11/xorg.conf.d/20-intel.conf` just like yours, and also added my `BusID "PCI:2:0:0"` just in case. Used similar settings in `/etc/bumblebee/xorg.conf.nvidia` and bam, `intel-virtual-output` runs fine. `xrandr` is now detecting all of my other ports! :D <3 The Troubleshooting part about `/var/log/Xorg.8.log` really helped – GabLeRoux Jan 11 '17 at 00:46
  • Ok after testing a bit, yes I see the devices listed with `xrandr`, but hooking up the cable shows me the mouse, while windows are not showing on the external display. Sometimes it shows a copy of my main screen and then both screens go black and nothing happens. And it turned out the system doesn't start with `/etc/X11/xorg.conf.d/20-intel.conf` and I don't see anything in the logs. Anyway, thanks for sharing this, I suppose I'm getting closer to a solution. – GabLeRoux Jan 11 '17 at 01:14
  • It may depend on the desktop environment that you are using. I'm using KDE, and needed to tweak a little bit the screen configuration in the preferences. – BrunoLevy Jan 11 '17 at 13:53
  • I think I got it working now. It was failing with my tv, but it's working with my displayport :) – GabLeRoux Jan 11 '17 at 16:27
  • Would you care to expand on your instructions a bit? The command now works: `intel-virtual-output` and xrandr report 2 virtual outputs, but my HDMI still does not work. Is there any step I need to alter? – fgblomqvist Mar 22 '17 at 05:18
  • EDIT: Commenting out `Option "UseDisplayDevice" "none"` and re-running intel-virtual-output got the monitor up and running. It is detected as an unknown one though, so I will have to do some manual configuring. – fgblomqvist Mar 22 '17 at 05:27
  • I'm using the "Systems settings" application to configure the HDMI. Once intel-virtual-output is started, it recognizes everything (as if it was a standard configuration). – BrunoLevy Mar 22 '17 at 09:37
  • @fgblomqvist I followed your advice, but after running `intel-virtual-output` it fails with `XIO error on display :8` and aborts. Any tips? – Milwaukoholic Nov 30 '17 at 03:22
  • @Milwaukoholic unfortunately I ran into a dead end with this. I did not get further than what I wrote above. The monitor was stuck at a subpar resolution and with tons of other issues. The way I worked around my main issue was to use the intel graphics + nvidia with bumblebee + HDMI/displayport through TB3. It works flawlessly for running certain applications on the dedicated card and using an external monitor. The HDMI port is unfortunately unusable (still need to fully switch to nvidia for it to work), but as long as it works through TB3 I'm satisfied. Good luck! – fgblomqvist Nov 30 '17 at 06:55
  • @Milwaukoholic, did you find some info. in /var/log/Xorg.8.log ? – BrunoLevy Nov 30 '17 at 08:05
  • @fgblomqvist, do you know how to easily switch between a pure NVidia xconfig and a bumblebee xconfig ? (ideally, I'm looking for a shell script that would switch the config. It is OK if rebooting is required). – BrunoLevy Nov 30 '17 at 19:15
  • @BrunoLevy Sorry I don't. I set it up this way: https://help.ubuntu.com/community/RazerBlade#SwitchableGraphics . As you can see, all you do is install bumblebee and configure it. You don't tamper with X. I suppose one way of switching would be to disable bumblebee, prime-switch to Nvidia, and then reboot. But then again, I don't see why you would wanna switch when Bumblebee gives you both? – fgblomqvist Nov 30 '17 at 23:20
  • @fgblomqvist, I'd like to do that because nvidia-settings is sometimes more robust with certain beamers that I did not manage to talk with through VIRTUAL outputs (but I'm a bit afraid of wrecking my X configuration when trying to have both). – BrunoLevy Dec 01 '17 at 09:41
  • @BrunoLevy ah alright. Well good luck, X is a fragile thing (can't wait until Wayland is ready) – fgblomqvist Dec 01 '17 at 18:27
  • You, sir, are a life saver! I didn't expect this to work; I've been trying to get intel-virtual-output to work on my Inspiron since I got it. YOUR CONFIG IS MAGIC – HackToHell Apr 18 '18 at 05:44
  • Your configuration is as close as I've gotten to getting an HDMI monitor to work after countless wasted hours over 8 years!! I'm stuck, though: after running `intel-virtual-output`, `xrandr` shows two VIRTUAL outputs connected. I assume one is my built-in laptop monitor and the other is the HDMI monitor I have connected. However, my HDMI monitor remains off and there's nothing I can do to activate it. Any advice? – Joseph R. Sep 09 '19 at 06:53
  • @Joseph R. You are nearly there! (yes, normally VIRTUAL1 will correspond to your HDMI.) Once you have started intel-virtual-output, the easiest way is to use the 'systems settings' GUI and activate the monitor from there. You can also use my script (but it may require adaptation to your config). – BrunoLevy Sep 09 '19 at 08:48
  • @BrunoLevy It finally works. Thank you for giving my old laptop a new lease on life! – Joseph R. Oct 18 '19 at 20:41
  • Does somebody know how to choose the GPU connected to a USB-C adapter? https://unix.stackexchange.com/questions/548875/laptop-with-nvidia-and-intel-gpus-choose-the-graphic-board-connected-to-a-usb-c – BrunoLevy Oct 29 '19 at 09:50
  • What about performance? On my laptop the virtual output was working terribly slowly... – radrow Jan 19 '20 at 22:31