I recently decided to try PRIME render offload (the system runs on the integrated GPU, in my case an Intel HD 630, while individual applications can be offloaded to the NVIDIA GPU) on my Slackware-current laptop. I soon discovered that I couldn't use my HDMI port while running on the Intel iGPU, since that port is wired to the NVIDIA card, which was now enabled only on demand. The solution appeared to be the new Reverse PRIME, but I can't seem to make it work because strange things are happening under the hood. I'm looking for help figuring out what is going on.
At first, PRIME offload apparently worked fine on the laptop screen (I first used it to run a GPU-intensive application, and it worked as expected, with full performance). You run the application you want on the NVIDIA GPU by prefixing it with a couple of environment variables. But when I later tried __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo as a test, I noticed it was failing specifically with glxinfo, giving this error:
name of display: :0
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  151 (GLX)
  Minor opcode of failed request:  24 (X_GLXCreateNewContext)
  Value in failed request:  0x0
  Serial number of failed request:  39
  Current serial number in output stream:  4
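For reference, the two-variable prefix can be wrapped in a tiny script (a sketch; "prime-run" is my own name for it, not something the driver installs):

```shell
#!/bin/sh
# prime-run (hypothetical name): run a program on the NVIDIA GPU via
# PRIME render offload. These are the two variables the NVIDIA driver
# documents for GLX offloading.
export __NV_PRIME_RENDER_OFFLOAD=1
export __GLX_VENDOR_LIBRARY_NAME=nvidia
exec "$@"
```

With that, the failing test above becomes ./prime-run glxinfo.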
I then tried nvidia-settings --verbose, with and without the environment variables, and also got errors:
WARNING: NV-CONTROL extension not found on this Display.
ERROR: Unable to load info from any available system
ERROR: Unable to assign attribute SyncToVBlank specified on line 22 of ........
ERROR: Unable to assign attribute FSAA specified on line 24 of ........
(similar messages repeat for several other video settings previously saved in the nvidia-settings-rc file)
ERROR: Unable to assign attribute SynchronousPaletteUpdates specified on line 38 of configuration file '/[home path, redacted]/.nvidia-settings-rc' (no Display connection).
Running nvidia-smi works normally and gives accurate information about the NVIDIA GPU (oddly, with no need for the environment variables). I checked glxinfo without the NVIDIA variables, wondering whether my primary GPU really was the Intel one now, and it was indeed the integrated Intel GPU. Then I ran xrandr --listproviders and got this:
Providers: number : 2
Provider 0: id: 0x43 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 3 outputs: 1 associated providers: 1 name:modesetting
Provider 1: id: 0xca cap: 0x2, Sink Output crtcs: 4 outputs: 1 associated providers: 1 name:modesetting
Note that it detects both GPUs, but the second one (Provider 1) should be named NVIDIA-G0; instead it is also reported as modesetting, like Provider 0 (the Intel iGPU). As a quick temporary test, I then configured xorg.conf to start X with the NVIDIA GPU as primary and ran xrandr --listproviders again. With that setup Provider 0, now corresponding to the NVIDIA card, was correctly identified as NVIDIA-G0, and Provider 1 as modesetting. I then reverted xorg.conf to the PRIME/Optimus configuration (i.e. Intel as the active GPU). Running lsmod | grep -i nvidia shows that the nvidia drivers are loaded and nouveau is blocked as expected by my modprobe.d configuration:
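Since the names were unreliable, I worked with indices and ids instead; a small sketch that pulls index, id and name out of the listproviders output (assumes the one-line-per-provider format shown above):

```shell
# Summarize `xrandr --listproviders`: one "index=… id=… name=…" line
# per provider, so a provider can be referenced by id instead of name.
xrandr --listproviders | awk '
  /^Provider [0-9]/ {
    idx = $2; sub(":", "", idx)          # "0:" -> "0"
    name = $NF; sub("name:", "", name)   # "name:modesetting" -> "modesetting"
    print "index=" idx " id=" $4 " name=" name
  }'
```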
nvidia_uvm           1155072  0
nvidia_drm             65536  1
nvidia_modeset       1232896  2 nvidia_drm
nvidia              34177024  19 nvidia_uvm,nvidia_modeset
drm_kms_helper        266240  2 nvidia_drm,i915
drm                   610304  15 drm_kms_helper,nvidia_drm,i915
i2c_core               94208  12 i2c_designware_platform,videodev,i2c_hid,i2c_designware_core,drm_kms_helper,i2c_algo_bit,nvidia,i2c_smbus,i2c_i801,i915,psmouse,drm
Plugging in the HDMI cable was correctly detected (still no signal, though), and according to xrandr --props, the HDMI-1-1 output lists PRIME Synchronization as enabled:
PRIME Synchronization: 1
    supported: 0, 1
link-status: Good
    supported: Good, Bad
CONNECTOR_ID: 74
    supported: 74
non-desktop: 0
    range: (0, 1)
Curious about what was happening, I went ahead and tested the HDMI output following the Reverse PRIME tutorial on the Arch wiki, since everything else was functioning as it should with the PRIME setup on the laptop screen (even if I didn't know why glxinfo and nvidia-settings failed). Because the provider names were wrong, I decided to use their index numbers instead, so I ran xrandr --setprovideroutputsource 0 1, which produced an error rather similar to the one from glxinfo:
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  139 (RANDR)
  Minor opcode of failed request:  35 (RRSetProviderOutputSource)
  Value in failed request:  0xca
  Serial number of failed request:  16
  Current serial number in output stream:  17
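For what it's worth, the xrandr man page documents --setprovideroutputsource as taking the sink provider first and the render source second, so given the listing above the argument order may need to be the other way around; a sketch of the sequence under that reading (whether it works at all with the misnamed providers is exactly what I can't tell):

```shell
# Per xrandr(1), --setprovideroutputsource takes <sink> <source>.
# Provider 1 (id 0xca) is the NVIDIA card that owns the HDMI connector
# and advertises only "Sink Output"; Provider 0 is the Intel iGPU that
# does the rendering, so it would be the source.
xrandr --setprovideroutputsource 1 0
# then enable the NVIDIA-attached output above the built-in panel
xrandr --output HDMI-1-1 --auto --above eDP-1
```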
I tried to press on anyway, looking for a clue; xrandr --output HDMI-1-1 --auto --above eDP-1 (eDP-1 being the laptop's built-in screen) resulted in:
xrandr: Configure crtc 3 failed
There is no apparent error in Xorg.0.log, but there is something in dmesg from the boot process, which seemingly didn't prevent the nvidia driver from working:
[drm:nv_drm_dumb_map_offset [nvidia_drm]] *ERROR* [nvidia-drm] [GPU ID 0x00000100] Failed to lookup gem object for mapping: 0x00000006
[drm:nv_drm_dumb_map_offset [nvidia_drm]] *ERROR* [nvidia-drm] [GPU ID 0x00000100] Failed to lookup gem object for mapping: 0x00000007
And a warning that NVIDIA taints the kernel:
nvidia: module license 'NVIDIA' taints kernel.
Disabling lock debugging due to kernel taint
EDIT 1: Suspecting that PRIME offloading with the variables only worked on the laptop's native screen for command-line programs with no graphical interface, I tried running a game through Lutris and indeed got the same error as with glxinfo. So maybe this has to do with trying to connect to a display?
I have no idea where to go from here. Do you have any idea?
System information:
- Laptop: Acer Nitro 5, 7th-gen Intel i7, NVIDIA GeForce GTX 1050
- Kernel 5.10.19 patched with fsync from TKG, Slackware-current 64bits with alienbob multilib
- NVIDIA driver 460.56, CUDA release 11.2, V11.2.67 Build cuda_11.2.r11.2/compiler.29373293_0
- Xorg Server 1.20.10
- /etc/X11/xorg.conf (I'm still not used to the xorg.conf.d snippet model): https://pastebin.com/0d13wDHY
- Had this 90-intel.conf in xorg.conf.d though, intended to force DRI3 for the Intel GPU:

Section "OutputClass"
    Identifier "Force DRI 3"
    MatchDriver "modesetting"
    Option "DRI" "3"
EndSection
- Xorg.0.log (removed keyboard, cam, bluetooth irrelevant lines): https://pastebin.com/s0Ebu2AY
- dmesg with some sensitive data redacted: https://pastebin.com/w0qWzxD9
- NVIDIA driver set to modeset via a conf file in modprobe.d:

options nvidia-drm modeset=1