Fedora on notebooks in 2022 -- how good is it?

After disabling secure boot, Nvidia driver now loads – yay!

[    0.000000] secureboot: Secure boot disabled
[    0.005067] secureboot: Secure boot disabled
[    0.667067] integrity: Loaded X.509 cert 'Fedora Secure Boot CA: fde32599c2d61db1bf5807335d7b20e4cd963b42'
[    2.052462] amdgpu 0000:05:00.0: amdgpu: SECUREDISPLAY: securedisplay ta ucode is not available
[    3.159260] Bluetooth: hci0: Secure boot is enabled
[    3.263591] input: HDA NVidia HDMI/DP,pcm=3 as /devices/pci0000:00/0000:00:01.1/0000:01:00.1/sound/card0/input17
[    3.263676] input: HDA NVidia HDMI/DP,pcm=7 as /devices/pci0000:00/0000:00:01.1/0000:01:00.1/sound/card0/input18
[    3.263747] input: HDA NVidia HDMI/DP,pcm=8 as /devices/pci0000:00/0000:00:01.1/0000:01:00.1/sound/card0/input19
[    3.263804] input: HDA NVidia HDMI/DP,pcm=9 as /devices/pci0000:00/0000:00:01.1/0000:01:00.1/sound/card0/input20
[    3.635464] nvidia: loading out-of-tree module taints kernel.
[    3.635475] nvidia: module license 'NVIDIA' taints kernel.
[    3.658503] nvidia: module verification failed: signature and/or required key missing - tainting kernel
[    3.672419] nvidia-nvlink: Nvlink Core is being initialized, major device number 507
[    3.673141] nvidia 0000:01:00.0: enabling device (0400 -> 0403)
[    3.673228] nvidia 0000:01:00.0: vgaarb: changed VGA decodes: olddecodes=io+mem,decodes=none:owns=none
[    3.722391] NVRM: loading NVIDIA UNIX x86_64 Kernel Module  510.54  Tue Feb  8 04:42:21 UTC 2022
[    3.801422] nvidia_uvm: module uses symbols from proprietary module nvidia, inheriting taint.
[    3.810868] nvidia-uvm: Loaded the UVM driver, major device number 505.
[    3.835572] nvidia-modeset: Loading NVIDIA Kernel Mode Setting Driver for UNIX platforms  510.54  Tue Feb  8 04:34:06 UTC 2022
[    3.838966] [drm] [nvidia-drm] [GPU ID 0x00000100] Loading driver
[    4.530628] [drm] Initialized nvidia-drm 0.0.0 20160202 for 0000:01:00.0 on minor 1

However, it is running X11, not Wayland. Is there any way to get the best of both worlds?

Nice!!

In the past year, I’ve installed Fedora on 4 new laptops, two for work and two for home. They were all newer Ryzen family processors with Radeon graphics, for what it’s worth, and all worked perfectly out of the box. One of the personal ones is my wife’s laptop.

My ThinkPad P14 has a touchscreen and excellent battery life. My MSI Alpha 15 is an absolute compute and gaming powerhouse. My wife’s HP is a great blend of a personal computer and a machine capable of doing all she needs for graphic design and photo editing, and she generally uses a Wacom tablet as a mouse.

There are plenty of great options for a good out-of-the-box experience running Fedora on a notebook.

Thanks for the feedback @vwbusguy ! Here the initial experience was as you described: everything worked just fine out of the box, as far as “standalone” use was concerned. However, using a secondary monitor (a must for me) did not work right away, but I am making progress with the invaluable help of the awesome Fedora community :muscle:t3: Just curious: have you tested this setup (secondary monitor) in any of your configurations? If so, did it just work? Or did you have to do any special tweaking?

Now that you have nvidia working, the next step is to give nvidia access to both screens. This will also help with getting wayland to work, though my only experience is with the rpmfusion drivers, not the negativo17 drivers.

That is easily accomplished by copying the nvidia.conf file that the xorg-x11-drv-nvidia package installs when you install the nvidia drivers from rpmfusion.
Copy /usr/share/X11/xorg.conf.d/nvidia.conf to /etc/X11/xorg.conf.d/nvidia.conf, which gives the nvidia GPU access to both the external and internal screens. Without it, wayland has problems with dual screens, and for some users it may not even allow both screens to function.
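
The copy itself is a single command (paths as given above; adjust if your package version puts the file elsewhere):

```shell
# Copy the template shipped by xorg-x11-drv-nvidia into the directory Xorg
# actually reads config from; this lets the nvidia GPU drive both screens.
sudo cp /usr/share/X11/xorg.conf.d/nvidia.conf /etc/X11/xorg.conf.d/nvidia.conf
```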

As far as having wayland work: when logging in, the screen where you enter your password should have a gear icon in the lower right corner. Clicking that icon lets you select wayland or xorg. I know that works with gnome, but I don’t know about other DEs.

Thanks for the tips, @computersavvy ! I did as you suggested (with RPMFusion drivers), but if I select “GNOME on Wayland” before login, it always falls back to the AMD Renoir driver. The only way I can enable Nvidia by default is to manually choose plain “GNOME” (X11). It certainly allows me to work as I want to (dual-screen), but I must say I was hoping I would be able to do it on Wayland. Did you actually manage to configure dual-screen with Nvidia + Wayland?

Is there a compelling reason to use wayland? (I’m not complaining, but depending on what you need to do it may even be better to skip wayland and stay with X11.) Judging by the answers in the NVIDIA drivers and Wayland thread, if you use the proprietary nvidia driver you have to use X11, and the Howto/NVIDIA - RPM Fusion page says that you lose VDPAU hardware acceleration (I don’t know how important that is, so I’m sorry if this means nothing to you):

Wayland

NVIDIA works under Wayland (and Xwayland) starting with Fedora 35 and NVIDIA driver 495 and later. With GNOME 41, Wayland can be selected explicitly with GDM.

Note that video acceleration with VDPAU isn’t available under Wayland.

One other question, and maybe I’m wrong, but since you have two graphics cards, shouldn’t you follow the Howto/Optimus - RPM Fusion guide? It has a section on ‘External Monitors detection’.

There is one other way to use nvidia only, though I don’t know the effects with wayland since I use xorg only. If you want nvidia full time, it can be set as the primary GPU, which will disable the amd IGP for you. The only drawback I am aware of is a slightly higher power draw full time, which may be a negative if you are using the laptop on battery for extended periods. If it is mostly stationary and on AC like mine, it is really not an issue.

To do that, edit the file you copied (/etc/X11/xorg.conf.d/nvidia.conf) and add this line to both stanzas of that file: Option "PrimaryGPU" "yes"
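
For reference, a stanza would then look something like this. This is a sketch based on the stock RPM Fusion file; the Identifier/MatchDriver values may differ in your version, and only the added Option line matters:

```
Section "OutputClass"
    Identifier "nvidia"
    MatchDriver "nvidia-drm"
    Driver "nvidia"
    Option "AllowEmptyInitialConfiguration"
    Option "PrimaryGPU" "yes"
EndSection
```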

With the latest versions of wayland and the nvidia driver, wayland does work on nvidia, though it is still far from perfect and some apps have issues. As suggested above, unless it is critical that you use wayland, it seems better to stick with xorg for some time while wayland’s compatibility with nvidia continues to improve.

Yes. I generally dock my work laptop, the Lenovo P14, with a Lenovo dock: a first generation ThinkPad dock at home and a second generation one at work. That works fine. I did have an issue where the second display wouldn’t show up on boot until I unplugged and replugged the USB-C cable for the dock; after that point, it worked as expected. However, that seems to have since been fixed by a firmware update from Lenovo. I didn’t have any issues docking the MSI or HP with an external monitor.

I should also specify that I’m using Wayland on all of these laptops.

It’s the future …
And X is so broken

Hi @lewatoto ! No, not really :smirk: It does feel more fluid to me (less tearing and such), and I like the idea of using the next generation, but I am aware it’s not quite there yet. Bottom line is, until it matures, I know some things I will only be able to accomplish with X11. In my case, whenever I have to use a second screen, I will use X11; for standalone use, I will probably use Wayland.

Thanks for the tip on the “external monitors detection” section at RPMFusion, I had already enabled the “PrimaryGPU” setting – but IIRC the external monitor was also detected without it.

@vwbusguy so you did manage to have Wayland work with external monitors? Awesome! Can you please provide some more details? (if you use the Nvidia driver, configuration options, what’s your primary GPU etc.) If I could use two monitors with my primary GPU (AMD Renoir) alone, that would be more than enough for me.

@jakfrost That’s kind of how I feel as well. But X11 is still the go-to solution depending on what you need to do; I guess we’ll have to live with it for a little longer.

Off-topic
It’s just my opinion; I don’t want to start a debate or anything like that.

I don’t deny that X could be broken sometimes; as a normal user I never perceived the difference, so maybe I was lucky enough to avoid most of X11’s problems. Wayland works, but it lacks things like a proper color profile manager (if I remember well, someone started developing one around 2018, but it isn’t finished yet), and there are problems with screen sharing in video call apps, though that even fails on Fedora’s version of Firefox on X11.

And I already know that if I don’t like something, I could try to help develop the solution or improve the current implementation, but my coding skills are awful. So I’ll wait until someone among the companies and volunteers contributing to wayland’s development hits the same or similar issues and starts working on a solution; meanwhile, I’ll stay with x11.

If I remember well, you can prevent screen tearing by changing some Xorg settings (How to save the X server settings? - #2 by generix - Linux - NVIDIA Developer Forums). The solution I used when I had Fedora on my desktop PC was:

Open Terminal and type

nvidia-settings

The NVIDIA X Server Settings window will open; click on “X Server Display Configuration”.

Then click on “Advanced” and click on “Force Full Composition Pipeline”. Apply the setting and click OK.

Open the “PowerMizer” section on the left and select “Prefer Maximum Performance” from the drop-down menu.

But I don’t know if this still works, since my main computer is now a cheap Dell with a Ryzen 5.
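
If memory serves, the same “Force Full Composition Pipeline” setting can also be applied from a terminal, which is handy for scripting it at login. A sketch, assuming a running X session with the proprietary driver; `nvidia-auto-select` is the driver’s default mode name, so adjust for your setup:

```shell
# Apply ForceFullCompositionPipeline without opening the GUI.
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
```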

@andre.ocosta - I didn’t have to do anything but plug them in. The GPUs are all some flavor of newer Radeon: my MSI is an RX5600 and the others are some recent flavor of Renoir. This shouldn’t matter between X11 and wayland, but figuring out the resolution and display limitations over USB-C/Thunderbolt is a real pain at a hardware level. That’s not an operating system issue as much as DisplayPort and Thunderbolt being an absolute mess. I suggest simplifying things: try connecting directly to an external monitor over HDMI or VGA if you can, and if that works, start debugging what does or doesn’t work with your docking or DisplayPort hardware.

@lewatoto yours is a perfectly valid opinion, and you are providing valuable feedback. Thanks for the tips regarding Nvidia tweaking, will definitely try them out.

@vwbusguy I was not that lucky… if I plug the external monitor to the Nitro 5 HDMI port while using Wayland + Renoir GPU, nothing happens :sob: I’ll try to debug it later, perhaps on a dedicated thread.

@vwbusguy I just found out on Acer community forum that the HDMI port is directly wired to the GTX 1650. So, there is no way it will ever work only with the AMD GPU… :slightly_frowning_face:

Ah, that will do it! I had a Dell once that did this. It had an Intel and an nVidia GPU, and it was hard-wired to the Intel GPU when docked. You should still be able to use the AMD GPU for individual apps in this case by launching them with DRI_PRIME=1. If you’re using Gnome, you should have an option when you right-click an app to “Launch with Discrete Graphics”. If the nVidia card is the discrete graphics, then this won’t be the case - normally the integrated graphics are hard-wired in order to save power and the discrete GPU is only used on demand, but it sounds like yours may be the other way around?
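
If you want to check which GPU an app actually renders on, a quick sanity test (assuming glxinfo from the mesa-demos/glx-utils package is installed):

```shell
# Renderer used by default:
glxinfo | grep "OpenGL renderer"
# Renderer when offloaded to the other GPU:
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
```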

Yeah, I don’t know why they did it this way :thinking: I will do some more testing with different scenarios (AMD only, AMD + Nouveau, AMD + Nvidia, Nvidia only, all of them on X11 and on Wayland) to see what comes out of it; maybe I’ll get some valuable info. The important thing is that, with Nvidia + X11, I can use the secondary display even if PrimaryGPU is set to false :raised_hands:t3: Until the Wayland + Nvidia combo matures, that will have to do.

Well, I finally made some tests. I don’t know if this makes any sense, but what I did was to run xrandr -q on all the configurations above (is it actually expected to produce any useful output on Wayland? :thinking: ). To my surprise, when I ran it on Wayland with the Nvidia proprietary driver, it did seem to detect the external monitor, even though it apparently failed to activate it:

Screen 0: minimum 16 x 16, current 3840 x 1080, maximum 32767 x 32767
XWAYLAND0 connected 1920x1080+1920+0 (normal left inverted right x axis y axis) 480mm x 270mm
   1920x1080     59.96*+
   1440x1080     59.99  
   1400x1050     59.98  
   1280x1024     59.89  
   1280x960      59.94  
   1152x864      59.96  
   1024x768      59.92  
   800x600       59.86  
   640x480       59.38  
   320x240       59.52  
   1680x1050     59.95  
   1440x900      59.89  
   1280x800      59.81  
   720x480       59.71  
   640x400       59.95  
   320x200       58.96  
   1600x900      59.95  
   1368x768      59.88  
   1280x720      59.86  
   1024x576      59.90  
   864x486       59.92  
   720x400       59.55  
   640x350       59.77  
XWAYLAND1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 340mm x 190mm
   1920x1080    144.04*+
   1440x1080    144.01  
   1400x1050    144.00  
   1280x1024    144.05  
   1280x960     144.13  
   1152x864     144.09  
   1024x768     143.87  
   800x600      143.83  
   640x480      143.85  
   320x240      142.05  
   1680x1050    144.07  
   1440x900     143.99  
   1280x800     144.00  
   720x480      143.85  
   640x400      144.04  
   320x200      141.40  
   1600x900     144.04  
   1368x768     143.93  
   1280x720     143.85  
   1024x576     143.91  
   864x486      143.63  
   720x400      143.88  
   640x350      143.57

XWAYLAND1 is indeed the primary display. XWAYLAND0 is the external monitor, which seems to be positioned to the right of the primary one.

So, based on the output above, do you guys believe there is any chance of using the external monitor with Wayland?

Just as a baseline, here’s the output of xrandr -q with Nvidia + X11 (the external monitor is positioned above the primary one):

Screen 0: minimum 320 x 200, current 1920 x 2160, maximum 16384 x 16384
eDP connected primary 1920x1080+0+1080 (normal left inverted right x axis y axis) 344mm x 193mm
   1920x1080    144.15*+  60.20  
   1680x1050    144.15  
   1280x1024    144.15  
   1440x900     144.15  
   1280x800     144.15  
   1280x720     144.15  
   1024x768     144.15  
   800x600      144.15  
   640x480      144.15  
HDMI-1-0 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 477mm x 268mm
   1920x1080     60.00*+  59.94    50.00  
   1680x1050     59.95  
   1280x1024     60.02  
   1280x960      60.00  
   1280x720      60.00    59.94    50.00  
   1152x864      60.00  
   1024x768      60.00  
   800x600       60.32  
   720x576       50.00  
   720x480       59.94  
   640x480       59.94    59.93

If it’s detected by xrandr then it should be usable. It should show up in your desktop environment’s display config as well.

I was looking around for monitor configuration on GNOME and found out about monitors.xml. Oddly enough, there is no documentation on how exactly it works :slightly_frowning_face: Anyway, the format seems intuitive enough, so that shouldn’t be much of a problem.

My ~/.config/monitors.xml contains similar entries for the same setup:

<monitors version="2">
  <configuration>
    <logicalmonitor>
      <x>0</x>
      <y>0</y>
      <scale>1</scale>
      <primary>yes</primary>
      <monitor>
        <monitorspec>
          <connector>eDP-1</connector>
          <vendor>AUO</vendor>
          <product>0xaf90</product>
          <serial>0x00000000</serial>
        </monitorspec>
        <mode>
          <width>1920</width>
          <height>1080</height>
          <rate>144.14927673339844</rate>
        </mode>
      </monitor>
    </logicalmonitor>
    <disabled>
      <monitorspec>
        <connector>HDMI-1</connector>
        <vendor>GSM</vendor>
        <product>E2250</product>
        <serial>0x01010101</serial>
      </monitorspec>
    </disabled>
  </configuration>
  <configuration>
    <logicalmonitor>
      <x>0</x>
      <y>0</y>
      <scale>1</scale>
      <monitor>
        <monitorspec>
          <connector>HDMI-1-0</connector>
          <vendor>GSM</vendor>
          <product>E2250</product>
          <serial>0x01010101</serial>
        </monitorspec>
        <mode>
          <width>1920</width>
          <height>1080</height>
          <rate>60</rate>
        </mode>
      </monitor>
    </logicalmonitor>
    <logicalmonitor>
      <x>0</x>
      <y>1080</y>
      <scale>1</scale>
      <primary>yes</primary>
      <monitor>
        <monitorspec>
          <connector>eDP</connector>
          <vendor>AUO</vendor>
          <product>0xaf90</product>
          <serial>0x00000000</serial>
        </monitorspec>
        <mode>
          <width>1920</width>
          <height>1080</height>
          <rate>144.14927673339844</rate>
        </mode>
      </monitor>
    </logicalmonitor>
  </configuration>
  <configuration>
    <logicalmonitor>
      <x>0</x>
      <y>1080</y>
      <scale>1</scale>
      <primary>yes</primary>
      <monitor>
        <monitorspec>
          <connector>eDP-1-0</connector>
          <vendor>AUO</vendor>
          <product>0xaf90</product>
          <serial>0x00000000</serial>
        </monitorspec>
        <mode>
          <width>1920</width>
          <height>1080</height>
          <rate>144.14927673339844</rate>
        </mode>
      </monitor>
    </logicalmonitor>
    <logicalmonitor>
      <x>0</x>
      <y>0</y>
      <scale>1</scale>
      <monitor>
        <monitorspec>
          <connector>HDMI-0</connector>
          <vendor>GSM</vendor>
          <product>E2250</product>
          <serial>0x01010101</serial>
        </monitorspec>
        <mode>
          <width>1920</width>
          <height>1080</height>
          <rate>60</rate>
        </mode>
      </monitor>
    </logicalmonitor>
  </configuration>
</monitors>

Notice how the external monitor (a good old LG E2250) is sometimes identified as connected to HDMI-1, sometimes to HDMI-1-0, and sometimes to HDMI-0. Also, notice how only the last configuration is positioned correctly (the primary monitor below the secondary one).

So, a couple of questions:

  1. How do I know the correct HDMI connector id GNOME+Wayland is seeing?
  2. In case there are multiple <configuration> entries, which one is used? Does GNOME automatically choose the right one? Or can it get confused?
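
Meanwhile, I found that one possible way to answer question 1 is to ask mutter directly over D-Bus: the GetCurrentState method on the org.gnome.Mutter.DisplayConfig interface (the same component that writes monitors.xml) should list the connector names GNOME itself is using. A sketch, to be run inside the GNOME session:

```shell
# Dump mutter's current view of the monitors, including connector names:
gdbus call --session \
  --dest org.gnome.Mutter.DisplayConfig \
  --object-path /org/gnome/Mutter/DisplayConfig \
  --method org.gnome.Mutter.DisplayConfig.GetCurrentState
```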