NVidia Optimus works well under Linux when configured properly. Install the base Ubuntu system and complete the following steps to enable high-quality 3D graphics using the NVidia GPU:
Step 1: Add the Ubuntu Graphics PPA to APT Sources
Add the Ubuntu Graphics Drivers PPA to the system to gain access to the latest NVidia drivers:
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
Step 2: Install NVidia Driver and Bumblebee Switching Application
Install the Bumblebee application, its supporting components, and the desired NVidia proprietary driver to enable GPU switching. Accept any other required packages identified by the package management system.
sudo apt-get install bumblebee bumblebee-nvidia bbswitch-dkms nvidia-367 nvidia-367-dev libcuda1-367 nvidia-opencl-icd-367 nvidia-prime primus
Step 3: Edit Bumblebee Configuration Files
Modify the /etc/bumblebee/bumblebee.conf file so that it points to the correct NVidia driver: the version of the proprietary driver you installed in the previous step. At the time of writing the latest version was nvidia-367. As newer versions are released this will change, and you will need to modify this file to reflect the installed version.
You must change the specified driver, bridge, and default library locations at the bottom of the file under the nvidia section. See the example below, taking note of the Driver, Bridge, KernelDriver, LibraryPath, and XorgModulePath lines.
# Configuration file for Bumblebee. Values should **not** be put between quotes

## Server options. Any change made in this section will need a server restart
# to take effect.
[bumblebeed]
# The secondary Xorg server DISPLAY number
VirtualDisplay=:8
# Should the unused Xorg server be kept running? Set this to true if waiting
# for X to be ready is too long and don't need power management at all.
KeepUnusedXServer=false
# The name of the Bumbleblee server group name (GID name)
ServerGroup=bumblebee
# Card power state at exit. Set to false if the card shoud be ON when Bumblebee
# server exits.
TurnCardOffAtExit=false
# The default behavior of '-f' option on optirun. If set to "true", '-f' will
# be ignored.
NoEcoModeOverride=false
# The Driver used by Bumblebee server. If this value is not set (or empty),
# auto-detection is performed. The available drivers are nvidia and nouveau
# (See also the driver-specific sections below)
Driver=nvidia
# Directory with a dummy config file to pass as a -configdir to secondary X
XorgConfDir=/etc/bumblebee/xorg.conf.d

## Client options. Will take effect on the next optirun executed.
[optirun]
# Acceleration/ rendering bridge, possible values are auto, virtualgl and
# primus.
Bridge=primus
# The method used for VirtualGL to transport frames between X servers.
# Possible values are proxy, jpeg, rgb, xv and yuv.
VGLTransport=proxy
# List of paths which are searched for the primus libGL.so.1 when using
# the primus bridge
PrimusLibraryPath=/usr/lib/x86_64-linux-gnu/primus:/usr/lib/i386-linux-gnu/primus
# Should the program run under optirun even if Bumblebee server or nvidia card
# is not available?
AllowFallbackToIGC=false

# Driver-specific settings are grouped under [driver-NAME]. The sections are
# parsed if the Driver setting in [bumblebeed] is set to NAME (or if auto-
# detection resolves to NAME).
# PMMethod: method to use for saving power by disabling the nvidia card, valid
# values are: auto - automatically detect which PM method to use
#             bbswitch - new in BB 3, recommended if available
#             switcheroo - vga_switcheroo method, use at your own risk
#             none - disable PM completely
# https://github.com/Bumblebee-Project/Bumblebee/wiki/Comparison-of-PM-methods

## Section with nvidia driver specific options, only parsed if Driver=nvidia
[driver-nvidia]
# Module name to load, defaults to Driver if empty or unset
KernelDriver=nvidia-367
PMMethod=auto
# colon-separated path to the nvidia libraries
LibraryPath=/usr/lib/nvidia-367:/usr/lib32/nvidia-367
# comma-separated path of the directory containing nvidia_drv.so and the
# default Xorg modules path
XorgModulePath=/usr/lib/nvidia-367/xorg,/usr/lib/xorg/modules
XorgConfFile=/etc/bumblebee/xorg.conf.nvidia

## Section with nouveau driver specific options, only parsed if Driver=nouveau
[driver-nouveau]
KernelDriver=nouveau
PMMethod=auto
XorgConfFile=/etc/bumblebee/xorg.conf.nouveau
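When a newer driver version is installed, every version-specific setting can be updated in one pass with sed. The snippet below is a sketch that operates on a scratch copy containing just the three version-specific lines from the [driver-nvidia] section; on a real system you would point it (with sudo) at /etc/bumblebee/bumblebee.conf. The target version nvidia-370 is a hypothetical example.

```shell
# Hypothetical newer driver version; substitute whatever version you installed.
NEW=nvidia-370

# Scratch copy of the three version-specific lines from [driver-nvidia].
cat > /tmp/bumblebee.conf.sample <<'EOF'
KernelDriver=nvidia-367
LibraryPath=/usr/lib/nvidia-367:/usr/lib32/nvidia-367
XorgModulePath=/usr/lib/nvidia-367/xorg,/usr/lib/xorg/modules
EOF

# Rewrite every nvidia-367 reference to the new version in place.
sed -i "s/nvidia-367/${NEW}/g" /tmp/bumblebee.conf.sample

# Show the updated lines.
grep "$NEW" /tmp/bumblebee.conf.sample
```

Remember to restart the bumblebeed service (or reboot) after editing the real file so the change takes effect.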
Step 4: Edit NVidia Configuration File to Prevent Screen Tearing
Modify the /etc/bumblebee/xorg.conf.nvidia file, adding the following option to the end of the “Device” section. This prevents screen tearing when using the NVidia GPU under Linux by forcing the driver to maintain a constant, locked output of 60 FPS.
Option "RegistryDwords" "PerfLevelSrc=0x2222"
For reference here is my working /etc/bumblebee/xorg.conf.nvidia file:
Section "ServerLayout"
    Identifier "Layout0"
    Option "AutoAddDevices" "false"
    Option "AutoAddGPU" "false"
EndSection

Section "Device"
    Identifier "DiscreteNvidia"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"

    # If the X server does not automatically detect your VGA device,
    # you can manually set it here.
    # To get the BusID prop, run `lspci | egrep 'VGA|3D'` and input the data
    # as you see in the commented example.
    # This Setting may be needed in some platforms with more than one
    # nvidia card, which may confuse the proprietary driver (e.g.,
    # trying to take ownership of the wrong device). Also needed on Ubuntu 13.04.
    BusID "PCI:01:00:0"

    # Setting ProbeAllGpus to false prevents the new proprietary driver
    # instance spawned to try to control the integrated graphics card,
    # which is already being managed outside bumblebee.
    # This option doesn't hurt and it is required on platforms running
    # more than one nvidia graphics card with the proprietary driver.
    # (E.g. Macbook Pro pre-2010 with nVidia 9400M + 9600M GT).
    # If this option is not set, the new Xorg may blacken the screen and
    # render it unusable (unless you have some way to run killall Xorg).
    Option "ProbeAllGpus" "false"

    Option "NoLogo" "true"
    Option "UseEDID" "false"
    Option "UseDisplayDevice" "none"
    Option "RegistryDwords" "PerfLevelSrc=0x2222"
EndSection
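If you prefer not to edit the file by hand, the option from Step 4 can be added with sed. The sketch below runs against a minimal scratch copy of the "Device" section and inserts the option immediately after the Driver line (which appears only once in the real file, making it a safer anchor than EndSection); on a real system, target /etc/bumblebee/xorg.conf.nvidia with sudo and review the result before rebooting.

```shell
# Scratch copy of a minimal "Device" section; on a real system target
# /etc/bumblebee/xorg.conf.nvidia (with sudo) instead.
cat > /tmp/xorg.conf.nvidia.sample <<'EOF'
Section "Device"
    Identifier "DiscreteNvidia"
    Driver "nvidia"
EndSection
EOF

# Insert the tear-free option right after the Driver line inside
# the "Device" section.
sed -i '/Driver "nvidia"/a Option "RegistryDwords" "PerfLevelSrc=0x2222"' \
    /tmp/xorg.conf.nvidia.sample

cat /tmp/xorg.conf.nvidia.sample
```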
Step 5: Set the Intel GPU as the Primary system GPU
Set the Intel driver as the default system GPU to save battery life. If you want the NVidia GPU to render everything, you can change the prime-select argument to nvidia, but this will significantly reduce overall battery life.
sudo prime-select intel
Step 6: Install VirtualGL Package
Download and install the VirtualGL package, which provides a 3D test application for exercising the GPU before attempting to launch real applications with the NVidia GPU. The VirtualGL package can be obtained directly from their website.
Make sure to download the latest DEB version for your architecture; if you are using 64-bit Ubuntu, that is the amd64 package. Once the package has been downloaded, install it using your preferred package installation method, then symlink the /opt/VirtualGL/bin/glxspheres64 binary to /usr/bin/glxspheres so it can be run easily from the command line.
sudo ln -s /opt/VirtualGL/bin/glxspheres64 /usr/bin/glxspheres
Step 7: Reboot the System
Reboot the system.
Step 8: Run GLXSPHERES Using Intel and NVidia GPUs
Once logged in following the reboot, open a terminal and verify that the Intel and NVidia drivers are working properly. Use the optirun command as a prefix to launch applications with the NVidia GPU. Applications you want to run using the Intel GPU do not require the optirun prefix.
Run glxspheres using the Intel i915 driver:
xipher@RazrBlade:~$ glxspheres
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
Visual ID of window: 0x94
Context is Direct
OpenGL Renderer: Mesa DRI Intel(R) HD Graphics 530 (Skylake GT2)
60.188675 frames/sec - 67.170561 Mpixels/sec
Run glxspheres using the NVidia driver:
xipher@RazrBlade:~$ optirun glxspheres
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
Visual ID of window: 0x94
Context is Direct
OpenGL Renderer: GeForce GTX 970M/PCIe/SSE2
61.936621 frames/sec - 69.121269 Mpixels/sec
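The line that matters in each run is "OpenGL Renderer", which tells you which GPU actually rendered the frames. Here is a small sketch that extracts it from captured glxspheres output; the sample text is the output shown above, and on a real system you would substitute the output of glxspheres or optirun glxspheres.

```shell
# Sample of the optirun output shown above; on a real system capture it with
# something like: out="$(timeout 3 optirun glxspheres 2>/dev/null)"
out='Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
Visual ID of window: 0x94
Context is Direct
OpenGL Renderer: GeForce GTX 970M/PCIe/SSE2
61.936621 frames/sec - 69.121269 Mpixels/sec'

# Pull out just the renderer string.
renderer=$(printf '%s\n' "$out" | sed -n 's/^OpenGL Renderer: //p')
echo "$renderer"

# A Mesa/Intel renderer here would mean optirun did not engage the NVidia GPU.
case "$renderer" in
  *GeForce*|*NVIDIA*) echo "Discrete GPU active" ;;
  *)                  echo "Integrated GPU active" ;;
esac
```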
OPTIONAL: Configure Steam Games to Use NVidia GPU
Steam applications must be launched using a special pre-load command to use the NVidia GPU. In your Steam library, right-click the game you want to run with the NVidia GPU and select the ‘Properties’ option. On the window that opens, select ‘Set Launch Options…’.
Use the following launch option to tell Steam to run the game with the NVidia GPU:
LD_PRELOAD="libpthread.so.0 libGL.so.1" __GL_THREADED_OPTIMIZATIONS=1 optirun %command%
OPTIONAL: Configure NVidia GPU to Enable FXAA for all Applications
Many applications do not have an option to enable anti-aliasing within the application’s settings. You can enable FXAA (Fast Approximate Anti-Aliasing, a low-cost implementation of AA) for all applications in the nvidia-settings application. This applies anti-aliasing at the driver level, providing it to all applications run using the NVidia GPU.
The nvidia-settings application must be launched using the optirun command so that the driver can access the NVidia GPU. By default Bumblebee uses display :8 for the NVidia GPU (the VirtualDisplay setting in bumblebee.conf), so use the -c :8 flag to tell nvidia-settings which X display to control.
xipher@RazrBlade:~$ optirun nvidia-settings -c :8