Gaming on Linux is better today than it ever has been, but that doesn’t mean it is perfect. NVIDIA’s drivers are notoriously finicky, especially when you want to use more advanced features like DLSS or Frame Generation.
After one too many times fighting with a configuration file, I decided to try a third-party alternative instead.
NVIDIA’s frame gen is hit-and-miss on Linux
It has gotten better, but it isn’t perfect
NVIDIA’s graphics drivers on Linux, which include Frame Generation and DLSS, have improved dramatically over the last few years. A lot of the credit goes to Valve for their work on Proton—without it, Frame Generation wouldn’t be possible on Linux at all.
However, despite the significant improvements, Frame Generation (and DLSS) on Linux is still unreliable. Sometimes, after an update to Proton or your NVIDIA drivers, the option to enable Frame Gen disappears completely. On a handful of occasions, I’ve had to use experimental versions of Proton or track down specific flags to enable Frame Gen at all.
Even when you can enable it, you’ll find a lot of complaints about wildly inconsistent frame rates, jittery or distorted interfaces, or performance far below what you’d get on Windows.
That problem is exacerbated by the fact that not all RTX cards support every version of DLSS or Frame Gen. What works for someone with an RTX 2070 might not work for someone using an RTX 5070 Ti.
Those inconsistencies ultimately led me to look for something more reliable.
There is a $7 third-party alternative
lsfg-vk builds on Lossless Scaling
Lossless Scaling is a popular Windows application that brings frame generation and upscaling to almost any PC—no modern GPU with hardware support for AI features required. I use it on my laptop all the time, and it can often turn an unplayable game into a decent one.
There is only one major snag for Linux users: Lossless Scaling is only for Windows.
That is where lsfg-vk comes in. Lsfg-vk relies on the frame generation algorithm included with Lossless Scaling, and it hooks into the Vulkan API to inject interpolated frames. That might sound limiting, since many games—especially older ones—rely on DirectX rather than Vulkan.
However, Proton includes two translation layers (DXVK and VKD3D) that automatically convert DirectX API calls into Vulkan calls. That means you can use frame generation with almost any Windows game on Linux, even games that don’t natively use Vulkan.
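If you want to confirm that a game is actually running through one of those translation layers, Proton's logging can help. The launch option below is Proton's documented PROTON_LOG toggle; the log filename pattern and the exact log contents vary by Proton version, so treat the grep as a rough sketch:

```shell
# In the game's Steam launch options, enable Proton logging:
PROTON_LOG=1 %command%

# Proton writes a log (typically steam-<appid>.log) to your home
# directory. Search it for DXVK activity to confirm DirectX calls
# are being translated to Vulkan:
grep -i -m 5 "dxvk" ~/steam-*.log
```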
It isn’t just NVIDIA—it works with everything
AMD, Intel, and Integrated GPUs can all use it
Lsfg-vk shares another thing with Lossless Scaling: it runs on almost any modern hardware.
The only strict requirement is that the GPU needs to support Vulkan 1.3, but that is pretty easy to meet. Vulkan 1.3 has been around since 2022. If you have a GPU made in the last 10 years, it is very likely that lsfg-vk will work for you.
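If you aren't sure whether your GPU and driver support Vulkan 1.3, the vulkaninfo utility (from your distro's vulkan-tools package) can tell you:

```shell
# Print a summary of your Vulkan driver and check the API version.
# You want to see 1.3.x or newer here:
vulkaninfo --summary | grep -i apiVersion
```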
Besides that, lsfg-vk isn’t too picky about your GPU manufacturer—it’ll run on AMD, NVIDIA, and Intel without a problem. I use it on both my laptop (which has an AMD integrated GPU) and my desktop, which has an RTX 5070 Ti in it.
In general, AMD GPUs tend to see the biggest performance increase, since lsfg-vk has a specific option (allow_fp16) that AMD cards benefit from while NVIDIA and Intel cards don't.
lsfg-vk works on handhelds like the Steam Deck
Lsfg-vk also works on any x86-based handheld gaming platform, which includes Asus’s ROG Ally lineup, the Steam Deck, and Lenovo’s Legion Go. It has become so popular that there is a dedicated Decky plugin that makes installing and using lsfg-vk easier on Steam Decks.
Not every game is a great candidate for frame generation on a handheld, but it can dramatically improve your performance in some titles. If you’re struggling to get the performance you want, I’d certainly recommend that you try it.
Getting lsfg-vk working on Linux
Some configuration required
Lsfg-vk isn’t quite a one-click setup, but it is pretty straightforward. First, you need to have Lossless Scaling installed through Steam, since lsfg-vk relies on some of its assets to work.
Once you have Lossless Scaling installed, there are some dependencies you may need to install in advance. I’m using Kubuntu, which is based on Ubuntu, so I ran:
sudo apt install qt6-qpa-plugins libqt6quick6 qml6-module-qtquick-controls qml6-module-qtquick-layouts qml6-module-qtquick-window qml6-module-qtquick-dialogs qml6-module-qtqml-workerscript qml6-module-qtquick-templates qml6-module-qt-labs-folderlistmodel
If you’re running a distro based on Arch or Fedora (like Bazzite), the commands you need to use will be different.
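For reference, these are my best guesses at the equivalent Qt 6 QML runtime packages on Fedora and Arch—the names may not match your distro's repositories exactly, so search before installing:

```shell
# Fedora (package names approximate -- verify with `dnf search qt6`):
sudo dnf install qt6-qtdeclarative qt6-qtbase-gui

# Arch (most Qt 6 QML modules ship in qt6-declarative):
sudo pacman -S --needed qt6-declarative qt6-base
```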
With those installed, all you need to do is download the latest stable version of lsfg-vk from GitHub and run the installer using the following command:
sudo apt install ./lsfg-vk-1.0.0.x86_64.deb
If you’re using Fedora or Arch, you’ll need to use DNF or Pacman respectively.
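The equivalent local-package installs look something like this—the filenames are illustrative, so check the actual asset names on the lsfg-vk GitHub releases page:

```shell
# Fedora: install a local RPM with dnf (filename illustrative):
sudo dnf install ./lsfg-vk-1.0.0.x86_64.rpm

# Arch: install a local package with pacman -U (filename illustrative):
sudo pacman -U ./lsfg-vk-1.0.0-x86_64.pkg.tar.zst
```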
Then you’ll be able to launch lsfg-vk from whichever application launcher your distro uses. I’d recommend creating a profile for each game you’re going to be playing, since not every game is an ideal candidate for frame generation.
The developer has provided a comprehensive explanation of what each profile setting does and how it will affect your gameplay.
I’d recommend reading it if you run into trouble.
Lsfg-vk is a great feature for Linux gamers
Gaming on Linux still isn’t perfect, but software like lsfg-vk and Proton are rapidly closing the gap with Windows. Some games now even run *better* on Linux—something that was unthinkable 10 years ago.
For everything that doesn’t, lsfg-vk is a great way to eke out some extra frames in the meantime.
