Vsync forced on glx backend #1203

Open
Shringe opened this issue Feb 17, 2024 · 0 comments

Platform

Arch Linux x86 NVIDIA

GPU, drivers, and screen setup

direct rendering: Yes
Memory info (GL_NVX_gpu_memory_info):
Dedicated video memory: 8192 MB
Total available memory: 8192 MB
Currently available dedicated video memory: 7460 MB
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: NVIDIA GeForce RTX 3070/PCIe/SSE2
OpenGL core profile version string: 4.6.0 NVIDIA 545.29.06
OpenGL core profile shading language version string: 4.60 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 4.6.0 NVIDIA 545.29.06
OpenGL shading language version string: 4.60 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)

OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 545.29.06
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

Environment

qtile and i3 (tested)

picom version

vgit-fc1d1

Diagnostics
**Version:** vgit-fc1d1

Extensions:

  • Shape: Yes
  • RandR: Yes
  • Present: Present

Misc:

  • Use Overlay: Yes
  • Config file used: /etc/xdg/picom.conf

Drivers (inaccurate):

NVIDIA

Backend: glx

  • Driver vendors:
  • GLX: NVIDIA Corporation
  • GL: NVIDIA Corporation
  • GL renderer: NVIDIA GeForce RTX 3070/PCIe/SSE2

Backend: egl

  • Driver vendors:
  • EGL: NVIDIA
  • GL: NVIDIA Corporation
  • GL renderer: NVIDIA GeForce RTX 3070/PCIe/SSE2

Steps of reproduction

  1. Use the NVIDIA proprietary driver(?)
  2. Disable vsync (`vsync = false`)
  3. Enable the glx backend (`backend = "glx"`)
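For reference, a minimal picom.conf sketch matching the steps above (option names are picom's standard `backend` and `vsync` settings; everything else here is assumed defaults):

```
# Reproduction config sketch — glx backend with vsync explicitly disabled
backend = "glx";
vsync = false;
```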

Expected behavior

Vsync should be disabled, as it is with the xrender backend.

Current Behavior

Vsync stays enabled regardless.

Other details

Picom and every fork of it force vsync with the glx backend. This is problematic because under X11, a multi-monitor setup with mixed refresh rates syncs every monitor to the lowest common refresh rate when vsync is enabled. The only way to get each monitor's proper refresh rate in such a setup on X11 is to disable vsync.
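One possible workaround sketch on the proprietary NVIDIA driver (not from this report — an assumption worth testing): the driver's `__GL_SYNC_TO_VBLANK` environment variable can override an application's swap-interval behavior at the driver level, independently of what picom requests.

```shell
# Hypothetical workaround: ask the NVIDIA driver itself not to sync
# GL buffer swaps to vblank, regardless of the compositor's settings.
export __GL_SYNC_TO_VBLANK=0
# Then start the compositor with this environment, e.g.:
# picom --backend glx
echo "__GL_SYNC_TO_VBLANK=$__GL_SYNC_TO_VBLANK"
```

Whether this actually unpins the glx backend from vblank depends on driver version; it does nothing for the xrender backend, which does not present through GL.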

This may or may not be related, but I also cannot get dual_kawase blur to work, and glx CPU usage isn't much lower than xrender's despite being GPU accelerated.
