
Questions tagged [gpu]

A GPU (graphics processing unit) is a specialized processor designed to accelerate the rendering of images.

0 votes
2 answers
116 views

I'm a beginner in game programming and I want someone to confirm my understanding. Let's say there's a 3D model for a character in the game. This character is made up of triangles. Each vertex in ...
asked by marcelo kmsaw
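The mental model in this excerpt (a character mesh is a set of triangles whose corners are shared vertices) can be sketched in a few lines. This is an illustrative layout only, not any engine's API, but it mirrors how GPUs typically consume mesh data (a vertex buffer plus an index buffer):

```python
# A tiny mesh: a vertex list plus index triples referring into it.
vertices = [
    (0.0, 0.0, 0.0),   # each vertex: (x, y, z) position
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (1.0, 1.0, 0.0),
]
# Two triangles sharing an edge form a quad; sharing indices avoids
# duplicating vertex data.
triangles = [(0, 1, 2), (2, 1, 3)]

def triangle_positions(tri):
    """Resolve an index triple to its three vertex positions."""
    return [vertices[i] for i in tri]

print(triangle_positions(triangles[0]))
```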
0 votes
2 answers
311 views

I'm using OpenGL but this question should apply generally to rendering. I understand that for efficient rendering in games, you want to minimize communication between the CPU and GPU. This means pre-...
asked by greenlagoon
1 vote
0 answers
867 views

I have a simple (and maybe quite naive) question regarding dual-GPU use for Virtual Reality (VR); it has nagged me for years and I couldn't figure out why this can't work, at least in principle: I ...
asked by Felix Tritschler
0 votes
1 answer
2k views

My game needs to loop through a massive amount of data, and the amount of data can increase greatly depending on the world settings set by the player. The data is too big for the CPU, so I need to use the GPU for it ...
asked by pi squared
1 vote
1 answer
408 views

I've made a fluid simulation using particles in Unity, but now it is painfully slow because all computations are done using the CPU. In order to make it faster, I need to do computations on the GPU, ...
asked by UserUser (171)
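For context on the CPU-to-GPU port described above, the key property is that each particle's update is independent of every other particle's, which is exactly the shape of work that maps to a compute shader (one thread per particle). A hedged sketch in plain Python, with illustrative names, standing in for the GPU version:

```python
# Each loop iteration reads and writes only its own particle, so all
# iterations could run in parallel -- this is what a compute shader exploits.
def step(positions, velocities, dt, gravity=-9.81):
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += gravity * dt           # integrate velocity
        new_vel.append((vx, vy))
        new_pos.append((x + vx * dt, y + vy * dt))  # integrate position
    return new_pos, new_vel

pos, vel = [(0.0, 10.0)], [(1.0, 0.0)]
pos, vel = step(pos, vel, dt=0.5, gravity=-10.0)
print(pos, vel)  # [(0.5, 7.5)] [(1.0, -5.0)]
```

Steps that couple particles (e.g. pressure forces between neighbours) need more care, since threads then read each other's data.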
0 votes
0 answers
1k views

I have the following C# code in Unity version 2022.2.0a12: ...
asked by Corvus Ultima
0 votes
0 answers
235 views

I built a small game engine using OpenCL and SDL2, but it's not running very fast compared to Vulkan and OpenGL. I wrote rasterization code, but when I did some research, Vulkan and OpenGL use hardware ...
asked by is code (31)
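To make the comparison above concrete: the per-pixel coverage test below is the core of what a software rasterizer (like the OpenCL one described) must execute itself, and what GPUs implement in fixed-function hardware. A hedged sketch of the standard edge-function approach:

```python
def edge(ax, ay, bx, by, px, py):
    # Twice the signed area of triangle (a, b, p); the sign tells which
    # side of edge a->b the point p lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def covers(tri, px, py):
    """True if pixel (px, py) is inside the triangle, either winding."""
    (ax, ay), (bx, by), (cx, cy) = tri
    w0 = edge(ax, ay, bx, by, px, py)
    w1 = edge(bx, by, cx, cy, px, py)
    w2 = edge(cx, cy, ax, ay, px, py)
    # Inside when p is on the same side of all three edges.
    return (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
           (w0 <= 0 and w1 <= 0 and w2 <= 0)

tri = [(0, 0), (4, 0), (0, 4)]
print(covers(tri, 1, 1))  # True: inside
print(covers(tri, 3, 3))  # False: outside
```

Hardware rasterizers evaluate these edge functions for many pixels per clock, which is a large part of the speed gap the question observes.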
1 vote
1 answer
754 views

I'm trying to render a bunch of clouds via gpu instancing. This works perfectly fine with the default HDRP Lit Shader, and I get the following result: However, as soon as I change the surface type ...
asked by person132
0 votes
1 answer
324 views

I'm learning Godot on a laptop that has an AMD discrete GPU. My OS is Arch Linux, so if I want to use the discrete GPU I have to set the system environment variable ...
asked by ArchBug (21)
0 votes
0 answers
39 views

When rendering into an FBO's texture, I'm not using glClear() but overwriting each fragment, with GL_BLEND enabled. This works just fine, but I just realised that when my laptop switches to economy mode, ...
asked by ebkgne (21)
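One detail worth noting about the setup above: with blending enabled, a fragment is mixed with whatever is already in the framebuffer, so skipping glClear() means any stale (formally undefined) contents feed into the result. A hedged sketch of GL's common alpha blend (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):

```python
def blend(src, src_alpha, dst):
    """out = src * alpha + dst * (1 - alpha), per channel."""
    return tuple(s * src_alpha + d * (1.0 - src_alpha)
                 for s, d in zip(src, dst))

stale = (0.5, 0.0, 0.0)                 # leftover framebuffer value
out = blend((0.0, 1.0, 0.0), 0.5, stale)
print(out)  # (0.25, 0.5, 0.0): half the stale red survives the blend
```

On a different GPU (e.g. after a power-mode switch moves rendering to the integrated GPU), the uncleared contents can differ, which may explain behaviour changing between modes.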
0 votes
0 answers
2k views

I am attempting to make a 3D game using SDL2, just to learn and have a bit of fun. I was wondering if there is any way to get SDL2 to do calculations on the GPU. I have read that SDL2 textures use the GPU for ...
asked by Apple_Banana
0 votes
1 answer
282 views

I was doing research, trying to answer the question "Which was the first GPU to support 24-bit color". I know all color since 1992 is 24-bit, even in games like Doom. I mean 16 million ...
asked by Boris Rusev
0 votes
0 answers
68 views

Does each core's data get written to the shared memory one at a time, or both at the same time? For instance, when two cores next to each other need to write to the same memory spot, does the ...
asked by user11937382
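The hazard behind this question can be illustrated without any GPU at all. If two "cores" each perform an unsynchronized read-modify-write of the same cell, one update is lost; this is why hardware either serializes conflicting writes or requires atomic operations. A hedged, sequential simulation of the race:

```python
memory = {"cell": 0}

def unsynchronized_increment(snapshot):
    # Each core computes its new value from its own (possibly stale) read.
    return snapshot + 1

core_a = unsynchronized_increment(memory["cell"])  # both cores read 0
core_b = unsynchronized_increment(memory["cell"])
memory["cell"] = core_a   # writes 1
memory["cell"] = core_b   # writes 1 again -- core A's update is lost
print(memory["cell"])     # 1, not 2
```

With an atomic add, each increment's read and write form one indivisible step, so the result would be 2.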
0 votes
1 answer
240 views

I have implemented a two pass Gaussian blur shader in GLSL like this: ...
asked by racz16 (171)
3 votes
1 answer
1k views

I would like to render a grid with many dots, all equally spaced and exactly 1 pixel wide. What I was able to achieve so far is this: What I would like is (simulated using image editing ...
asked by tigrou (3,279)
1 vote
1 answer
346 views

This is more of a theoretical question. This is what I understand regarding buffer swapping and vsync: I - When vsync is off, whenever the developer swaps the front/back buffers, the buffer that the GPU ...
asked by felipeek (131)
3 votes
2 answers
2k views

I'm interested in game development and 3D graphics; however, I'm not very experienced, so I apologize in advance if this comes across as ignorant or overly general. The impression I get is that quite ...
asked by Time4Tea (133)
4 votes
3 answers
4k views

I'm currently learning OpenGL and it's become obvious that I can do most calculations on the CPU or the GPU. For example, I can do something like lightColor * objectColor on the CPU, then send it to ...
asked by Adam (86)
1 vote
1 answer
597 views

I am trying to read data from a resource, which used to work without any problems, but suddenly it is not working. First, I made an immutable resource that has data in it, which is ...
asked by KIM CHANGJUN
1 vote
0 answers
337 views

I'm reading this example of ffmpeg hardware decoding: https://github.com/FFmpeg/FFmpeg/blob/release/4.2/doc/examples/hw_decode.c At line 109 it does this: ...
asked by Poperton (111)
0 votes
1 answer
284 views

I'm working on a deferred shading renderer in OpenGL, where I write all geometry output to Colour, Normal and Depth textures and then apply lighting effects later. Everything seems fine except once ...
asked by Vercidium
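The deferred pipeline described above splits rendering into two stages: a geometry pass that writes surface attributes into G-buffer textures, and a lighting pass that reads them back per pixel. A hedged sketch of the lighting pass for one pixel, using simple Lambert shading (the field names are illustrative, not the question's actual layout):

```python
def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(albedo, normal, light_dir):
    """Lambert lighting applied after geometry has been rasterized."""
    n_dot_l = max(dot3(normal, light_dir), 0.0)   # clamp back-facing to 0
    return tuple(c * n_dot_l for c in albedo)

# One G-buffer texel: color (albedo) and surface normal, as the geometry
# pass would have written them.
gbuffer_pixel = {"albedo": (1.0, 0.5, 0.0), "normal": (0.0, 0.0, 1.0)}
out = shade(gbuffer_pixel["albedo"], gbuffer_pixel["normal"], (0.0, 0.0, 1.0))
print(out)  # (1.0, 0.5, 0.0): light hits the surface head-on
```

Depth is typically also read in this pass to reconstruct the pixel's world position for point and spot lights.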
1 vote
1 answer
181 views

I created a simple 2D scene in Unity 2017.3.1f1. I changed the size (height and width) in the Game View and profiled to see how it affects the rendering (photo below). I saw that the rendering time ...
asked by sam (9)
6 votes
0 answers
92 views

I'm testing adding some OpenCL code to my game, but I only have a single Nvidia card and I'm not sure the code will run normally on other platforms. Is there any way to make sure my code runs ...
asked by ravenisadesk
1 vote
1 answer
663 views

I've been learning Vulkan lately and I read that you can allocate VRAM only a set number of times, and it doesn't matter if it's 2 GB or 2 KB; why is that? I'm specifically referring to ...
asked by Werem (148)
1 vote
1 answer
692 views

Texture splatting is usually done by vertex painting, where each channel (R, G, B, A) is assigned as a different texture weight. Given the way shaders are executed, doesn't that mean the fragment ...
asked by JBeurer (467)