GPU Conceptual Overview

0:07 Recap and glimpse into the future of streaming via a video capture card
2:06 Run the game and note that Casey can see the game and we can't
3:00 Recap where we're at
4:31 Blackboard: GPUs
7:00 Blackboard: How the CPU relates to the GPU
10:59 Blackboard: A typical mobile setup
13:19 Blackboard: The implications of the system RAM and graphics RAM being separated by a PCI bus, i.e. latency
15:39 Blackboard: CPU vs GPU, historically
22:42 Blackboard: A high level overview of GPU architecture
29:03 Blackboard: How a GPU core works
31:38 Blackboard: "Shader" != "CPU code"
33:05 Blackboard: How if statements are executed on a GPU
36:46 Blackboard: How loops are worked through on a GPU
37:51 Blackboard: "Warp"α
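The three blackboard segments above (if statements, loops and warps) describe one execution model: every lane in a warp runs the same instruction stream, and a divergent branch is handled by masking lanes off while both sides execute. As a rough sketch only (not Casey's code, and using a toy 8-lane warp rather than the 32-lane warps / 64-lane wavefronts real hardware uses), here is how that lockstep execution could be simulated in C:

    #include <stdio.h>
    #include <stdint.h>

    #define WARP_SIZE 8   /* toy size; real warps are 32 lanes (NVIDIA) or 64 (AMD wavefronts) */

    /* Simulate one warp executing: if (x[lane] > 0) y = 2*x; else y = -x;
       Every lane walks through BOTH branches; a per-lane mask decides who writes. */
    static void warp_execute(const int *x, int *y)
    {
        uint32_t take_then = 0;

        /* Evaluate the condition in lockstep and build the lane mask. */
        for (int lane = 0; lane < WARP_SIZE; ++lane)
            if (x[lane] > 0)
                take_then |= 1u << lane;

        /* Run the "then" side; lanes with a clear mask bit just idle. */
        for (int lane = 0; lane < WARP_SIZE; ++lane)
            if (take_then & (1u << lane))
                y[lane] = 2 * x[lane];

        /* Run the "else" side with the mask inverted. */
        for (int lane = 0; lane < WARP_SIZE; ++lane)
            if (!(take_then & (1u << lane)))
                y[lane] = -x[lane];
    }

    int main(void)
    {
        int x[WARP_SIZE] = {3, -1, 4, -1, 5, -9, 2, -6};
        int y[WARP_SIZE];

        warp_execute(x, y);
        for (int lane = 0; lane < WARP_SIZE; ++lane)
            printf("lane %d: x = %2d  y = %2d\n", lane, x[lane], y[lane]);
        return 0;
    }

Loops behave the same way: lanes that finish early are masked off until the slowest lane in the warp is done, so the whole warp pays for its worst-case iteration count.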
38:28 Blackboard: Summarise what a GPU is
41:16 Blackboard: How we program for the GPU
46:14 Blackboard: "Pushbuffer"
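The "Pushbuffer" segment is the heart of how the CPU and GPU talk: the CPU never calls into the GPU directly, it appends commands to a buffer that the driver hands over for the GPU to chew through later. The layout below is hypothetical (cmd_header, push_command and the command enum are made up for illustration; real drivers and APIs define their own formats), but it shows the shape of the idea in C:

    #include <stdint.h>
    #include <stddef.h>
    #include <assert.h>

    /* A hypothetical command stream: fixed-size headers, each followed by a
       payload. The CPU appends; the GPU/driver walks it front to back. */
    typedef enum { CMD_CLEAR, CMD_DRAW_TRIANGLES, CMD_END } cmd_type;

    typedef struct {
        uint32_t type;   /* one of cmd_type */
        uint32_t size;   /* bytes of payload following this header */
    } cmd_header;

    typedef struct {
        uint8_t *base;
        size_t   used;
        size_t   capacity;
    } push_buffer;

    /* Reserve a header plus payload; the caller fills in the payload. */
    static void *push_command(push_buffer *pb, cmd_type type, size_t payload_size)
    {
        size_t total = sizeof(cmd_header) + payload_size;
        assert(pb->used + total <= pb->capacity);   /* real code would flush or grow */

        cmd_header *h = (cmd_header *)(pb->base + pb->used);
        h->type = (uint32_t)type;
        h->size = (uint32_t)payload_size;
        pb->used += total;
        return h + 1;
    }

    int main(void)
    {
        static _Alignas(16) uint8_t storage[4096];
        push_buffer pb = { storage, 0, sizeof(storage) };

        /* CPU side: record a clear (payload: RGBA) and then end the frame. */
        float *rgba = push_command(&pb, CMD_CLEAR, 4 * sizeof(float));
        rgba[0] = 0.0f; rgba[1] = 0.0f; rgba[2] = 0.0f; rgba[3] = 1.0f;
        push_command(&pb, CMD_END, 0);

        /* The GPU/driver side would now walk pb.base .. pb.base + pb.used and execute. */
        return 0;
    }

A frame then becomes "fill the buffer, submit it, start on the next one". APIs like OpenGL and Direct3D build and submit a stream like this behind the scenes, while Vulkan and console APIs expose the command buffer more directly.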
53:26 Blackboard: Plan for tomorrow
56:38 🗩 Q&A
57:04 🗪 ratchetfreak Instead of two triangles you can use a single triangle twice as big. It avoids the overdraw at the diagonal
58:33 🗪 kknewkles Quick CPU question: All it does at the bare-bones physics level is run electrons through if-statements, right? (Go left, go right, etc., with transistors)
59:34 🗪 NoRaD91 Can't you run a Xeon Phi as your main processor, or did they make them dedicated only? Not like price / performance makes sense there
1:00:15 🗪 ChronalDragon What barriers are there currently from just using the CPU as a GPU?
1:01:49 🗪 aceflameseer Can we make a "first person" 3D mode of the game, just for education?
1:01:55 🗪 cubercaleb Where does the PS4 / XBone processor lie on the CPU / GPU spectrum?
1:04:18 🗪 quartertron Intel has killed off many projects that made good money but had bad margins
1:05:56 🗪 Longboolean I've been told that lots of graphics drivers optimize for specific games (at the driver level). How does this fit into the equation? How do those optimizations make some games run better?
1:08:05 🗪 sssmcgrath Why does everyone good who works at Intel hate Intel, yet simultaneously Intel's engineering is so far ahead of everyone else's? It doesn't compute!
1:08:36 🗪 garryjohanson What did you think about Larrabee?
1:08:56 🗪 chr0n0kun Does Vulkan fix the problem with OpenGL of not being able to transfer buffer-objects between processes with separate address spaces?
1:09:36 🗪 angus_holder Intel's compiler is meant to be really good, right?
1:10:11 🗪 Andremm2 Can you give us some insight (without breaking any NDAs) into how different console graphics APIs are from OpenGL?
1:12:26 🗪 chr0n0kun Sharing objects between applications without CPU overhead
1:12:43 🗪 kknewkles Why do you think there are no games about programming / hardware / history of PC / hardware? The domain is unimaginably rich
1:13:22 🗪 hguleryuz How does GDDR RAM for the GPU, or the concept of a "memory chip designed specifically for the GPU", enter into this picture?
1:13:48 Blackboard: The gist of GDDR
1:16:36 🗪 NoRaD91 Could you one day maybe do a pre / after-stream short summary about your thoughts on OS-design and what you would do differently given current hardware?
1:16:57 🗪 kknewkles I mustered up one more. How come The Witness has 4GB RAM as minimum requirement? I don't doubt it has great optimization (as Jon is an apex-level programmer). Is it because nowadays everyone has 4 gigs at least and they thought it's unfeasible or too limiting to go below that? What can be the design behind that requirement?
1:19:47 🗪 elxenoaizd Why does it always seem like PC games claim that they require much more hardware power than they need? Do they want the extra power - just in case something goes wrong - or what?
1:20:59 🗪 chr0n0kun Having OS-level support for GPU resources for compute and graphics so that 3D graphics tools etc. can interoperate efficiently, e.g. in VFX production where you have lots of tools using the same data
1:21:45 🗪 Longboolean What would be a disadvantage of having a big beefy CPU (if any), or are there none?
1:22:00 🗪 Boorocks998 Have you seen that guy who is going to recreate Quake in a Handmade Hero style?
1:22:21 🗪 pankupunka How long do you expect this project to take?
1:22:48 🗪 kknewkles From what I get, Fallout 4 just allocates itself an 8 gig block. Abhorrent
1:23:17 🗪 kil4h Quick question about strict aliasing due to your forum post (not defending it), but how would you propose compilers understand that pointers do not overlap (to optimize loads and so on)? Not sure how we could improve generated code without having that kind of guarantee, assuming we still need to support old code
1:25:27 🗪 Andremm2 Out of curiosity, back then when it came out, was GDI just a wrapper for OpenGL?
1:26:29 🗪 cubercaleb Restrict doesn't seem to work with vs2013 / 2015
1:26:48 🗪 NoRaD91 Isn't Restrict, like, stupid limited though? And you don't have Alias (one that's definitive at least)
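On the restrict / aliasing questions above: restrict is the C99 spelling, and MSVC's own spelling is __restrict (the bare C99 keyword is not accepted by VS2013 / VS2015, which may be what "doesn't seem to work" refers to). It is purely a promise from the programmer that the pointed-to memory is not reached through any other pointer in that scope, which frees the compiler from conservatively reloading values that might have been overwritten. A minimal sketch, not tied to any Handmade Hero code:

    #include <stddef.h>

    /* Without restrict the compiler must assume dst and src could overlap,
       so it reloads src[i] conservatively and vectorizes less aggressively. */
    void scale_may_alias(float *dst, const float *src, float k, size_t n)
    {
        for (size_t i = 0; i < n; ++i)
            dst[i] = src[i] * k;
    }

    /* With restrict we promise the arrays do not overlap, so loads and stores
       may be reordered and vectorized freely. (C99: restrict; MSVC: __restrict.) */
    void scale_no_alias(float *restrict dst, const float *restrict src,
                        float k, size_t n)
    {
        for (size_t i = 0; i < n; ++i)
            dst[i] = src[i] * k;
    }

The memcpy / memmove split in the C library is the same trade-off: memcpy is allowed to assume its buffers do not overlap, memmove is not.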
1:27:18 🗪 Neitchzehrer Why is it that games get more difficult to play as Windows OS gets more advanced, e.g. playing a game from Windows XP on Windows 7?
1:28:00 🗩 Wind down