1:26 Default LANE_WIDTH to 8 if not defined on our build line
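The guard itself is tiny; a minimal sketch (assuming LANE_WIDTH is normally supplied as a compiler define on the build line, e.g. -DLANE_WIDTH=4) might look like:

    // Fall back to 8-wide lanes when the build line does not define LANE_WIDTH.
    #if !defined(LANE_WIDTH)
    #define LANE_WIDTH 8
    #endif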
1:47 🗹 Check out our 2017_10_22_Day01.txt performance dump
2:37 🎨 Admire our 2017_11_19_Day08_8x.bmp image
3:34 📖 Determine to add some importance sampling and / or material properties
4:27 🏃 Try to run the ray caster
5:59 📖 Reacquaint ourselves with CastSampleRays()
10:06 📖 Consider adding some exposure control
10:57 Try increasing the redness of the 4th material's EmitColor from 4 to 20
11:22 🏃 Run the program
11:37 🎨 Admire our image with more red emittance
12:45 Try increasing the redness of the 4th material's EmitColor from 20 to 50
12:52 🏃 Run the program
13:02 🎨 Admire our image with yet more red emittance
14:17 Introduce a second plane
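For context, a plane in a ray caster of this shape is typically just a normal and an offset; the sketch below is an assumption about the layout (the names plane, N, d and MatIndex follow the series' conventions but are not taken from the stream):

    // Sketch only: a plane is a unit normal N and an offset d satisfying
    // Inner(N, P) + d = 0 for points P on the plane, so changing d from 10 to 2
    // (as tried later) moves the plane closer to the origin along its normal.
    struct plane
    {
        v3 N;         // unit plane normal
        f32 d;        // offset term in Inner(N, P) + d = 0
        u32 MatIndex; // index into the scene's material array
    };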
14:47 🏃 Run the program
15:06 🎨 Admire our image with two planes
15:52 Try decreasing the d of the 2nd plane from 10 to 2
15:57 🏃 Run the program
16:29 🎨 Admire our image with the second plane cutting through the blue hemisphere
17:11 🎨 Determine to vary the light based on direction
17:44 Try adding some Green and Blue to the 4th material's EmitColor, and take RaysPerPixel as a command line argument
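A minimal sketch of the command line handling (not the stream's exact code; the default value and the parsing are assumptions):

    #include <stdlib.h>

    typedef unsigned int u32;  // stand-in for the project's u32 typedef

    int main(int ArgCount, char **Args)
    {
        u32 RaysPerPixel = 16;  // default if no argument is passed (assumed value)
        if(ArgCount > 1)
        {
            int Parsed = atoi(Args[1]);
            if(Parsed > 0)
            {
                RaysPerPixel = (u32)Parsed;
            }
        }
        // ... hand RaysPerPixel through to the render loop ...
        return(0);
    }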
18:13 🏃 Run the program with and without passing a RaysPerPixel
18:32 🎨 Admire our image with orange emittance
19:02 📖 Research Disney's standard BRDF (Bidirectional Reflectance Distribution Function)[1,2,3]
25:10 📖 Research reference implementations of Disney's Principled BRDF[4,5,6]
27:34 📖 Prepare to proceed, reading Section 2, "The microfacet model"[7]
31:21 📖 Read Section 3, "Visualizing measured BRDFs"[8], describing a gonioreflectometer[9] and checking out MERL's BRDF database[10]
39:21 📖 Read 'A Data-Driven Reflectance Model'[11]
46:56 📖 Consult the BRDF reference implementation in MERL's BRDF database[12]
47:52 Embark on our BRDF loader, introducing brdf_table and LoadMERLBinary()[13]
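The MERL .binary layout is simple: three 32-bit dimensions followed by dims[0]*dims[1]*dims[2] doubles per colour channel, stored as one block per channel. The loader below is a sketch in that spirit; the brdf_table fields are assumptions rather than the stream's exact struct:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    struct brdf_table
    {
        uint32_t Count[3]; // theta_half, theta_diff, phi_diff sample counts
        double *Values;    // 3 * Count[0]*Count[1]*Count[2] doubles (R block, then G, then B)
    };

    static int LoadMERLBinary(char *FileName, brdf_table *Table)
    {
        FILE *File = fopen(FileName, "rb");
        if(!File) {return 0;}

        int32_t Dims[3];
        if(fread(Dims, sizeof(Dims), 1, File) != 1) {fclose(File); return 0;}

        size_t SampleCount = (size_t)Dims[0]*(size_t)Dims[1]*(size_t)Dims[2];
        Table->Count[0] = (uint32_t)Dims[0];
        Table->Count[1] = (uint32_t)Dims[1];
        Table->Count[2] = (uint32_t)Dims[2];
        Table->Values = (double *)malloc(3*SampleCount*sizeof(double));

        size_t ReadCount = fread(Table->Values, sizeof(double), 3*SampleCount, File);
        fclose(File);

        return(ReadCount == 3*SampleCount);
    }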
54:27 🗹 Organise our MERL .binary files
55:28 Set up to call LoadMERLBinary(), changing it to handle three colour channels[14]
1:00:19 🏃 Run the program, to see no progress printout
1:03:33 🏃 Run the program with 16 RaysPerPixel
1:03:55 🎨 Admire our test.bmp
1:04:12 📖 Determine to sample the BRDF values, first learning about the colour scaling values[15,16]
1:08:41 Change LoadMERLBinary() to read in the colour channels as per the reference implementation[17]
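The reference reader divides the stored doubles down per channel when sampling; the constants below are the ones in MERL's BRDFRead code (worth double-checking against your copy):

    // Per-channel scale factors applied when reading values back out of the table.
    #define RED_SCALE   (1.0/1500.0)
    #define GREEN_SCALE (1.15/1500.0)
    #define BLUE_SCALE  (1.66/1500.0)

    // The three channels are stored as consecutive blocks of
    // N = dims[0]*dims[1]*dims[2] doubles, so a single flat Index is reused as:
    //   Red   = Values[Index]       * RED_SCALE;
    //   Green = Values[Index + N]   * GREEN_SCALE;
    //   Blue  = Values[Index + 2*N] * BLUE_SCALE;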
1:11:09 Upgrade CastSampleRays() to attenuate by the BRDF-sampled reflectance colour, introducing BRDFLookup()
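Conceptually the change is a couple of lines in the bounce loop; the fragment below is a sketch only, assuming the project's v3 helpers (Hadamard) and made-up surrounding variable names, with an assumed BRDFLookup() signature:

    // Each bounce: accumulate emitted light seen through the current
    // attenuation, then scale the attenuation by the BRDF-sampled reflectance
    // instead of a fixed per-material reflection colour.
    Result = Result + Hadamard(Attenuation, Mat->EmitColor);
    v3 Reflectance = BRDFLookup(Mat->BRDF, ViewDir, NextRayDir, Normal, Tangent, Binormal);
    Attenuation = Hadamard(Attenuation, Reflectance);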
1:15:26 Implement BRDFLookup(), informed by the reference implementation[18] and 'A Data-Driven Reflectance Model'[19]
1:29:05 🖌 Coordinate System on a Sphere[20,21]
1:33:14 Make CastSampleRays() generate a coordinate system for our sampling sphere, to pass to BRDFLookup()
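One common way to build such a frame, sketched here assuming the project's v3 helpers (NOZ, Cross, V3, AbsoluteValue); the up-vector trick and names are assumptions:

    // Build an orthonormal tangent/binormal/normal frame around the shading
    // normal so directions can be expressed in tangent space for the lookup.
    v3 Up = (AbsoluteValue(Normal.z) < 0.99f) ? V3(0, 0, 1) : V3(1, 0, 0);
    v3 Tangent = NOZ(Cross(Up, Normal));
    v3 Binormal = Cross(Normal, Tangent); // unit, since Normal and Tangent are unit and perpendicular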
1:37:14 📖 Consult our documents[22,23] on computation of the diff vector
1:39:22 Make BRDFLookup() compute ThetaHalf and PhiHalf, after mapping the LightDir and HalfVector into the tangent space of the reflection
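In the Rusinkiewicz parameterisation used by the MERL data, the half-angle part falls out directly once the directions are in the tangent frame; a sketch (math helpers and names assumed):

    // Half vector between the (unit) view and light directions, and its
    // spherical angles measured in the tangent/binormal/normal frame.
    v3 HalfVector = NOZ(ViewDir + LightDir);
    f32 ThetaHalf = ACos(Inner(HalfVector, Normal));
    f32 PhiHalf   = ATan2(Inner(HalfVector, Binormal), Inner(HalfVector, Tangent));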
1:43:36 Make BRDFLookup() compute ThetaDiff[24]
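Because the rotations that carry the half vector onto the normal preserve angles, ThetaDiff reduces to the angle between the light direction and the half vector; a one-line sketch in the same assumed style:

    f32 ThetaDiff = ACos(Inner(HalfVector, LightDir));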
1:45:44 📖 Come to understand the PhiDiff computation[25,26,27]
1:49:30 Rename Bitangent to Binormal in plane
1:51:45 🖌 Locating a (Light) Vector in a Transformed Coordinate System[28]
1:57:52 🖌 Efficiently locating a (Light) Vector in a Transformed Coordinate System[29]
2:04:37 Make BRDFLookup() efficiently compute the PhiDiff
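A sketch of the shortcut (names assumed): rather than applying the two rotations to LightDir explicitly, build the rotated frame's x and y axes from cross products and project LightDir onto them.

    v3 DiffY = NOZ(Cross(Normal, HalfVector)); // perpendicular to both N and H
    v3 DiffX = NOZ(Cross(DiffY, HalfVector));  // (this NOZ gets dropped in the Q&A)
    f32 PhiDiff = ATan2(Inner(LightDir, DiffY), Inner(LightDir, DiffX));
    if(PhiDiff < 0) {PhiDiff += Pi32;}         // the table folds PhiDiff into [0, pi) by reciprocity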
2:10:10 Introduce ExtractF32() for BRDFLookup() to call[30]
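A hypothetical sketch of such a helper, assuming the ray caster's SIMD lane types where lane_f32 wraps a LANE_WIDTH-wide register; the stream's version may differ:

    // Pull one scalar out of a lane_f32 by lane index.
    static f32 ExtractF32(lane_f32 A, u32 LaneIndex)
    {
        f32 *Values = (f32 *)&A;
        return(Values[LaneIndex]);
    }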
2:13:51 Make BRDFLookup() look up the colour values[31]
2:21:57 Begin to make BRDFLookup() index into the BRDF table[32]
2:25:26 📖 Wonder if the squaring and rooting is correct[33]
2:28:34 🖌 I = √t × R²
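Reading the reference theta_half_index(), the blackboard relationship appears to amount to square-root-spaced bins: the index works out to roughly i = floor(N × sqrt(ThetaHalf / (π/2))) with N = 90, so samples bunch up near ThetaHalf = 0, where specular highlights change fastest.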
2:30:14 Finish making BRDFLookup() index into the BRDF table[34]
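A sketch of the index computation in the style of the MERL reference reader (resolutions 90 × 90 × 360, with PhiDiff folded to [0, π)); treat the details as something to verify against BRDFRead rather than a drop-in:

    #include <math.h>

    #define PI64        3.14159265358979323846
    #define RES_THETA_H 90
    #define RES_THETA_D 90
    #define RES_PHI_D   360

    static int ThetaHalfIndex(double ThetaHalf)
    {
        if(ThetaHalf <= 0.0) {return 0;}
        // Square-root spacing: more bins near the specular direction.
        double Scaled = (ThetaHalf / (PI64/2.0)) * RES_THETA_H;
        int Index = (int)sqrt(Scaled * RES_THETA_H);
        if(Index >= RES_THETA_H) {Index = RES_THETA_H - 1;}
        return(Index);
    }

    static int ThetaDiffIndex(double ThetaDiff)
    {
        int Index = (int)(ThetaDiff / (PI64/2.0) * RES_THETA_D);
        if(Index < 0) {Index = 0;}
        if(Index >= RES_THETA_D) {Index = RES_THETA_D - 1;}
        return(Index);
    }

    static int PhiDiffIndex(double PhiDiff)
    {
        if(PhiDiff < 0.0) {PhiDiff += PI64;}   // reciprocity folds phi_diff into [0, pi)
        int Index = (int)(PhiDiff / PI64 * (RES_PHI_D/2));
        if(Index < 0) {Index = 0;}
        if(Index >= (RES_PHI_D/2)) {Index = (RES_PHI_D/2) - 1;}
        return(Index);
    }

    // Flat index into one colour channel's block:
    //   Index = PhiDiffIndex(PhiDiff)
    //         + ThetaDiffIndex(ThetaDiff) * (RES_PHI_D/2)
    //         + ThetaHalfIndex(ThetaHalf) * (RES_PHI_D/2) * RES_THETA_D;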
2:37:16 🏃 Run the program to find uninitialised Values
2:38:41 Introduce NullBRDF()
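A hypothetical sketch of what a null BRDF could look like, using the brdf_table sketch above: a one-sample table of constant grey so materials without a loaded .binary return something defined. The actual values and shape on stream may differ.

    static void NullBRDF(brdf_table *Table)
    {
        // Single (theta_half, theta_diff, phi_diff) sample, mid-grey in all channels.
        Table->Count[0] = Table->Count[1] = Table->Count[2] = 1;
        Table->Values = (double *)malloc(3*sizeof(double));
        Table->Values[0] = Table->Values[1] = Table->Values[2] = 0.5;
    }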
2:40:25 🏃 Run the program
2:40:39 🎨 Admire our test.bmp
2:41:06 🗩 Q&A
2:41:48 🗪 xxthebigfoxx Q: When computing the sphere bitangent, could you just NOZ the tangent before computing the Bitangent so you don't have to NOZ the Bitangent afterwards?
2:42:22 Save BRDFLookup() from performing NOZ() on the DiffX, and make it perform NOZ() when initialising SphereTangent to save doing it on the SphereBinormal
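The tweak boils down to relying on the fact that the cross product of two unit, perpendicular vectors is already unit length; a sketch, with names carried over from the earlier fragments (still assumptions):

    v3 SphereTangent  = NOZ(Cross(Up, SphereNormal));       // normalise once, up front
    v3 SphereBinormal = Cross(SphereNormal, SphereTangent);  // no NOZ needed
    // ... later, inside BRDFLookup():
    v3 DiffX = Cross(DiffY, HalfVector);                     // no NOZ needed here either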
2:43:28 🗪 somebody_took_my_name Q: Talking about ray and Linux and more beautiful code, there is still a GitHub issue about memory loading in SIMD
2:43:40 🗪 pythno Q: Is the BRDS essentially a function that tells us which rays within the hemisphere to use?
2:43:53 🗪 smack_ssbm Q: Do you get "uncanny valley" impressions sometimes when looking at raytraced images? If so, what strategies might you have to mitigate that?
2:44:28 🗪 pythno Q: BRDF, yes, typo, sorry
2:45:21 🗪 xxthebigfoxx Q: Will you finish this tomorrow?
2:45:31 🗪 sagian2005 Q: The sound was fine and it was synced. Your camera frame rate seems low, though
2:45:53 🗪 tocsin16 Q: Have you thought about implementing GGX for Handmade Ray?
2:46:52 🗪 tinspin Q: How much of modern skin mesh animation did you contribute to?
2:47:24 🗪 tocsin16 Q: Haha, this is what I'm talking about[35]
2:49:26 🗪 pythno Q: Alright, thanks. But what function chooses the outgoing rays to use? Because it will affect the outcome of the picture. It gets fuzzier if the rays spread in more directions
2:49:50 🖌 Equally vs BRDF weighted sampling
2:53:21 🗪 tinspin Q: Did shader weighted skin mesh animation exist before you started meddling with it? How did it evolve, and do you see any future improvements possible on it?
2:56:36 🗪 tocsin16 Q: I see, I missed the start of the stream so didn't realize the dataset was a requirement. I don't know if there is a set of measured GGX BRDFs. It is a very common BRDF used in offline pathtracing for film (and I think gaining popularity in the realtime world too). Thanks for the education streams!
2:57:14 🗪 ryanfleury Q: Was the game Messiah?
2:58:26 🗩 Wrap it up
You have arrived at the (current) end of Handmade Ray