aframe 3D glass cube parallax shader code
Code language: HTML
Category: 3D
Code description: aframe 3D glass cube parallax shader code
Code tags: aframe, 3D, glass, cube, parallax, shader, code
Below is a partial code preview. For the full code, click Download or open it in the bfwstudio WebIDE.
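The preview registers the glass-cube component but cuts off before the scene markup, so the entity names and attribute values below are assumptions rather than the original scene. A minimal usage sketch would attach the component to a box primitive and view it with the bundled orbit-controls camera:
<!-- Hypothetical usage sketch (not part of the original file) -->
<a-scene background="color: #111">
  <a-box glass-cube position="0 1.5 -3"></a-box>
  <a-entity camera orbit-controls="target: 0 1.5 -3; initialPosition: 0 1.5 2"></a-entity>
</a-scene>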
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<script>console.clear();</script>
<script type="text/javascript" src="//repo.bfw.wiki/bfwrepo/js/aframe.1.3.0.js"></script>
<script type="text/javascript" src="//repo.bfw.wiki/bfwrepo/js/aframe-orbit-controls.1.3.0.js"></script>
<style>
.tip {
position: fixed;
top: 4px;
z-index: 1;
pointer-events: none;
width: 100%;
padding: 12px;
font-family: system-ui, sans-serif;
font-size: clamp(0.75rem, 3.5vw, 1.25rem);
text-align: center;
color: #fff;
opacity: 0.25;
}
</style>
</head>
<body>
<script>
AFRAME.registerComponent('glass-cube', {
schema: {},
init() {
const mesh = this.el.getObject3D('mesh');
mesh.geometry.computeTangents();
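// Note: computeTangents() needs an indexed BufferGeometry with position, normal and uv
// attributes; the box primitive's BoxGeometry provides all of these.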
mesh.material = this.generateMaterial();
},
generateMaterial() {
const colorTex = new THREE.TextureLoader().load('//repo.bfw.wiki/bfwrepo/images/cube/colored_squares.png');
const normalTex = new THREE.TextureLoader().load('//repo.bfw.wiki/bfwrepo/images/cube/glass_frosted_normal_tex.jpg');
// const normalTex = new THREE.TextureLoader().load('//repo.bfw.wiki/bfwrepo/images/cube/tiles_normal_map.jpg');
return new THREE.ShaderMaterial({
uniforms: {
u_colorTex: { type: 't', value: colorTex },
u_normalTex: { type: 't', value: normalTex }
},
// I learned a lot about normal/parallax mapping from this article + demo:
// https://apoorvaj.io/exploring-bump-mapping-with-webgl/
vertexShader: `
// THREE doesn't seem to add the "tangent" attribute automatically when we compute the tangents.
// The attribute data is there, we just need to declare it here.
attribute vec4 tangent;
varying vec3 v_normal;
varying vec2 v_uv;
varying vec3 v_fragPos;
varying vec3 v_viewPos;
varying vec3 v_lightPos;
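// A hand-rolled transpose, kept commented out for reference (an illustrative sketch, not
// from the original file - the built-in transpose() used below is what actually runs):
// mat3 transposeMat3(mat3 m) {
//     return mat3(
//         vec3(m[0].x, m[1].x, m[2].x),
//         vec3(m[0].y, m[1].y, m[2].y),
//         vec3(m[0].z, m[1].z, m[2].z)
//     );
// }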
void main() {
// Compute the bitangent at runtime - it's not too expensive per-vertex.
// Often this is another vertex attribute, but "BufferGeometry.computeTangents()" doesn't store it as an attribute.
// For now, I didn't feel like figuring out the best way to compute these on the CPU, but it could be done :)
vec3 bitangent = cross(normal, tangent.xyz);
// Now that we have all three tangent space basis vectors, transform them to align with the model transform.
vec3 t = normalize(normalMatrix * tangent.xyz);
vec3 b = normalize(normalMatrix * bitangent);
vec3 n = normalize(normalMatrix * normal);
// Finally, generate a 3x3 matrix that converts from world space to tangent space.
// Apparently the base matrix goes from tangent space to world space, and to go the other direction
// we have to take the inverse of it. However, since we don't have any shearing/skewing happening,
// an equivalent operation is the transpose, which is faster. If this doesn't work for a future project,
// just be aware that what we really need might be an inverse 3x3 matrix function.
// I didn't know there is a built-in transpose function, but we get an error if we define our own.
// Maybe threejs is providing this? If there are ever any errors porting this code (e.g. to regl),
// we may need to define a custom transpose() - I have an implementation commented up above.
mat3 tbn = transpose(mat3(t, b, n));
// The following code is designed to work in world space.
// However it turns out the normalMatrix three.js provides is based on the modelViewMatrix, not just the modelMatrix.
// So, we actually need to convert everything into view space, not world space, to be compatible with the given normalMatrix.
// If I ever build a similar system myself, I may use a world-relative normal matrix and world space coordinates.
// ------------------------------------------------------
// vec3 vertPositionWorld = (modelMatrix * vec4(position, 1.0)).xyz;
// vec3 lightPosWorld = vec3(1.0, 4.0, 0.0);
// v_uv = uv;
// v_fragPos = tbn * vertPositionWorld;
// v_viewPos = tbn * cameraPosition;
// v_lightPos = tbn * lightPosWorld;
// FIXED VERSION CONVERTING EVERYTHING TO VIEW SPACE
// ------------------------------------------------------
vec3 vertPositionView = (modelViewMatrix * vec4(position, 1.0)).xyz;
vec3 lightPosWorld = vec3(1.0, 4.0, 0.0);
vec3 lightPosView = (viewMatrix * vec4(lightPosWorld, 1.0)).xyz;
// Local UV coordinates should not be modified.
v_uv = uv;
// All world space values must be transformed into tangent space.
v_fragPos = tbn * vertPositionView;
v_viewPos = vec3(0.0, 0.0, 0.0); // in view space, camera is always at (0, 0, 0)
v_lightPos = tbn * lightPosView;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
`,
fragmentShader: `
uniform sampler2D u_colorTex;
uniform sampler2D u_normalTex;
varying vec2 v_uv;
varying vec3 v_fragPos;
varying vec3 v_viewPos;
varying vec3 v_lightPos;
/**
* Lighting contribution of a single point light source via Phong illumination.
*
* The vec3 returned is the RGB color of the light's contribution.
*
* k_d: Diffuse color
* k_s: Specular color
* alpha: Shininess coefficient
* p: position of point being lit
* n: normal of point being lit
* eye: the position of the camera
* lightPos: the position of the light
* lightIntensity: color/intensity of the light
*
* See https://en.wikipedia.org/wiki/Phong_reflection_model#Description
*/
vec3 phongContribForLight(vec3 k_d, vec3 k_s, float alpha, vec3 p, vec3 n, vec3 eye, vec3 lightPos, vec3 lightIntensity) {
vec3 N = n;
vec3 L = normalize(lightPos - p);
vec3 V = normalize(eye - p);
vec3 R = normalize(reflect(-L, N));
float dotLN = dot(L, N);
float dotRV = dot(R, V);
if (dotLN < 0.0) {
// Light not visible from this point on the surface
return vec3(0.0, 0.0, 0.0);
}
// Clamp added to prevent harsh lighting transitions at high reflection angles
dotLN = clamp(dotLN, 0.0, 1.0);
if (dotRV < 0.0) {
// Light reflection in opposite direction as viewer, apply only diffuse component
return lightIntensity * (k_d * dotLN);
}
return lightIntensity * (k_d * dotLN + k_s * pow(dotRV, alpha));
}
void main() {
vec3 viewDir = normalize(v_viewPos - v_fragPos);
vec3 lightDir = normalize(v_lightPos - v_fragPos);
vec2 uv = v_uv;
// For the bumpy surface normal, sample our normal map.
// This uses the standard surface UV coordinates, because the t.........
(Preview truncated. To view the full code, log in and click the Download button above, or open it in the bfwstudio WebIDE.)
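The preview ends partway through the fragment shader, just as the parallax sampling begins. Purely as a rough, non-authoritative illustration of the parallax-offset technique covered in the apoorvaj.io article linked above (the function name, height-map source, and scale factor here are assumptions, not the original code), the tangent-space view direction is typically used to offset the UVs before sampling the textures:
// Hypothetical sketch of basic parallax-offset UV sampling (not the original code).
// uv: surface UV; viewDir: normalized tangent-space view direction;
// heightTex/heightScale: assumed height source and tuning value.
vec2 parallaxUV(vec2 uv, vec3 viewDir, sampler2D heightTex, float heightScale) {
float height = texture2D(heightTex, uv).r;
vec2 offset = viewDir.xy / viewDir.z * (height * heightScale);
return uv - offset;
}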