Music Visualizations in VR Using Web Audio API
In this tutorial, we’ll learn how to create music visualizations in WebVR. We’ll use the Web Audio API to load and analyze the audio, and Three.js’s BoxGeometry to create the bars for the visualization.
This is part of the #DaysInVR series. View All VR Projects. Yesterday we learned how to use objects and lights to create a shopping experience in VR. Today, we’ll create a music visualization in VR.
Loading audio using the Web Audio API in WebVR
We’ll use Three.js’s FileLoader to load the audio file as an ArrayBuffer. Once the file is loaded, we’ll use an audio analyser to get the frequency data of the audio. We’ll use these values later to animate the bars based on the audio.
```javascript
var audioContext = new AudioContext();
var analyser = audioContext.createAnalyser();
var sourceBuffer;

function setupAudioProcessing() {
  // create the script processor node
  var javascriptNode = audioContext.createScriptProcessor(2048, 1, 1);
  javascriptNode.connect(audioContext.destination);

  // create the source buffer
  sourceBuffer = audioContext.createBufferSource();
  analyser.smoothingTimeConstant = 0.3;
  analyser.fftSize = 512;

  // connect the source to the analyser
  sourceBuffer.connect(analyser);
  sourceBuffer.loop = true;

  // connect the analyser to the script processor
  analyser.connect(javascriptNode);

  // connect the source to the speakers
  sourceBuffer.connect(audioContext.destination);

  // load the audio file as an ArrayBuffer
  var fileLoader = new THREE.FileLoader();
  fileLoader.setResponseType('arraybuffer');
  fileLoader.load('/projects/day15/dance.mp3', function (buffer) {
    start(buffer);
  });
}
```
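The script processor’s `onaudioprocess` callback isn’t shown in this excerpt; it would typically read the analyser on each audio frame. The averaging step can be sketched as a small pure function (`averageVolume` is a name introduced here for illustration, not from the original tutorial):

```javascript
// Average the byte frequency bins (each 0–255) into one loudness value.
function averageVolume(frequencyData) {
  var sum = 0;
  for (var i = 0; i < frequencyData.length; i++) {
    sum += frequencyData[i];
  }
  return sum / frequencyData.length;
}

// In the browser this would be wired up roughly as:
// javascriptNode.onaudioprocess = function () {
//   var array = new Uint8Array(analyser.frequencyBinCount);
//   analyser.getByteFrequencyData(array);
//   var volume = averageVolume(array);
// };
```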
```javascript
function start(buffer) {
  audioContext.decodeAudioData(buffer, decodeAudioDataSuccess, decodeAudioDataFailed);

  function decodeAudioDataSuccess(decodedBuffer) {
    sourceBuffer.buffer = decodedBuffer;
    sourceBuffer.start(0);
  }

  function decodeAudioDataFailed(error) {
    console.error('Failed to decode the audio file', error);
  }
}
```
Adding bars for music visualization
We’ll create bars using Three.js’s BoxGeometry, and we’ll use the frequency information from the audio analyser we created earlier to drive the visualization.
```javascript
for (var i = 0; i < numberOfBars; i++) {
  // create a bar
  var barGeometry = new THREE.BoxGeometry(0.5, 0.5, 0.5);

  // create a material with a random color
  var material = new THREE.MeshPhongMaterial({
    color: getRandomColor(),
    specular: 0xffffff
  });

  // create the mesh and set its initial position
  bars[i] = new THREE.Mesh(barGeometry, material);
  bars[i].position.set(i - numberOfBars / 2, 0, 0);

  // add the created bar to the scene
  scene.add(bars[i]);
}
```
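The `getRandomColor()` helper isn’t shown in this excerpt. A minimal sketch, assuming it should return a color Three.js can accept as the `color` property:

```javascript
// Hypothetical helper (the original tutorial's version is not shown):
// returns a random 24-bit color as a number, e.g. 0x3fa2c4,
// which THREE.MeshPhongMaterial accepts for its `color` property.
function getRandomColor() {
  return Math.floor(Math.random() * 0x1000000);
}
```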
Now, to complete the visualization, we need to add the following to our animate() function. Based on the frequency data, each bar will change its size on every frame.
```javascript
// read the current frequency data from the analyser
var array = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(array);
var step = Math.round(array.length / numberOfBars);

// iterate through the bars and scale the z axis
for (var i = 0; i < numberOfBars; i++) {
  var value = array[i * step] / 4;
  value = value < 1 ? 1 : value;
  bars[i].scale.z = value;
}
```
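The per-bar scaling logic can also be pulled out into a small pure function, which makes the divide-by-4 and clamp-to-1 behavior easy to verify in isolation (`barScales` is a name introduced here for illustration):

```javascript
// Compute a z-scale per bar from one frame of byte frequency data.
// Each bar samples one bin (every `step` bins), divides by 4 to tame
// the 0–255 range, and clamps to a minimum of 1 so bars never vanish.
function barScales(frequencyData, numberOfBars) {
  var step = Math.round(frequencyData.length / numberOfBars);
  var scales = [];
  for (var i = 0; i < numberOfBars; i++) {
    var value = frequencyData[i * step] / 4;
    scales.push(value < 1 ? 1 : value);
  }
  return scales;
}
```

Inside animate(), the loop body would then reduce to `bars[i].scale.z = scales[i]`.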
It’s your turn
Create your own version of a music visualization in VR. Try changing the parameters and see what you can come up with. Don’t forget to share your experience in the comments below.