Video collaborative livecoding: Three.js and Interactive Shader Format (Mar 29 2022)

Big video update today: you can now collaboratively livecode 3D scenes in three.js, then process them with shaders written in ISF.

To get started, create a new session. If you want to live-code collaboratively with someone else, you will need to sign up so that your session runs on a live server and not just in your own browser.

We will add video WAMs to Track 1. First, double-click in the first scene to create a clip in Track 1, then be sure to start it by pressing the clip’s “play” button.

Then click once on the clip to go to the track page.

The three video devices are currently under “Add Effect”. They are:

  • Video Input, which streams your default camera
  • ThreeJS Scene, which renders a three.js scene that is coded as part of the patch
  • ISF Shader, which loads a shader in ISF (Interactive Shader Format). This is basically a GLSL shader with parameters. See https://isf.video for more details and many examples you can use.
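To give a flavor of what ISF looks like, here is a minimal filter: a JSON metadata comment declaring the inputs, followed by GLSL. The `inputImage` input and the `IMG_THIS_PIXEL` function are standard parts of the ISF spec; the `brightness` parameter is just an illustrative example, not something the plugin requires.

```glsl
/*{
  "DESCRIPTION": "Simple brightness filter",
  "CATEGORIES": ["Color Adjustment"],
  "INPUTS": [
    { "NAME": "inputImage", "TYPE": "image" },
    { "NAME": "brightness", "TYPE": "float",
      "DEFAULT": 1.0, "MIN": 0.0, "MAX": 2.0 }
  ]
}*/

void main() {
    // Sample the incoming frame at the current pixel,
    // then scale its color by the "brightness" knob.
    vec4 color = IMG_THIS_PIXEL(inputImage);
    gl_FragColor = vec4(color.rgb * brightness, color.a);
}
```

Each entry in `INPUTS` becomes a parameter the host exposes as a knob.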

So for example, add a ThreeJS Scene and an ISF Shader, and you should see the distorted cube at the bottom of the track page.

If you are using a server session and want to add other people, first adjust the settings under the ‘Share’ menu:

[screenshot: Share menu permission settings]

The STATE permission lets your friends turn knobs and edit the code, but they will not be able to add or remove any plugins from the session.

Then copy and send them the URL to the session. They should be able to join, even without an account.

Load Presets
In the near future there will be a selection of cool video patches available as presets. Just right-click on the plugin’s header and select “load preset”.

The ISF plugin supports any shader in the ISF format. There are hundreds of free ones available from here: https://editor.isf.video

Editing The Scene
If you aren’t a programmer, skip this part 🙂

Press the ‘CODE’ button on the ThreeJS Scene, and you should see a text editor with some javascript that defines the current video scene. There are currently three methods:

  • parameters — returns a list of parameters. These power the knobs on the GUI and let the host automate or modulate the controls.
  • initialize — use this for one-time initialization code, such as creating the scene objects.
  • render — called once per frame. Somewhere in here you should call renderer.render to actually render the three.js scene.
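To see how the three methods fit together, here is a minimal sketch of a patch. The method names come from the list above; the parameter schema, the render arguments, and the `renderer`/`THREE` globals are my assumptions about what the host provides, so treat this as a shape, not the exact API.

```javascript
// Hypothetical parameter schema: each entry powers a knob the host
// can automate or modulate. Field names here are illustrative.
function parameters() {
  return [
    { name: "speed", min: 0, max: 10, default: 1 },
  ];
}

let scene, camera, cube;

// One-time setup: build the scene graph.
function initialize() {
  scene = new THREE.Scene();
  camera = new THREE.PerspectiveCamera(75, 640 / 480, 0.1, 1000);
  camera.position.z = 3;
  cube = new THREE.Mesh(
    new THREE.BoxGeometry(1, 1, 1),
    new THREE.MeshNormalMaterial()
  );
  scene.add(cube);
}

// Called once per frame: spin the cube by the "speed" knob,
// then hand the scene to the host's renderer.
function render(time, params) {
  cube.rotation.y = time * 0.001 * params.speed;
  renderer.render(scene, camera);
}
```

The parameters declared in `parameters()` are what the host surfaces as automatable knobs, so anything you want to modulate from the timeline should be listed there rather than hard-coded in `render`.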

What’s Next

  • I’m moving the video output off of the single track and onto the entire session. Each track’s video will be merged with the previous track’s output, so you will be able to build the final video in layers. You will also be able to control the output video size rather than being fixed at 640x480.

  • Right now when loading presets, you only have access to your own presets, but I’m working on global browsing of all public projects.

  • The Monaco editor in the livecoders does not yet autocomplete the scene object’s types or the three.js library.

If you run into trouble or encounter bugs, please reply here. Thanks!