new VideoContext(canvas[, initErrorCallback][, options])
Initialise the VideoContext and render to the specified canvas. An optional second parameter can be passed to the constructor: a function that gets called if the VideoContext fails to initialise.
Parameters:

| Name | Type | Attributes | Description |
|---|---|---|---|
| canvas | Canvas | | the canvas element to render the output to. |
| initErrorCallback | function | <optional> | a callback for if initialising the canvas failed. |
| options | Object | <optional> | a number of custom options which can be set on the VideoContext; generally best left as default. |
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement, () => console.error("Sorry, your browser doesn't support WebGL"));
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
Members
currentTime
Set the progress through the internal timeline. Setting this can be used as a way to implement a scrubbable timeline.
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(20);
ctx.currentTime = 10; // seek 10 seconds in
ctx.play();
currentTime
Get how far through the internal timeline playback has progressed.
Getting this value gives the current playhead position, which can be used for updating timelines.
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
setTimeout(() => console.log(ctx.currentTime),1000); //should print roughly 1.0
destination
Get the final node in the render graph which represents the canvas to display content on to.
This property is read-only and there can only ever be one destination node. Other nodes can connect to this but you cannot connect this node to anything.
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.start(0);
videoNode.stop(10);
videoNode.connect(ctx.destination);
duration
Get the time at which the last node in the current internal timeline finishes playing.
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
console.log(ctx.duration); //prints 0
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
console.log(ctx.duration); //prints 10
ctx.play();
element
Get the canvas that the VideoContext is using.
id
Returns an ID assigned to the VideoContext instance. This will either be the same id as the underlying canvas element, or a uniquely generated one.
id
Set the ID of the VideoContext instance. This should be unique.
playbackRate
Set the playback rate of the VideoContext instance. This will alter the playback speed of all media elements played through the VideoContext.
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.start(0);
videoNode.stop(10);
videoNode.connect(ctx.destination);
ctx.playbackRate = 2;
ctx.play(); // Double playback rate means this will finish playing in 5 seconds.
playbackRate
Return the current playbackRate of the video context.
state
Get the current state.
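The value can be compared against the constants documented under the STATE type definition below. A minimal sketch:

```javascript
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
// Check the current state against the VideoContext.STATE constants.
if (ctx.state === VideoContext.STATE.PAUSED) {
    ctx.play();
}
```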
volume
Set the volume of all MediaNodes created in the VideoContext.
volume
Return the current volume of the video context.
Methods
audio(src[, sourceOffset][, preloadTime][, imageElementAttributes]) → {AudioNode}
Create a new node representing an audio source
Parameters:

| Name | Type | Attributes | Default | Description |
|---|---|---|---|---|
| src | string \| HTMLAudioElement \| MediaStream | | | The url or audio element to create the audio node from. |
| sourceOffset | number | <optional> | 0 | Offset into the start of the source audio to start playing from. |
| preloadTime | number | <optional> | 4 | How many seconds before a node is to be displayed to attempt to load it. |
| imageElementAttributes | Object | <optional> | | Any attributes to be given to the underlying image element. |
Returns:
A new audio node.
- Type
- AudioNode
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var audioNode = ctx.audio("ziggystardust.mp3");
canvas(src) → {CanvasNode}
Create a new node representing a canvas source
Parameters:
| Name | Type | Description |
|---|---|---|
| src | Canvas | The canvas element to create the canvas node from. |
Returns:
A new canvas node.
- Type
- CanvasNode
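A usage sketch in the style of the other examples (the off-screen source canvas is illustrative):

```javascript
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
// An off-screen canvas used as a source (whatever is drawn to it will be rendered).
var sourceCanvas = document.createElement("canvas");
var canvasNode = ctx.canvas(sourceCanvas);
canvasNode.connect(ctx.destination);
canvasNode.start(0);
canvasNode.stop(10);
ctx.play();
```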
compositor(definition) → {CompositingNode}
Create a new compositing node.
Compositing nodes are used for operations such as combining multiple video sources into a single track/connection for further processing in the graph.
A compositing node is slightly different to other processing nodes in that it only has one input in its definition but can have unlimited connections made to it. The shader in the definition is run for each input in turn, drawing them to the output buffer. This means there can be no interaction between the separate inputs to a compositing node, as they are individually processed in separate shader passes.
Parameters:
| Name | Type | Description |
|---|---|---|
| definition | Object | this is an object defining the shaders, inputs, and properties of the compositing node to create. Builtin definitions can be found by accessing VideoContext.DEFINITIONS |
Returns:
A new compositing node created from the passed definition.
- Type
- CompositingNode
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
//A simple compositing node definition which just renders all the inputs to the output buffer.
var combineDefinition = {
vertexShader : "\
attribute vec2 a_position;\
attribute vec2 a_texCoord;\
varying vec2 v_texCoord;\
void main() {\
gl_Position = vec4(vec2(2.0,2.0)*a_position-vec2(1.0, 1.0), 0.0, 1.0);\
v_texCoord = a_texCoord;\
}",
fragmentShader : "\
precision mediump float;\
uniform sampler2D u_image;\
uniform float a;\
varying vec2 v_texCoord;\
varying float v_progress;\
void main(){\
vec4 color = texture2D(u_image, v_texCoord);\
gl_FragColor = color;\
}",
properties:{
"a":{type:"uniform", value:0.0},
},
inputs:["u_image"]
};
//Create the node, passing in the definition.
var trackNode = ctx.compositor(combineDefinition);
//create two videos which will play back to back
var videoNode1 = ctx.video("video1.mp4");
videoNode1.start(0);
videoNode1.stop(10);
var videoNode2 = ctx.video("video2.mp4");
videoNode2.start(10);
videoNode2.stop(20);
//Connect the nodes to the combine node. This will give a single connection representing the two videos which can
//be connected to other effects such as LUTs, chromakeyers, etc.
videoNode1.connect(trackNode);
videoNode2.connect(trackNode);
//Don't do anything exciting, just connect it to the output.
trackNode.connect(ctx.destination);
createCanvasSourceNode()
- Deprecated: Yes

createCompositingNode()

createEffectNode()
- Deprecated: Yes

createImageSourceNode()
- Deprecated: Yes

createTransitionNode()
- Deprecated: Yes

createVideoSourceNode()
- Deprecated: Yes
customSourceNode(CustomSourceNode, src, …options)
Instantiate a custom-built source node.
Parameters:

| Name | Type | Attributes | Description |
|---|---|---|---|
| CustomSourceNode | SourceNode | | |
| src | Object | | |
| options | any | <repeatable> | |
effect(definition) → {EffectNode}
Create a new effect node.
Parameters:

| Name | Type | Description |
|---|---|---|
| definition | Object | this is an object defining the shaders, inputs, and properties of the effect node to create. Builtin definitions can be found by accessing VideoContext.DEFINITIONS. |
Returns:
A new effect node created from the passed definition
- Type
- EffectNode
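A sketch of creating an effect node from a built-in definition (assuming a MONOCHROME definition is present in VideoContext.DEFINITIONS; check the definitions object in your version of the library):

```javascript
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
// Place the effect node between the source and the destination.
var monochromeEffect = ctx.effect(VideoContext.DEFINITIONS.MONOCHROME);
videoNode.connect(monochromeEffect);
monochromeEffect.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
```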
image(src[, preloadTime][, imageElementAttributes]) → {ImageNode}
Create a new node representing an image source
Parameters:

| Name | Type | Attributes | Default | Description |
|---|---|---|---|---|
| src | string \| Image \| ImageBitmap | | | The url or image element to create the image node from. |
| preloadTime | number | <optional> | 4 | How many seconds before a node is to be displayed to attempt to load it. |
| imageElementAttributes | Object | <optional> | | Any attributes to be given to the underlying image element. |
Returns:
A new image node.
- Type
- ImageNode
Examples
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var imageNode = ctx.image("image.png");
var canvasElement = document.getElementById("canvas");
var imageElement = document.getElementById("image");
var ctx = new VideoContext(canvasElement);
var imageNode = ctx.image(imageElement);
pause()
Pause playback of the VideoContext
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(20);
ctx.currentTime = 10; // seek 10 seconds in
ctx.play();
setTimeout(() => ctx.pause(), 1000); //pause playback after roughly one second.
play()
Start the VideoContext playing
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
registerCallback(type, func)
Register a callback to listen to one of the events defined in VideoContext.EVENTS
Parameters:
| Name | Type | Description |
|---|---|---|
| type | String | the event to register against. |
| func | function | the callback to register. |
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
ctx.registerCallback(VideoContext.EVENTS.STALLED, () => console.log("Playback stalled"));
ctx.registerCallback(VideoContext.EVENTS.UPDATE, () => console.log("new frame"));
ctx.registerCallback(VideoContext.EVENTS.ENDED, () => console.log("Playback ended"));
registerTimelineCallback(time, func, ordering)
Register a callback to happen at a specific point in time.
Parameters:
| Name | Type | Default | Description |
|---|---|---|---|
| time | number | | the time at which to trigger the callback. |
| func | function | | the callback to register. |
| ordering | number | 0 | the order in which to call the callbacks if more than one is registered for the same time. |
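A sketch of triggering a callback five seconds into playback:

```javascript
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
// Fire once when the playhead reaches 5 seconds.
ctx.registerTimelineCallback(5, () => console.log("5 seconds into playback"));
ctx.play();
```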
reset()
Destroy all nodes in the graph and reset the timeline. After calling this any created nodes will be unusable.
snapshot()
Get a JS Object containing the state of the VideoContext instance and all the created nodes.
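A minimal sketch; the exact shape of the returned object is not specified here, so inspect it in your version of the library:

```javascript
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
// Inspect the current graph and timeline state.
console.log(ctx.snapshot());
```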
transition(definition) → {TransitionNode}
Create a new transition node.
Transition nodes are a type of effect node which have parameters that can be changed as events on the timeline.
For example, a transition node which cross-fades between two videos could have a "mix" property which sets the progress through the transition. Rather than having to write your own code to adjust this property at specific points in time, a transition node has a "transition" function which takes a startTime, stopTime, targetValue, and a propertyName (which will be "mix"). This will linearly interpolate the property from the current value to targetValue between the startTime and stopTime.
Parameters:
| Name | Type | Description |
|---|---|---|
| definition | Object | this is an object defining the shaders, inputs, and properties of the transition node to create. |
Returns:
A new transition node created from the passed definition.
- Type
- TransitionNode
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
//A simple cross-fade node definition which cross-fades between two videos based on the mix property.
var crossfadeDefinition = {
vertexShader : "\
attribute vec2 a_position;\
attribute vec2 a_texCoord;\
varying vec2 v_texCoord;\
void main() {\
gl_Position = vec4(vec2(2.0,2.0)*a_position-vec2(1.0, 1.0), 0.0, 1.0);\
v_texCoord = a_texCoord;\
}",
fragmentShader : "\
precision mediump float;\
uniform sampler2D u_image_a;\
uniform sampler2D u_image_b;\
uniform float mix;\
varying vec2 v_texCoord;\
varying float v_mix;\
void main(){\
vec4 color_a = texture2D(u_image_a, v_texCoord);\
vec4 color_b = texture2D(u_image_b, v_texCoord);\
color_a[0] *= mix;\
color_a[1] *= mix;\
color_a[2] *= mix;\
color_a[3] *= mix;\
color_b[0] *= (1.0 - mix);\
color_b[1] *= (1.0 - mix);\
color_b[2] *= (1.0 - mix);\
color_b[3] *= (1.0 - mix);\
gl_FragColor = color_a + color_b;\
}",
properties:{
"mix":{type:"uniform", value:0.0},
},
inputs:["u_image_a","u_image_b"]
};
//Create the node, passing in the definition.
var transitionNode = ctx.transition(crossfadeDefinition);
//create two videos which will overlap by two seconds
var videoNode1 = ctx.video("video1.mp4");
videoNode1.start(0);
videoNode1.stop(10);
var videoNode2 = ctx.video("video2.mp4");
videoNode2.start(8);
videoNode2.stop(18);
//Connect the nodes to the transition node.
videoNode1.connect(transitionNode);
videoNode2.connect(transitionNode);
//Set-up a transition which happens at the crossover point of the playback of the two videos
transitionNode.transition(8,10,1.0,"mix");
//Connect the transition node to the output
transitionNode.connect(ctx.destination);
//start playback
ctx.play();
unregisterCallback(func)
Remove a previously registered callback
Parameters:
| Name | Type | Description |
|---|---|---|
| func | function | the callback to remove. |
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
//the callback
var updateCallback = () => console.log("new frame");
//register the callback
ctx.registerCallback(VideoContext.EVENTS.UPDATE, updateCallback);
//then unregister it
ctx.unregisterCallback(updateCallback);
unregisterTimelineCallback(func)
Unregister a callback which happens at a specific point in time.
Parameters:
| Name | Type | Description |
|---|---|---|
| func | function | the callback to unregister. |
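A sketch pairing registration and unregistration; the callback must be the same function reference that was registered:

```javascript
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
//the callback
var timelineCallback = () => console.log("10 seconds into playback");
//register the callback to fire at 10 seconds
ctx.registerTimelineCallback(10, timelineCallback);
//then unregister it
ctx.unregisterTimelineCallback(timelineCallback);
```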
update(dt)
This allows manual calling of the update loop of the VideoContext, for use when the "manualUpdate" option is set.
Parameters:
| Name | Type | Description |
|---|---|---|
| dt | Number | The difference in seconds between this and the previous call to update. |
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement, undefined, {"manualUpdate" : true});
var previousTime;
function update(time){
if (previousTime === undefined) previousTime = time;
var dt = (time - previousTime)/1000;
ctx.update(dt);
previousTime = time;
requestAnimationFrame(update);
}
requestAnimationFrame(update);
video(src[, sourceOffset][, preloadTime][, videoElementAttributes]) → {VideoNode}
Create a new node representing a video source
Parameters:

| Name | Type | Attributes | Default | Description |
|---|---|---|---|---|
| src | string \| HTMLVideoElement \| MediaStream | | | The URL or video element to create the video node from. |
| sourceOffset | number | <optional> | 0 | Offset into the start of the source video to start playing from. |
| preloadTime | number | <optional> | 4 | How many seconds before the video is to be played to start loading it. |
| videoElementAttributes | Object | <optional> | | A dictionary of attributes to map onto the underlying video element. |
Returns:
A new video node.
- Type
- VideoNode
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("bigbuckbunny.mp4");
Type Definitions
STATE
Video Context States
Type:
- Object
Properties:

| Name | Type | Description |
|---|---|---|
| STATE.PLAYING | number | All sources are active |
| STATE.PAUSED | number | All sources are paused |
| STATE.STALLED | number | One or more sources is unable to play |
| STATE.ENDED | number | All sources have finished playing |
| STATE.BROKEN | number | The render graph is in a broken state |
EVENTS
Video Context Events
Type:
- Object
Properties:

| Name | Type | Description |
|---|---|---|
| EVENTS.UPDATE | string | Called any time a frame is rendered to the screen. |
| EVENTS.STALLED | string | Called any time playback is stopped due to buffer starvation for playing assets. |
| EVENTS.ENDED | string | Called once playback has finished (i.e. ctx.currentTime == ctx.duration). |
| EVENTS.CONTENT | string | Called at the start of a time region where there is content playing out of one or more sourceNodes. |
| EVENTS.NOCONTENT | string | Called at the start of any time region where the VideoContext is still playing, but there are currently no active playing sources. |