Posted by Admin on October 20, 2020
WebCam Live Video Streaming with WebSockets.
In this blog I will show you how to do live video streaming using WebSockets, with the WebRTC specification on the front end and GlassFish as the server-side implementation of JSR 356.
Introduction to WebSockets in Java EE.
WebSocket is a standard web technology that removes much of the complexity of bidirectional communication and connection management between clients and a server. It maintains a persistent connection over which the client and the server can read and write messages at the same time. WebSocket was introduced as part of the HTML5 initiative, which defines a JavaScript interface for browsers to send and receive data over a WebSocket.
Here are some of the benefits of WebSocket:
Full duplex client-server communication.
Integration with other Java EE technologies. You can inject objects and Enterprise JavaBeans (EJB) by using components such as Contexts and Dependency Injection (CDI); see the sketch after this list.
Low-level communication (works on the underlying TCP/IP connection).
Low latency communication.
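As a quick sketch of that CDI integration (the EchoService bean and the /echo endpoint below are hypothetical examples, not part of this post's application):

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.websocket.OnMessage;
import javax.websocket.server.ServerEndpoint;

// A hypothetical CDI-managed bean.
@ApplicationScoped
class EchoService {
    String process(String message) {
        return "echo: " + message;
    }
}

// The container injects the CDI bean into the WebSocket endpoint.
@ServerEndpoint("/echo")
public class EchoEndpoint {

    @Inject
    private EchoService service;

    @OnMessage
    public String onMessage(String message) {
        // The returned string is sent back to the client as a text message.
        return service.process(message);
    }
}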
Tech Stack used in the implementation:
Front End:
WebRTC: The specification page says: "These APIs should enable building applications that can be run inside a browser, requiring no extra downloads or plugins, which allow communication between parties using audio, video and supplementary real-time communication, without having to use intervening servers (unless needed for firewall traversal, or for providing intermediary services)."
WebSockets: To enable web applications to maintain bidirectional communication with server-side processes, this specification introduces the WebSocket interface.
Canvas: Canvas is an HTML5 feature used to draw graphics on the fly on a web page.
Back End:
GlassFish: GlassFish provides the server-side implementation of JSR 356 (the Java API for WebSocket).
Front End Development:
Step 1: Enable the media stream in Chrome for WebRTC and access the webcam.
Let's start. The first thing to check is that we have the latest version of Chrome, or at least a version that supports the WebRTC specification.
Pre-check: If your Chrome version is 18 or lower, you need to enable the media stream in Chrome for WebRTC. If you are using a later version, you can skip this step.
To enable the media stream in Chrome, open "chrome://flags/" in a new tab and enable the MediaStream feature:
Access your webcam:
After doing the above steps (if required), we can access the webcam using the browser's WebRTC feature. All we need to do is add the HTML and JavaScript below.
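The post originally showed this snippet as an image; here is a minimal sketch matching the description that follows (the element id "live" and the 320x240 size are assumptions):

<video id="live" width="320" height="240" autoplay></video>
<script type="text/javascript">
    var video = document.getElementById("live");
    // Request access to the webcam (Chrome-specific webkit prefix, as described below).
    navigator.webkitGetUserMedia({ video: true },
        function (stream) {
            // Convert the stream to a URL and attach it to the video element.
            video.src = window.webkitURL.createObjectURL(stream);
        },
        function (error) {
            console.log("Failed to access the webcam: " + error);
        });
</script>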
With the above small piece of HTML and JavaScript we can access the user's webcam and show the stream in the HTML5 video element. We do this by first requesting access to the webcam using the getUserMedia function (prefixed with the Chrome-specific webkit prefix). In the callback we pass in, we get access to a stream object: the stream from the user's webcam. To show this stream we need to attach it to the video element. The src attribute of the video element allows us to specify a URL to play, and with another new HTML5 feature we can convert the stream to a URL. This is done with the URL.createObjectURL function (once again prefixed). The result is a URL that we attach to the video element. That is all it takes to get access to the stream of a user's webcam.
Step 2: Send the stream to the GlassFish server over WebSockets.
In this step we want to take the data from the stream and send it as binary data over a WebSocket to the GlassFish server. In theory this sounds simple: we've got a binary stream of video information, so we should be able to just access the bytes and, instead of streaming the data to the video element, stream it over a WebSocket to our remote server. In practice, though, this doesn't work. The stream object we receive from the getUserMedia call doesn't have an option to access its data as a stream; or better said, this feature is not yet available. So we need to find an alternative. As of now, I can think of just one option:
Take a snapshot of the current video.
Paint this to the canvas element.
Grab the data from the canvas as an image.
Send the image data over websockets.
To execute the above steps we need a little workaround, which causes a lot of extra processing on the client side, since many calls will happen and a lot of data will be sent to and from the server. To implement it we write the code below.
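The post originally showed this code as an image; below is a minimal sketch based on the surrounding description (the element ids "live" and "canvas", and the jQuery wrapper around the canvas, are assumptions chosen to match the later snippets):

var video = document.getElementById("live");
var canvas = $("#canvas");                  // jQuery-wrapped canvas element
var ctx = canvas.get()[0].getContext("2d");
// Every 250 milliseconds, copy the current video frame onto the canvas.
var timer = setInterval(
    function () {
        ctx.drawImage(video, 0, 0, 320, 240);
    }, 250);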
OK, so in the above code we have added a timer, which runs every 250 milliseconds and paints the canvas with the image taken from the video element.
Obviously you will see a delay on the canvas; you can tune this by decreasing the timer interval, but that requires more resources and we will hit the server more frequently over the WebSocket.
Next step: capture the image from the canvas in binary format and then send the binary data over the WebSocket. To capture the image data, we extend the timer function with the code below so that we have the image data in binary format.
timer = setInterval(
    function () {
        // Paint the current video frame onto the canvas.
        ctx.drawImage(video, 0, 0, 320, 240);
        // Grab the canvas content as a base64-encoded JPEG data URI.
        var data = canvas.get()[0].toDataURL('image/jpeg', 1.0);
        // Convert the data URI into a binary Blob.
        var newblob = convertToBinary(data);
    }, 250);
function convertToBinary(dataURI) {
    // Convert base64 to raw binary data held in a string
    // (doesn't handle URL-encoded data URIs).
    var byteString = atob(dataURI.split(',')[1]);
    // Separate out the MIME component (e.g. "image/jpeg").
    var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
    // Write the bytes of the string to an ArrayBuffer.
    var ab = new ArrayBuffer(byteString.length);
    var ia = new Uint8Array(ab);
    for (var i = 0; i < byteString.length; i++) {
        ia[i] = byteString.charCodeAt(i);
    }
    // Wrap the ArrayBuffer in a Blob, tagged with its MIME type, and we're done.
    var bb = new Blob([ab], { type: mimeString });
    return bb;
}
The toDataURL call above copies the content of the current canvas and returns it as a data URI: a string containing base64-encoded binary data. We could send this over as a text message and let the server side decode it, but since WebSockets also allow us to send binary data, we convert it to binary. We do this in two steps:
First, since the canvas doesn't give us (or I don't know how to get) direct access to the binary data, I used the convertToBinary function shown above (adapted from a dataUriToBlob snippet I found on the internet), which converts the base64 string into a binary Blob.
The second step is to send the binary image to the server using WebSockets.
Using WebSockets from JavaScript is actually very easy. You just need to specify the WebSocket URL and implement a couple of callback functions. The first thing we need to do is open the connection:
var ws = new WebSocket("wss://www.pradeep.com:8181/WebCamStreaming/livevideo");
ws.onopen = function () {
    console.log("Opened connection to websocket");
};
If the WebSocket connection opens successfully, we need to send the binary image data over it. Using the code below, we extend the timer function to send the binary image over the WebSocket protocol.
timer = setInterval(
    function () {
        ctx.drawImage(video, 0, 0, 320, 240);
        var data = canvas.get()[0].toDataURL('image/jpeg', 1.0);
        var newblob = convertToBinary(data);
        // Send the binary frame to the server over the WebSocket.
        ws.send(newblob);
    }, 250);
Step 3: Server Side - Receive the binary image over WebSockets.
For the server side, we use GlassFish as the WebSocket (JSR 356) implementation.
To implement the server-side code, we need to create a WebSocket endpoint that listens for requests from the client over the WebSocket protocol.
1. Create a dynamic web/Maven project in your IDE (Eclipse/NetBeans) and include the JSR 356 reference JARs.
2. Create a class LiveStream.java in the source folder and add the below snippet of code.
3. Import the following packages:
javax.websocket, which contains the Java EE 7 support for WebSocket.
java.io, which is used for read and write operations.
java.util, which is used to store the list of connected users (or sessions) as collections. These collections are created as static variables to share them among all the WebSocket instances.
4. Add the code below to declare and map the server endpoint. The @ServerEndpoint annotation allows you to declare a WebSocket and define its URL mapping.
5. Define the onMessage action by adding the code below.
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import javax.websocket.EncodeException;
import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/livevideo")
public class LiveStream {
}

The @OnMessage annotation is the core of the WebSocket implementation. This annotated method is invoked when the client sends a message to the server. In this case, when the client sends a binary image to the WebSocket server, the latter pushes it to all connected peers.
6. Define the onOpen and onClose actions; the complete file will then look like the code below.
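The original post showed the finished file as a screenshot; here is a minimal sketch of what the complete endpoint could look like, based on the description above and below (the field and method names are assumptions):

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/livevideo")
public class LiveStream {

    // Static, so the set of sessions is shared among all endpoint instances.
    private static final Set<Session> sessions =
            Collections.synchronizedSet(new HashSet<Session>());

    @OnOpen
    public void onOpen(Session session) {
        // Accept binary messages up to 512 KB (enough for a 640x480 JPEG frame).
        session.setMaxBinaryMessageBufferSize(512 * 1024);
        sessions.add(session);
    }

    @OnMessage
    public void onMessage(ByteBuffer image, Session sender) {
        // Push the received frame to every connected peer.
        synchronized (sessions) {
            for (Session peer : sessions) {
                if (peer.isOpen()) {
                    try {
                        peer.getBasicRemote().sendBinary(image.duplicate());
                    } catch (IOException e) {
                        // A failing peer is cleaned up when its connection closes.
                    }
                }
            }
        }
    }

    @OnClose
    public void onClose(Session session) {
        sessions.remove(session);
    }
}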
The @OnOpen and @OnClose annotations define the lifecycle of the WebSocket. The onOpen action is invoked when a new connection to the WebSocket server is created; similarly, the onClose action is invoked when a connection is closed. In this application, the onOpen action adds the connected user's session to the shared set, so that the same image can be sent to all connected peers. One thing to note in the onOpen action: we configure the maximum binary message buffer size to 512 KB, which enables support for binary messages of that size. Since we don't stream the raw video data directly but send a canvas-rendered image, the messages are rather large; 512 KB, however, is more than enough for 640x480 frames, and our live streaming also works great at a resolution of just 320x240.
Step 4: Client Side - Receive the binary image over WebSockets and render it.
The final step is to receive the binary data sent by the GlassFish server in our web application and render it to an img element. We do this by setting the JavaScript onmessage function on our WebSocket. In the following code, we receive the binary message, convert the data to an object URL (think of it as a local, temporary URL), and set this value as the source of the image. Once the image is loaded, we revoke the object URL, since it is no longer needed.
<body>
<script type="text/javascript">
var ws = new WebSocket("wss://www.pradeep.com:8181/WebCamStreaming/livevideo");
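// What follows is a minimal sketch of the onmessage handler described above;
// the img element with id "target" and the binaryType setting are assumptions,
// not shown in the original post.
ws.binaryType = "blob"; // receive binary frames as Blob objects
ws.onmessage = function (msg) {
    var target = document.getElementById("target");
    // Convert the received Blob to a local, temporary object URL.
    var url = window.webkitURL.createObjectURL(msg.data);
    target.onload = function () {
        // The image has been rendered, so the temporary URL is no longer needed.
        window.webkitURL.revokeObjectURL(url);
    };
    target.src = url;
};
</script>
<img id="target" width="320" height="240"/>
</body>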
As you've seen, we can do a lot with just the new HTML5 APIs. It's too bad that not all of them are finished and that browser support is in some cases a bit lacking, but they do offer us nice and powerful features. :)