Tuesday, February 14, 2023

Creating a client certificate on Linux

1. Generate a private key for the CA certificate

openssl genrsa 2048 > ca-key.pem

2. Generate the CA certificate from the private key

openssl req -new -x509 -nodes -days 1000 -key ca-key.pem > ca-cert.pem

3. Generate the client key and certificate request

openssl req -newkey rsa:2048 -days 1000 -nodes -keyout client-key1.pem > client-req.pem

4. Generate the client certificate, signed by the CA from step 2

openssl x509 -req -in client-req.pem -days 1000 -CA ca-cert.pem -CAkey ca-key.pem -set_serial 01 > client-cert1.pem 

5. Convert the PEM files to PKCS #12 (used by Java, etc.)

openssl pkcs12 -export -in client-cert1.pem -inkey client-key1.pem -out client-cert1.pfx
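Since the PKCS #12 bundle is intended for Java clients, here is a minimal, illustrative sketch that loads client-cert1.pfx into a Java KeyStore and verifies the client certificate against ca-cert.pem. The export password ("changeit") and the working directory are assumptions; adjust them to your setup.

import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;

public class ClientCertCheck {

    public static void main(String[] args) throws Exception {
        // Load the CA certificate created in step 2.
        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        X509Certificate caCert;
        try (FileInputStream in = new FileInputStream("ca-cert.pem")) {
            caCert = (X509Certificate) cf.generateCertificate(in);
        }

        // Load the PKCS #12 bundle created in step 5.
        // "changeit" is a placeholder for the export password you chose.
        KeyStore keyStore = KeyStore.getInstance("PKCS12");
        try (FileInputStream in = new FileInputStream("client-cert1.pfx")) {
            keyStore.load(in, "changeit".toCharArray());
        }

        // The alias depends on how the .pfx was exported; take the first one.
        String alias = keyStore.aliases().nextElement();
        X509Certificate clientCert = (X509Certificate) keyStore.getCertificate(alias);

        // Throws an exception if the certificate was not signed by the CA key.
        clientCert.verify(caCert.getPublicKey());
        System.out.println("Client certificate is valid, subject: "
                + clientCert.getSubjectX500Principal());
    }
}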


Configuring client certificates in nginx

Add the following lines to the nginx config file (ssl_client_certificate points to the CA certificate created in step 2, which nginx uses to verify client certificates):

ssl_client_certificate /home/ubuntu/openssl/CA/ca-cert.pem;
ssl_verify_client optional;

location ~ "^/api/(path check client Certificate 1|path check client Certificate 2)" {

    if ($ssl_client_verify != "SUCCESS") { return 403; }

    proxy_pass http://localhost:8080;
    proxy_pass_request_headers on;
    proxy_set_header X_CUSTOM_HEADER $http_x_custom_header;

    proxy_set_header X-Forwarded-Host $host;
    proxy_set_header X-Forwarded-Server $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
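To test the setup end to end from Java, a rough sketch like the one below (assuming Java 11+ for java.net.http) presents the client certificate during the TLS handshake. The host name, API path, and keystore password are placeholders; with the $ssl_client_verify check above, nginx should return 403 when the certificate is missing or invalid and proxy the request when it verifies.

import java.io.FileInputStream;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.security.KeyStore;
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;

public class MutualTlsClient {

    public static void main(String[] args) throws Exception {
        char[] password = "changeit".toCharArray(); // placeholder export password

        // Load the PKCS #12 bundle created earlier (client key + certificate).
        KeyStore keyStore = KeyStore.getInstance("PKCS12");
        try (FileInputStream in = new FileInputStream("client-cert1.pfx")) {
            keyStore.load(in, password);
        }

        // Present the client certificate during the handshake;
        // server trust uses the JVM defaults.
        KeyManagerFactory kmf =
                KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(keyStore, password);

        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(kmf.getKeyManagers(), null, null);

        HttpClient client = HttpClient.newBuilder().sslContext(sslContext).build();

        // Hypothetical protected path behind the nginx location block above.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/protected-path"))
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        // Expect 200 when the certificate verifies, 403 otherwise.
        System.out.println(response.statusCode());
    }
}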



Tuesday, January 17, 2023

Installing an FTP service with vsftpd on CentOS 7

 sudo yum update

sudo yum install vsftpd

sudo systemctl start vsftpd
sudo systemctl enable vsftpd


sudo firewall-cmd --zone=public --permanent --add-port=21/tcp
sudo firewall-cmd --zone=public --permanent --add-service=ftp
sudo firewall-cmd --reload

Add the following to the end of the file (sudo nano /etc/vsftpd/vsftpd.conf):

anonymous_enable=NO
chroot_local_user=YES
allow_writeable_chroot=YES
userlist_enable=YES
userlist_file=/etc/vsftpd.userlist
userlist_deny=NO
user_sub_token=$USER
local_root=/image/$USER/ftp


sudo systemctl restart vsftpd

sudo adduser testuser
sudo passwd testuser

echo "testuser" | sudo tee -a /etc/vsftpd.userlist


sudo mkdir -p /image/testuser/ftp/upload
sudo chmod 550 /image/testuser/ftp
sudo chmod 750 /image/testuser/ftp/upload
sudo chown -R testuser: /image/testuser/ftp

To connect from a client machine, use the command: ftp <server-ip>
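If you would rather test from Java than from the command-line ftp client, a minimal sketch using the Apache Commons Net library (an extra dependency, not part of this setup) could look like the following. The server IP, password, and file name are placeholders; the user is chrooted to /image/testuser/ftp, so paths are relative to that directory and only the upload subdirectory is writable.

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class FtpUploadTest {

    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("192.0.2.10");            // placeholder server IP
        ftp.login("testuser", "password");    // the account created above
        ftp.enterLocalPassiveMode();
        ftp.setFileType(FTP.BINARY_FILE_TYPE);

        // Upload a small test file into the writable upload directory.
        InputStream data =
                new ByteArrayInputStream("hello ftp".getBytes(StandardCharsets.UTF_8));
        boolean ok = ftp.storeFile("upload/hello.txt", data);
        System.out.println("Upload succeeded: " + ok);

        ftp.logout();
        ftp.disconnect();
    }
}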






Tuesday, October 20, 2020

Webcam Live video streaming with WebSockets using GlassFish

 

WebCam Live Video Streaming with WebSockets.

In this blog I will show you how to do live video streaming using WebSockets, with the Web-RTC specification on the front end and GlassFish as the server-side implementation of JSR 356.

Introduction to WebSockets in Java EE.

    WebSocket is a standard web technology that simplifies much of the complexity of bidirectional communication and connection management between clients and a server. It maintains a persistent connection over which the client and the server can read and write messages at the same time. WebSocket was introduced as part of the HTML5 initiative, defining a JavaScript interface that browsers use to transfer and receive data over WebSocket.

    Here are some of the benefits of WebSocket: 
  • Full duplex client-server communication.
  • Integration with other Java EE technologies. You can inject objects and Enterprise JavaBeans (EJB) by using components such as Contexts and Dependency Injection (CDI).
  • Low-level communication (works on the underlying TCP/IP connection).
  • Low latency communication.
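To make this concrete, here is a minimal, hypothetical JSR 356 client endpoint written against the javax.websocket API. The ws://localhost:8080/echo URL is a placeholder, and running it requires a JSR 356 implementation (such as Tyrus, which ships with GlassFish) on the classpath.

import java.io.IOException;
import java.net.URI;

import javax.websocket.ClientEndpoint;
import javax.websocket.ContainerProvider;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.WebSocketContainer;

// The container calls back into the annotated methods, and the same Session
// can be used to send messages: this is the full-duplex behaviour described above.
@ClientEndpoint
public class EchoClient {

    @OnOpen
    public void onOpen(Session session) {
        try {
            session.getBasicRemote().sendText("hello from the client");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @OnMessage
    public void onMessage(String message) {
        System.out.println("Received: " + message);
    }

    public static void main(String[] args) throws Exception {
        WebSocketContainer container = ContainerProvider.getWebSocketContainer();
        // Placeholder URL; any JSR 356 compliant endpoint will do.
        container.connectToServer(EchoClient.class, URI.create("ws://localhost:8080/echo"));
        Thread.sleep(5000); // keep the JVM alive long enough to receive a reply
    }
}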


Tech stack used in the implementation :

Front End :

  • Web-RTC: The specification page says: "These APIs should enable building applications that can be run inside a browser, requiring no extra downloads or plugins, which allow communication between parties using audio, video and supplementary real-time communication, without having to use intervening servers (unless needed for firewall traversal, or for providing intermediary services)."
  • WebSockets: To enable Web applications to maintain bidirectional communications with server-side processes, this specification introduces the WebSocket interface.
  • Canvas: an HTML5 feature used to draw graphics on the fly on the web page.

Back End :

  • GlassFish: used as the server-side JSR 356 (Java API for WebSocket) implementation.

Front End Development :

Step 1: Enable MediaStream in Chrome for Web-RTC and access the webcam.

Let's start. The first thing we need to check is that we have the latest version of Chrome, or at least a version that supports the Web-RTC specification.

Pre-check: If your Chrome version is 18 or lower, you need to enable the media stream in Chrome for Web-RTC. If you are using a newer version, you can skip this step.

To enable the media stream in Chrome, type "chrome://flags/" in a new tab and enable the MediaStream feature:




Access your webcam :

After doing the above steps (if required), we can access the webcam using the Web-RTC feature of the browser.
All we need to do is add the HTML code and JavaScript below.




    
      <div>
              <video id="live" width="320" height="240" autoplay="autoplay"
                     style="display: inline;"></video>
       </div>



and add the below JavaScript code:



    
<script type="text/javascript">
              var video = $("#live").get()[0];
              var options = {
                     "video" : true
              };
                                  
              // use the chrome specific GetUserMedia function
              navigator.webkitGetUserMedia(options, function(stream) {
                     video.src = webkitURL.createObjectURL(stream);
              }, function(err) {
                     console.log("Unable to get video stream!")
              })
</script>



With the above small piece of HTML and JavaScript we can access the user's webcam and show the stream in the HTML5 video element. We do this by first requesting access to the webcam using the getUserMedia function (prefixed with the Chrome-specific webkit prefix).


 In the callback we pass in, we get access to a stream object. This stream object is the stream from the user's webcam. To show this stream we need to attach it to the video element. The src attribute of the video element allows us to specify a URL to play.


With another new HTML5 feature we can convert the stream to a URL. This is done using the URL.createObjectURL function (once again prefixed). The result of this function is a URL which we attach to the video element.

The above is all it takes to get access to the stream of a user's webcam:




Step 2: Send the stream to the GlassFish server over WebSockets.


In this step we want to take the data from the stream and send it as binary data over a WebSocket to the GlassFish server.

In theory this sounds simple. We've got a binary stream of video information, so we should be able to just access the bytes and, instead of streaming the data to the video element, stream it over a WebSocket to our remote server.

In practice, though, this doesn't work. The stream object we receive from the getUserMedia call doesn't offer a way to access its data as a stream; or, better said, this feature is not yet available. So we need to find an alternative.

For this, as of now, I can think of just one option:

  1. Take a snapshot of the current video.
  2. Paint this to the canvas element.
  3. Grab the data from the canvas as an image.
  4. Send the image data over websockets.
To execute the above steps we need a small workaround, which causes a lot of extra processing on the client side, since many calls are made and a lot of data is sent to and from the server.

To implement it we need to write the below code.



<div>
              <video id="live" width="320" height="240" autoplay="autoplay"
                     style="display: inline;"></video>
              <canvas width="320" id="canvas" height="240" style="display: inline;"></canvas>
       </div>

<script type="text/javascript">
              var video = $("#live").get()[0];
             var canvas = $("#canvas");
             var ctx = canvas.get()[0].getContext('2d');
              var options = {
                     "video" : true
              };
             
              // use the chrome specific GetUserMedia function
              navigator.webkitGetUserMedia(options, function(stream) {
                     video.src = webkitURL.createObjectURL(stream);
              }, function(err) {
                     console.log("Unable to get video stream!")
              })
             
               timer = setInterval(
            function () {
                ctx.drawImage(video, 0, 0, 320, 240);
            }, 250);
</script>
   



Ok, so in the above code we have added a timer, which will run every 250 milliseconds and paint the canvas with the image taken from the video element.




See the screenshot above for how it looks. Obviously you will see a delay on the canvas, which you can reduce by decreasing the timer interval, but that requires more resources and means we will hit the server more frequently over the WebSocket.


Next step: capture the image from the canvas in binary format and then send the binary data over the WebSocket.

To capture the image data, we will extend the timer function and add the code below to get the image data in binary format.




         timer = setInterval(
            function () {
                ctx.drawImage(video, 0, 0, 320, 240);
                var data = canvas.get()[0].toDataURL('image/jpeg', 1.0);
                newblob = convertToBinary(data);
          }, 250);
   
             
              function convertToBinary (dataURI) {
                  // convert base64 to raw binary data held in a string
                  // doesn't handle URLEncoded DataURIs
                  var byteString = atob(dataURI.split(',')[1]);

                  // separate out the mime component
                  var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0]

                  // write the bytes of the string to an ArrayBuffer
                  var ab = new ArrayBuffer(byteString.length);
                  var ia = new Uint8Array(ab);
                  for (var i = 0; i < byteString.length; i++) {
                      ia[i] = byteString.charCodeAt(i);
                  }

                  // write the ArrayBuffer to a blob, and you're done
                  var bb = new Blob([ab]);
                  return bb;
              }
   



The code above copies the content of the current canvas into a data URI (via toDataURL), a string containing base64-encoded binary data.

We could send this over as a text message and let the server side decode it, but since WebSockets also allow us to send binary data, we'll convert it to binary.

We need to do this in two steps:


  • Since canvas doesn't give us direct access to the binary data (or I don't know how), I used the JavaScript convertToBinary function above (based on a dataUriToBlob snippet I found on the internet), which converts the string into a binary Blob.
  • The second step is to send the binary image to the server using WebSockets.
Using WebSockets from JavaScript is actually very easy. You just need to specify the WebSocket URL and implement a couple of callback functions. The first thing we need to do is open the connection:



var ws = new WebSocket("wss://www.pradeep.com:8181/WebCamStreaming/livevideo");
                  ws.onopen = function () {
                            console.log("Openened connection to websocket");
                  }



If the WebSocket connection is opened successfully, we need to send the binary image data over it. We extend the timer code as below to send the binary image over the WebSocket protocol:


        timer = setInterval(
            function () {
                ctx.drawImage(video, 0, 0, 320, 240);
                var data = canvas.get()[0].toDataURL('image/jpeg', 1.0);
                newblob = convertToBinary(data);
                ws.send(newblob);
            }, 250);



Step 3: Server side - Receive the binary image over WebSockets in binary format.

For the server side, we are using GlassFish for the WebSocket implementation.


  1. To implement the server-side code, we need to create a WebSocket endpoint which will listen for requests from the client over the WebSocket protocol.
  2. Create a dynamic web/Maven project in your IDE (Eclipse/NetBeans) and include the JSR 356 (Java API for WebSocket) reference jars.
  3. Create a class LiveStream.java in the source folder and add the below snippet of code.
             


import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import javax.websocket.EncodeException;
import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

public class LiveStream {

       private static final Set<Session> sessions = Collections
                     .synchronizedSet(new HashSet<Session>());

}

Note that the following packages are required: 
  • javax.websocket, which contains the Java EE 7 support for WebSocket.
  • java.io, which is used for read and write operations.
  • java.util, which is used to store the list of connected users (or sessions) as collections. These collections are created as static variables to share them among all the WebSocket instances.








     4. Add the following code to declare and map the server endpoint:


    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.util.Collections;
    import java.util.HashSet;
    import java.util.Set;

    import javax.websocket.EncodeException;
    import javax.websocket.OnClose;
    import javax.websocket.OnMessage;
    import javax.websocket.OnOpen;
    import javax.websocket.Session;
    import javax.websocket.server.ServerEndpoint;



    @ServerEndpoint("/livevideo")


    public class LiveStream {

          private static final Set<Session> sessions = Collections
                         .synchronizedSet(new HashSet<Session>());
    }


    The @ServerEndpoint annotation allows you to declare a WebSocket endpoint and define its URL mapping.

        5. Define the onMessage action by adding the code below:



    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.util.Collections;
    import java.util.HashSet;
    import java.util.Set;

    import javax.websocket.EncodeException;
    import javax.websocket.OnClose;
    import javax.websocket.OnMessage;
    import javax.websocket.OnOpen;
    import javax.websocket.Session;
    import javax.websocket.server.ServerEndpoint;


    @ServerEndpoint("/livevideo")
    public class LiveStream {
          private static final Set<Session> sessions = Collections
                         .synchronizedSet(new HashSet<Session>()); 
             

           @OnMessage
           public void processVideo(byte[] imageData, Session session) {
                  System.out.println("Inside processVideo");
                  try {
                         // Wrap the byte array into a buffer
                         ByteBuffer buf = ByteBuffer.wrap(imageData);

                         for (Session session2 : sessions) {
                                session2.getBasicRemote().sendBinary(buf);
                         }

                  } catch (Throwable ioe) {
                         System.out.println("Error sending message " + ioe.getMessage());
                  }
           }
    }
    The @OnMessage annotation is the core of the WebSocket implementation. This annotated method is invoked when the client sends a message to the server. In this case, when the client sends a binary image to the WebSocket server, the latter pushes it to all connected peers.

    6. Define the onOpen and onClose actions; the complete file will look like below:



           package com.livestream;

    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.util.Collections;
    import java.util.HashSet;
    import java.util.Set;

    import javax.websocket.EncodeException;
    import javax.websocket.OnClose;
    import javax.websocket.OnMessage;
    import javax.websocket.OnOpen;
    import javax.websocket.Session;
    import javax.websocket.server.ServerEndpoint;

    @ServerEndpoint("/livevideo")
    public class LiveStream {

           private static final Set<Session> sessions = Collections
                         .synchronizedSet(new HashSet<Session>());


           @OnOpen
           public void whenOpening(Session session) throws IOException, EncodeException {
                  session.setMaxBinaryMessageBufferSize(1024*512);
                  sessions.add(session);
          }

           @OnMessage
           public void processVideo(byte[] imageData, Session session) {
                  System.out.println("INsite process Video");
                  try {
                         // Wrap a byte array into a buffer
                         ByteBuffer buf = ByteBuffer.wrap(imageData);
    //                   imageBuffers.add(buf);
                        
                         for(Session session2 : sessions){
                               session2.getBasicRemote().sendBinary(buf);
                         }
                        
                        
                  } catch (Throwable ioe) {
                         System.out.println("Error sending message " + ioe.getMessage());
                  }
           }

           @OnClose
           public void whenClosing(Session session) {
                  System.out.println("Goodbye !");
                  sessions.remove(session);
           }

    }




    The @OnOpen and @OnClose annotations define the lifecycle of the WebSocket. The OnOpen action is invoked when a new connection to the WebSocket server is created. Similarly, the OnClose action is invoked when a connection to the WebSocket server is closed. In this application, the OnOpen action adds the connected user's session to the sessions set, so that the same image can be sent to all connected peers.

    One thing to note in the OnOpen action: we have configured the maximum binary message buffer to 512 KB, which enables support for larger binary messages. Our WebSocket can now receive binary messages up to 512 KB; since we don't directly stream the data but send a canvas-rendered image, the message size is rather large. 512 KB, however, is more than enough for 640x480 frames, and our live streaming works great with a resolution of just 320x240, so this should be enough.



    Step 4: Client side - Receive the binary image over WebSockets and render it.

    The final step is to receive the binary data sent by the GlassFish server in our web application and render it in an img element.

    We do this by setting the JavaScript onmessage function on our WebSocket. In the following code we receive the binary message, convert the data to an object URL (think of it as a local, temporary URL), and set this value as the source of the image. Once the image is loaded, we revoke the object URL since it is no longer needed.




    <body>
           <script type="text/javascript">
                  var ws = new WebSocket("wss://www.pradeep.com:8181/WebCamStreaming/livevideo");
                  ws.onmessage = function(msg) {
                         var target = document.getElementById("target");
                         url = window.webkitURL.createObjectURL(msg.data);
                         target.onload = function() {
                               window.webkitURL.revokeObjectURL(url);
                         };
                         target.src = url;
                  }
           </script>
           <div style="visibilityhiddenwidth0height0;">
                  <canvas width="320" id="canvas" height="240"></canvas>
           </div>

           <div>
                  <img id="target" style="displayinline;" />
           </div>
    </body>


    Hit the endpoint and there you go :






    As you've seen, we can do a lot with just the new HTML5 APIs. It's too bad that not all of them are finished and browser support is in some cases a bit lacking, but they do offer us nice and powerful features. :)