
[Node.js][TypeScript] Play with WebRTC

Intro

I finally got to try WebRTC.
I used the code that I had written last time.
[Nest.js] Use WebSocket with ws

First, I accessed the camera and mic from the web browser.
After that, I opened two client pages and connected them with WebSocket and WebRTC.

I referred to the MDN sample.
samples-server/s/webrtc-from-chat at master · mdn/samples-server · GitHub
I also referred to this:
Building a WebRTC video broadcast using Javascript

Access Camera and Mic

Adapter.js

According to the MDN documents, I should install Adapter.js for interoperability between web browsers.
WebRTC API - Web APIs | MDN
GitHub - webrtc/adapter: READ ONLY FORK: Shim to insulate apps from spec changes and prefix differences. Latest adapter.js release:
But at least this time, I felt I didn't have to do that, so I didn't install it.

getUserMedia

I could access the camera and mic with "getUserMedia".

[client-side] index.html


<!DOCTYPE html>
<html>
    <head>
        <title>Index page</title>
        <link rel="stylesheet" type="text/css" href="/css/style.css" />
    </head>
    <body>
        <div id="main-title">Index page</div>
        <video id="local-video" muted>Video stream not available.</video>
        <script src="/js/main.bundle.js"></script>
    </body>
</html>

[client-side] main.ts


import { RtcSample } from "./rtc-sample";
const sample = new RtcSample();
function init() {
    sample.initVideo();
    sample.connectWebSocket();
}
init();

[client-side] rtc-sample.ts


export class RtcSample {
    private webcamStream: MediaStream|null = null;
    constructor() {
    }
    public initVideo(){
        const localVideo = document.getElementById('local-video') as HTMLVideoElement;
        let streaming = false;
        // set video view size
        localVideo.addEventListener('canplay', ev => {
            if (streaming === false) {
                const width = 320;
                const height = localVideo.videoHeight / (localVideo.videoWidth/width);
                localVideo.setAttribute('width', width.toString());
                localVideo.setAttribute('height', height.toString());
                streaming = true;
            }
          }, false);
        // access Camera and Mic
        navigator.mediaDevices.getUserMedia({ video: true, audio: true })
        .then(stream => {
            this.webcamStream = stream;
            localVideo.srcObject = stream;
            localVideo.play();
            streaming = true;
        })
        .catch(function(err) {
            console.error("An error occurred: " + err);
        });
    }
}

After I allowed access, the video element started playing the stream from the camera and mic.


I had to mute the element to avoid howling (audio feedback).

index.html


...
        <video id="local-video" muted>Video stream not available.</video>
...

Getting started with media devices | WebRTC
Taking still photos with WebRTC - Web API | MDN

'getUserMedia' of undefined

When I opened the page with "localhost", I had no problems. But when I opened it with the IP address, I got an error.
Uncaught TypeError: Cannot read property 'getUserMedia' of undefined
This was because web browsers block access to the camera and mic for security reasons unless the page is served from "http://localhost" or over "https". And because I didn't have an external web camera, I used two tabs of the same browser to open the two pages.
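
A check like the following could have avoided the confusing error message. This is only a minimal sketch and not part of the sample; the "tryGetUserMedia" helper is something I made up here for illustration.


// Sketch: guard against "navigator.mediaDevices" being missing on insecure origins.
async function tryGetUserMedia(): Promise<MediaStream|null> {
    if (!window.isSecureContext || !navigator.mediaDevices) {
        // Opened via plain "http://<ip-address>", so the API is not exposed.
        console.error('getUserMedia is only available over https or on http://localhost');
        return null;
    }
    try {
        return await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
    } catch(err) {
        console.error(`An error occurred: ${err}`);
        return null;
    }
}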

Connect two clients

WebSocket

At least this time, the server-side code didn't do anything special for WebRTC. It just used WebSocket and relayed messages to the other clients.

[server-side] events.gateway.ts


import { SubscribeMessage, WebSocketGateway, WebSocketServer, WsResponse } from '@nestjs/websockets';
import { Server } from 'ws';

@WebSocketGateway()
export class EventsGateway {
  @WebSocketServer()
  server: Server;
  
  @SubscribeMessage('message')
  handleMessage(client: any, payload: any) {
    this.server.clients.forEach(s => {
      if (s == client) {
        return;
      }
      s.send(JSON.stringify(payload));
    });
  }
}

[client-side] rtc-sample.ts


export class RtcSample{
    private wsConnection: WebSocket|null = null;
    private readonly myHostname: string;
    private webcamStream: MediaStream|null = null;
    
    constructor() {
        this.myHostname = window.location.hostname;    
    }
...
    public connectWebSocket(){
        this.wsConnection = new WebSocket(`ws://${this.myHostname}:3000`, 'json');
        this.wsConnection.onopen = () => {
            console.log('ws opened');
        };
        this.wsConnection.onmessage = (ev: MessageEvent) => {
            console.log(ev.data);
        }
    }
}

Create RTCPeerConnection

I added a button to start the RTC connection.

[client-side] index.html


<!DOCTYPE html>
<html>
    <head>
        <title>Index page</title>
        <link rel="stylesheet" type="text/css" href="/css/style.css" />
    </head>
    <body>
        <div id="main-title">Index page</div>
        <button id="offer-button" onclick="Page.sendOffer()">Offer</button>
        <video id="local-video" muted>Video stream not available.</video>
        <script src="/js/main.bundle.js"></script>
    </body>
</html>

[client-side] main.ts


import { RtcSample } from "./rtc-sample";
const sample = new RtcSample();
function init() {
...
}
export function sendOffer() {
    sample.invite();
}
init();

[client-side] rtc-sample.ts


export class RtcSample{
    private wsConnection: WebSocket|null = null;
    private readonly myHostname: string;
    private webcamStream: MediaStream|null = null;
    private myPeerConnection: RTCPeerConnection|null = null;
...
    public invite() {
      this.myPeerConnection = this.createPeerConnection();
      
      this.webcamStream.getTracks().forEach(
        track => {
          this.myPeerConnection.addTrack(track, this.webcamStream);
        }
      );
    }
    private createPeerConnection(): RTCPeerConnection {
        const newPeerConnection = new RTCPeerConnection({
          iceServers: [{
              urls: `stun:stun.l.google.com:19302`,  // A STUN server              
            }]
        });
        // add events
        // to avoid 'this' becoming undefined in the event handlers, I pass the RtcSample or RTCPeerConnection instance explicitly.
        newPeerConnection.onicecandidate = (ev) => this.handleICECandidateEvent(ev, this);
        newPeerConnection.oniceconnectionstatechange = (ev) => this.handleICEConnectionStateChangeEvent(ev, newPeerConnection);
        newPeerConnection.onsignalingstatechange = (ev) => this.handleSignalingStateChangeEvent(ev, newPeerConnection);
        newPeerConnection.onnegotiationneeded = () => this.handleNegotiationNeededEvent(newPeerConnection);
        newPeerConnection.ontrack = this.handleTrackEvent;
        return newPeerConnection;
    }
}

These samples skip null checks to keep them simple.
RTCPeerConnection - Web APIs | MDN
Getting started with peer connections | WebRTC
In this sample, I didn't actually need a STUN server, because (as explained in the 'getUserMedia' section above) both clients were tabs of the same Chrome on the same machine.
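
So in this local-only case, an RTCPeerConnection without any ICE servers would also have worked; this is just a sketch of that idea, not the configuration used in the sample above.


// Sketch: for two tabs on the same machine, host candidates are enough,
// so an empty iceServers list would also connect. Not the config used above.
const localOnlyConnection = new RTCPeerConnection({ iceServers: [] });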

Events

After the RTCPeerConnection was instantiated and the tracks taken from the MediaStream were added, the event handlers were called. The order of execution was as follows.

ClientA (called "invite()" and sent offer to ClientB)

  • onnegotiationneeded
  • onsignalingstatechange (became "have-local-offer")
  • onicecandidate
  • oniceconnectionstatechange (became "checking")
  • handleVideoAnswerMsg
  • handleNewICECandidateMsg
  • oniceconnectionstatechange (became "connected")
  • onsignalingstatechange (became "stable")
  • handleNewICECandidateMsg
  • onicecandidate
"handleVideoAnswerMsg" and "handleNewICECandidateMsg" weren't event handler. It was called by ClientB through the WebSocket.

onnegotiationneeded

The connection process started from this event. It occurred when the RTCPeerConnection needed session negotiation. In this sample, the event occurred when the tracks taken from the MediaStream were added.
RTCPeerConnection.onnegotiationneeded - Web APIs | MDN

[client-side] rtc-sample.ts


...
  private sendToServer(msg: VideoOffer|Candidate) {
    const message = {
      event: 'message',
      data: msg,
    };
    this.wsConnection.send(JSON.stringify(message));
  }
  private async handleNegotiationNeededEvent(connection: RTCPeerConnection) {
    try {
      const offer = await connection.createOffer();

      if (connection.signalingState != 'stable') {
        return;
      }
      await connection.setLocalDescription(offer);

      this.sendToServer({
        type: 'video-offer',
        sdp: connection.localDescription,
      });
    } catch(err) {
      console.error(err);
    };
  }
...

[client-side] video-offer.ts


export type VideoOffer = {
    type: 'video-offer'|'video-answer',
    sdp: RTCSessionDescription|null,
};

[client-side] candidate.ts


export type Candidate = {
    type: 'new-ice-candidate',
    candidate: RTCIceCandidateInit|null
};

"handleNegotiationNeededEvent" did two things. "createOffer" and "setLocalDescription".

createOffer

It created an SDP (Session Description Protocol) offer to set up a new WebRTC connection. It returned an "RTCSessionDescriptionInit", which was then used by "setLocalDescription".
RTCPeerConnection.createOffer() - Web APIs | MDN
SDP - MDN Web Docs Glossary: Definitions of Web-related terms | MDN
Introduction to WebRTC protocols - Web APIs | MDN

setLocalDescription

"Description"(RTCSessionDescription) had "type" and "sdp". "type" was "offer" or "answer". ClientA's "type" was "offer". RTCPeerConnection.localDescription - Web APIs | MDN RTCSessionDescription - Web APIs | MDN "Description" was set into itself and sent to ClientB through the WebSocket.

Receive the offer (ClientB)

[client-side] rtc-sample.ts


...
  public connectWebSocket(){
    this.wsConnection = new WebSocket(`ws://${this.myHostname}:3000`, 'json');
    this.wsConnection.onopen = () => {
      console.log('ws opened');
    };
    this.wsConnection.onmessage = (ev: MessageEvent) => {
      const payload = JSON.parse(ev.data);
      switch(payload.type) {
        case 'video-offer':
          this.handleVideoOfferMsg(payload);
          break;
        case 'video-answer':
          this.handleVideoAnswerMsg(payload);
          break;
        case 'new-ice-candidate':
          this.handleNewICECandidateMsg(payload);
          break;
        default:
          console.error('type was not found');
          break;
      }
    };
  }
...
private async handleVideoOfferMsg(payload: VideoOffer) {
  if (this.myPeerConnection == null) {
    this.myPeerConnection = this.createPeerConnection();
  }
  const remoteDescription = new RTCSessionDescription(payload.sdp);  
  if (this.myPeerConnection.signalingState != 'stable') {
    await Promise.all([
      this.myPeerConnection.setLocalDescription({type: 'rollback'}),
      this.myPeerConnection.setRemoteDescription(remoteDescription)
    ]);
    return;
  } else {
    await this.myPeerConnection.setRemoteDescription(remoteDescription);
  }  
  // In this sample, webcamStream wasn't null.
  if (this.webcamStream == null) {
    try {
      this.webcamStream = await navigator.mediaDevices.getUserMedia({audio: true, video: true});
    } catch(err) {
      console.error(err);
      return;
    }
    const localVideo = document.getElementById('local-video') as HTMLVideoElement;
    localVideo.srcObject = this.webcamStream;      
  }
  try {
    this.webcamStream.getTracks().forEach(
      track => this.myPeerConnection.addTrack(track, this.webcamStream));
  } catch(err) {
    console.error(err);
  }
  
  await this.myPeerConnection.setLocalDescription(await this.myPeerConnection.createAnswer());
  
  this.sendToServer({
    type: 'video-answer',
    sdp: this.myPeerConnection.localDescription,
  });
}

This method did five or six things:
  • Because ClientB hadn't instantiated an "RTCPeerConnection" yet, it instantiated one.
  • If the "MediaStream" (webcamStream) was null, it was also created.
  • The "RTCSessionDescription" sent by ClientA was set with "setRemoteDescription".
  • The tracks taken from the "MediaStream" were added into the "RTCPeerConnection", the same as on ClientA.
  • An answer was created and set as the local description.
  • The "RTCSessionDescription" was sent back to ClientA.

onicecandidate (ClientA)

[client-side] rtc-sample.ts


...
  private handleICECandidateEvent(event: RTCPeerConnectionIceEvent, self: RtcSample) {
    if (event.candidate) {
      self.sendToServer({
        type: 'new-ice-candidate',
        candidate: event.candidate
      });
    }
  }
...

ClientA (and ClientB) needed to establish the connection route through ICE (Interactive Connectivity Establishment). The route was built from ICE candidates, and ClientA and ClientB had to exchange their candidates with each other, so this event was fired several times.
Getting started with peer connections | WebRTC
RTCPeerConnection.onicecandidate - Web APIs | MDN
Introduction to WebRTC protocols - Web APIs | MDN
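
When the two clients are not on the same machine, the candidates gathered here depend on the configured STUN/TURN servers. The configuration below is only a sketch; the TURN URL and credentials are placeholders, not servers I actually used.


// Sketch: STUN plus TURN configuration for peers behind NAT (placeholder TURN values).
const natTraversalConnection = new RTCPeerConnection({
    iceServers: [
        { urls: 'stun:stun.l.google.com:19302' },
        { urls: 'turn:turn.example.com:3478', username: 'user', credential: 'pass' },
    ],
});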

Receive the ICE Candidate (ClientB)

[client-side] rtc-sample.ts


...
  private async handleNewICECandidateMsg(msg: Candidate) {
    const candidate = new RTCIceCandidate(msg.candidate);
  
    try {
      await this.myPeerConnection.addIceCandidate(candidate)
    } catch(err) {
      console.error(err);
    }
  }
...

This method added the received ICE candidates to the peer connection.

Receive the answer (ClientA)

[client-side] rtc-sample.ts


...
  private async handleVideoAnswerMsg(msg: VideoOffer) {
    const remoteDescription = new RTCSessionDescription(msg.sdp);
    await this.myPeerConnection.setRemoteDescription(remoteDescription)
      .catch(err => console.error(err));
  }
...

This method just set the "RTCSessionDescription" sent from ClientB as the remote description. After that, both ClientA and ClientB had a LocalDescription and a RemoteDescription, and ClientB started sending its ICE candidates to ClientA.
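
At that point, the signaling state on both sides should be back to "stable". A quick check like this (a sketch, not something in the sample) can confirm it.


// Sketch: after both descriptions are set, the signaling state should be 'stable' again.
function assertStableState(connection: RTCPeerConnection): void {
    if (connection.signalingState !== 'stable') {
        console.warn(`Unexpected signaling state: ${connection.signalingState}`);
    }
}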

Connected

After all of that finished, "oniceconnectionstatechange" was called and the state became "connected".

[client-side] rtc-sample.ts


...
  private handleICEConnectionStateChangeEvent(event: Event, connection: RTCPeerConnection) {
    switch(connection.iceConnectionState) {
      case 'closed':
      case 'failed':
      case 'disconnected':
        this.closeVideoCall();
        break;
    }
  }
...
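
The handler above only reacts to failures. If I also wanted to do something when the connection is established, I could watch for the "connected" state; this is a sketch under that assumption, not code from the sample.


// Sketch: log once the ICE connection reaches the 'connected' state.
function logWhenConnected(connection: RTCPeerConnection): void {
    connection.addEventListener('iceconnectionstatechange', () => {
        if (connection.iceConnectionState === 'connected') {
            console.log('*** Peers are connected; media should be flowing now');
        }
    });
}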

Full code (rtc-sample.ts)

[client-side] rtc-sample.ts


import { VideoOffer } from "./video-offer";
import { Candidate } from "./candidate";

export class RtcSample{
  private wsConnection: WebSocket|null = null;
  private myPeerConnection: RTCPeerConnection|null = null;
  private readonly myHostname: string;
  private webcamStream: MediaStream|null = null;

  constructor() {
    this.myHostname = window.location.hostname;
  }    
  public initVideo(){
      const localVideo = document.getElementById('local-video') as HTMLVideoElement;
      let streaming = false;
      localVideo.addEventListener('canplay', ev => {
          if (streaming === false) {
              const width = 320;
            const height = localVideo.videoHeight / (localVideo.videoWidth/width);          
            localVideo.setAttribute('width', width.toString());
            localVideo.setAttribute('height', height.toString());
            streaming = true;
          }
        }, false);
      navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then(stream => {
          this.webcamStream = stream;
          localVideo.srcObject = stream;
          localVideo.play();
          streaming = true;
      })
      .catch(function(err) {
          console.error(`An error occurred: ${err}`);
      });
  }
  public connectWebSocket(){
    this.wsConnection = new WebSocket(`ws://${this.myHostname}:3000`, 'json');
    this.wsConnection.onopen = () => {
      console.log('ws opened');
    };
    this.wsConnection.onmessage = (ev: MessageEvent) => {
      const payload = JSON.parse(ev.data);
      switch(payload.type) {
        case 'video-offer':
          this.handleVideoOfferMsg(payload);
          break;
        case 'video-answer':
          this.handleVideoAnswerMsg(payload);
          break;
        case 'new-ice-candidate':
          this.handleNewICECandidateMsg(payload);
          break;
        default:
          console.error('type was not found');
          break;
      }
    };
  }
  public invite() {
    this.myPeerConnection = this.createPeerConnection();

    if (this.webcamStream == null) {
      console.error('webcam stream was null');
      return;
    }    
    this.webcamStream.getTracks().forEach(
      track => {
        if (this.myPeerConnection == null) {
          console.error('peer connection was null');
          return;
        }
        if (this.webcamStream == null) {
          console.error('webcam stream was null');
          return;
        }
        this.myPeerConnection.addTrack(track, this.webcamStream);
      }
    );
  }
  private createPeerConnection(): RTCPeerConnection {
      const newPeerConnection = new RTCPeerConnection({
        iceServers: [{
            urls: `stun:stun.l.google.com:19302`,  // A STUN server              
          }]
      });
      newPeerConnection.onicecandidate = (ev) => this.handleICECandidateEvent(ev, this);
      newPeerConnection.oniceconnectionstatechange = (ev) => this.handleICEConnectionStateChangeEvent(ev, newPeerConnection);
      newPeerConnection.onsignalingstatechange = (ev) => this.handleSignalingStateChangeEvent(ev, newPeerConnection);
      newPeerConnection.onnegotiationneeded = () => this.handleNegotiationNeededEvent(newPeerConnection);
      newPeerConnection.ontrack = this.handleTrackEvent;
      return newPeerConnection;
  }
  private sendToServer(msg: VideoOffer|Candidate) {
    if (this.wsConnection == null) {
      console.error('ws connection was null');
      return;
    }
    const message = {
      event: 'message',
      data: msg,
    };
    this.wsConnection.send(JSON.stringify(message));
  }
  private async handleNegotiationNeededEvent(connection: RTCPeerConnection) {
    if (connection == null) {
      console.error('connection was null');
      return;
    }

    try {
      const offer = await connection.createOffer();

      if (connection.signalingState != 'stable') {
        console.log("     -- The connection isn't stable yet; postponing...")
        return;
      }
      await connection.setLocalDescription(offer);

      this.sendToServer({
        type: 'video-offer',
        sdp: connection.localDescription,
      });
    } catch(err) {
      console.error(err);
    };
  }
  private handleICECandidateEvent(event: RTCPeerConnectionIceEvent, self: RtcSample) {
    if (event.candidate) {
      console.log(`*** Outgoing ICE candidate: ${event.candidate.candidate}`);
  
      self.sendToServer({
        type: 'new-ice-candidate',
        candidate: event.candidate
      });
    }
  }
  private handleICEConnectionStateChangeEvent(event: Event, connection: RTCPeerConnection) {
    if (connection == null) {
        console.error('rtc connection was null');
        return;
    }
    console.log(`*** ICE connection state changed to ${connection.iceConnectionState}`);
  
    switch(connection.iceConnectionState) {
      case 'closed':
      case 'failed':
      case 'disconnected':
        this.closeVideoCall();
        break;
    }
  }
  private handleSignalingStateChangeEvent(event: Event, connection: RTCPeerConnection) {
    if (connection == null) {
        console.error('rtc connection was null');
        return;
    }
    console.log(`*** WebRTC signaling state changed to: ${connection.signalingState}`);
    switch(connection.signalingState) {
      case 'closed':
        this.closeVideoCall();
        break;
    }
  }
  private handleTrackEvent(event: RTCTrackEvent) {
    const receivedVideo = document.getElementById('received-video') as HTMLVideoElement;
    if (receivedVideo == null) {
        console.error('received-video element was not found');
        return;
    }
    receivedVideo.srcObject = event.streams[0];
    
  }
  public closeVideoCall() {
    const localVideo = document.getElementById('local-video') as HTMLVideoElement;

    if (localVideo == null) {
        console.error('local-video element was not found');
        return;
    }
  
    if (this.myPeerConnection) {
      console.log('--> Closing the peer connection');
  
      this.myPeerConnection.ontrack = null;
      this.myPeerConnection.onicecandidate = null;
      this.myPeerConnection.oniceconnectionstatechange = null;
      this.myPeerConnection.onsignalingstatechange = null;
      this.myPeerConnection.onnegotiationneeded = null;
  
      if (localVideo.srcObject) {
        localVideo.pause();
        (localVideo.srcObject as MediaStream).getTracks().forEach(track => {
          track.stop();
        });
      }  
      this.myPeerConnection.close();
      this.myPeerConnection = null;
      this.webcamStream = null;
    }
  }
  private async handleVideoOfferMsg(payload: VideoOffer) {
    if (payload.sdp == null) {
      console.error('sdp was null');
      return;
    }  
    if (this.myPeerConnection == null) {
      this.myPeerConnection = this.createPeerConnection();
      if (this.myPeerConnection == null) {
        console.error('failed creating Peer connection');
        return;
      }
    }
    
    const remoteDescription = new RTCSessionDescription(payload.sdp);
    
    if (this.myPeerConnection.signalingState != "stable") {
      console.log("  - But the signaling state isn't stable, so triggering rollback");
  
      await Promise.all([
        this.myPeerConnection.setLocalDescription({type: 'rollback'}),
        this.myPeerConnection.setRemoteDescription(remoteDescription)
      ]);
      return;
    } else {
      console.log ('  - Setting remote description');
      await this.myPeerConnection.setRemoteDescription(remoteDescription);
    }
  
    if (this.webcamStream == null) {
      try {
        this.webcamStream = await navigator.mediaDevices.getUserMedia({audio: true, video: true});
      } catch(err) {
        console.error(err);
        return;
      }
      const localVideo = document.getElementById('local-video') as HTMLVideoElement;
      if (localVideo == null) {
        console.log('localVideo was null');
      }
      else{
        localVideo.srcObject = this.webcamStream;
      } 
    }
    try {
      this.webcamStream.getTracks().forEach(
        track => {
          if (this.myPeerConnection != null &&
            this.webcamStream != null) {
            
            this.myPeerConnection.addTrack(track, this.webcamStream);    
          } else {
            console.error('peer connection or webcam stream was null');
          }       
        }
      );
    } catch(err) {
      console.error(err);
    }
    await this.myPeerConnection.setLocalDescription(await this.myPeerConnection.createAnswer());
  
    this.sendToServer({
      type: 'video-answer',
      sdp: this.myPeerConnection.localDescription,
    });
  }
  private async handleVideoAnswerMsg(msg: VideoOffer) {
    if (msg.sdp == null) {
      console.error('sdp was null');
      return;
    }
    if (this.myPeerConnection == null) {
      console.error('peer connection was null');
      return;
    }    
    const remoteDescription = new RTCSessionDescription(msg.sdp);
    await this.myPeerConnection.setRemoteDescription(remoteDescription)
      .catch(err => console.error(err));
  }
  private async handleNewICECandidateMsg(msg: Candidate) {
    if (msg.candidate == null) {
      console.error('candidate was null');
      return;
    }
    if (this.myPeerConnection == null) {
      console.error('peer connection was null');
      return;
    }
    const candidate = new RTCIceCandidate(msg.candidate);  
    console.log(`*** Adding received ICE candidate: ${JSON.stringify(candidate)}`);
    try {
      await this.myPeerConnection.addIceCandidate(candidate)
    } catch(err) {
      console.error(err);
    }
  }
}
