Rajat Kumar edited this page Apr 14, 2021 · 6 revisions

Storage Drive

A storage drive which you own. This application integrates different cloud storage options on the market, such as AWS Storage and Google Storage, and supports all common operations: store, delete, and view.

Table of Contents

  1. Architecture
  2. Working
  3. Frontend
  4. Backend
  5. Installation

Architecture

  • Authentication Flow: OAuth 2.0 access/refresh token implementation.
  • File Upload: Data streams are used for both upload and download, to reduce the load on the server when a large number of concurrent users access files simultaneously.
  • Database: Mapping of files from Cloud DB to AWS S3.

Working

  • Authentication Flow: Axios interceptors are used on the frontend side to retry the request with the refresh_token when the access_token becomes invalid. JWT has been set up on the server side to issue the tokens via an API.
  • Data Streams: Why did we use data streams? There are multiple libraries out there that use the same underlying concept of streams: piping data from one stream to another without any buffering. But they abstract it away and add a new layer of buffering, which defeats exactly what we want to achieve. With buffering, the file is first loaded into server memory, and only after it is fully loaded can you save it to disk or do anything else with it. That approach is a deal breaker for bigger files, say 50 MB+, because memory on the server is occupied for the entire duration of the upload. Now imagine 10 users uploading 1 GB files: 10 GB of RAM? Since our application serves a large number of concurrent users with a high volume of data, and we want maximum reliability from the server, the best option is to not store intermediate files on the Node.js server at all. Instead, we push the file data onward as soon as we receive it. The file server can be a cloud storage service such as AWS S3. To make this happen, streams are the optimal way of handling the incoming file data, and for this busboy is the only suitable option, as it is the basic library on top of which every other library is built. Follow these articles to learn more.
  1. Choose between Formidable, Busboy, Multer and Multiparty.
  2. Handle large file upload in nodejs.
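The refresh flow behind the Axios interceptor described above can be sketched without any framework. This is a minimal, dependency-free illustration of the idea only; the function and field names are hypothetical, not the project's actual code:

```javascript
// Sketch of the refresh flow behind an Axios response interceptor:
// if a request fails with 401, exchange the refresh_token for a new
// access_token and replay the original request exactly once.
async function requestWithRefresh(doRequest, tokens, refreshTokens) {
  try {
    return await doRequest(tokens.access_token);
  } catch (err) {
    if (err.status !== 401) throw err; // only retry on an expired access token
    const fresh = await refreshTokens(tokens.refresh_token);
    tokens.access_token = fresh.access_token; // keep the new token for later calls
    return doRequest(tokens.access_token);
  }
}
```

In the real app this logic lives in an Axios response interceptor, which replays the failed request's config after refreshing instead of calling a function directly.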

The same technique is used on the frontend side as well: to reduce load on the browser we used StreamSaver.

  • Database: Mapping of files from the cloud database to AWS S3.
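A rough sketch of what such a mapping record might look like (the field names and key format are assumptions, not the project's actual schema): the database stores only metadata plus the S3 object key, while the file bytes live in S3.

```javascript
// Hypothetical shape of a file-mapping record: the database stores
// metadata and the S3 key, while the file contents live only in S3.
function makeFileRecord(ownerId, fileName) {
  const s3Key = `${ownerId}/${Date.now()}-${fileName}`; // unique object key in the bucket
  return {
    ownerId,                // user who uploaded the file
    fileName,               // display name shown in the drive UI
    s3Key,                  // where the bytes actually live in S3
    uploadedAt: new Date(), // upload timestamp
  };
}
```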

Frontend React + Redux + Tailwind

  • OAuth 2.0 used with access/refresh tokens.
  • Used hooks with Redux for state management.
  • Mobile-responsive code.
  • StreamSaver.js used to reduce load on the client side when downloading.
  • Google OAuth.
  • Follow these steps to set up the frontend.

Backend Node + Express + Mongo

  • Object-oriented style followed.
  • Dependency injection used, with code separated into controller / service / database layers.
  • Data streams used to reduce load on the server for both upload and download.
  • Error and log management through winston.
  • Follow these steps to set up the backend.
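The layering above can be sketched with constructor injection. The class and method names here are illustrative, not the project's actual code: the controller depends on the service, the service on the database layer, and each dependency is passed in rather than constructed internally, which lets every layer be tested in isolation:

```javascript
// Database layer: only knows how to persist and fetch records.
class FileRepository {
  constructor(store = new Map()) { this.store = store; }
  save(id, record) { this.store.set(id, record); return record; }
  find(id) { return this.store.get(id) || null; }
}

// Service layer: business rules, no HTTP and no raw storage details.
class FileService {
  constructor(repository) { this.repository = repository; } // injected
  upload(id, name) {
    if (!name) throw new Error('file name required');
    return this.repository.save(id, { id, name });
  }
}

// Controller layer: translates HTTP-ish requests into service calls.
class FileController {
  constructor(service) { this.service = service; } // injected
  handleUpload(req) {
    const record = this.service.upload(req.params.id, req.body.name);
    return { status: 201, body: record };
  }
}
```

The wiring happens once at startup, e.g. `new FileController(new FileService(new FileRepository()))`, so swapping the in-memory store for a real MongoDB-backed repository touches only the composition root.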

Installation

Follow these steps to set up a local development server.

Thank You 🤩