Thursday, 25 April 2019

Cheapest Solution For FTP Over AWS Cloud

Evolving software tools continually help businesses keep pace with the increasing demands of the market. Apart from a secure working mechanism, it is the cherry on top if the cost of such software is low enough. In this post we will go through one such file upload/download service provided by AWS, which is secure and among the cheapest options compared to other cloud technologies.

FTP (File Transfer Protocol) is a fast and handy way to transfer small and large files over the Internet. At some point, we may have configured an FTP server backed by block storage, NAS, or a SAN. However, these kinds of backend storage options require infrastructure support and can also cost a fair amount of time and money.

Why S3 FTP?
The Amazon S3 service is reliable and has a user-friendly interface. Here are some Amazon S3 highlights shared during the last edition of re:Invent:

  • Amazon S3 offers an infrastructure that’s “designed for durability of 99.999999999% of objects.”
  • Amazon S3 is designed to provide “99.99% availability of objects over a year.”
  • You pay for exactly what you need with no minimum commitments or up-front fees.
  • With Amazon S3, there are no limits to how much data you can store or when you can access it.

S3 FTP : Implementation

1. Using S3 : Object Storage As Filesystem :
Create an S3 bucket that will be used as the filesystem; this can be done through the AWS console or the API.
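If you prefer the command line, a minimal sketch using the AWS CLI is shown below; the bucket name my-s3-ftp-bucket and the region are placeholders (bucket names must be globally unique):

  • aws s3 mb s3://my-s3-ftp-bucket --region us-east-1
  • aws s3 ls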

2. IAM Policy And Role :
Create an IAM policy and role to control access to the previously created S3 bucket, which can also be done through the AWS console or the API.
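For illustration, a minimal sketch of such a policy is given below; the bucket name my-s3-ftp-bucket and the policy name s3-ftp-access are placeholders. Save the JSON as s3-ftp-policy.json, create the policy with the AWS CLI, and attach it to the IAM user or role whose credentials s3fs will use:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:ListBucket"],
          "Resource": "arn:aws:s3:::my-s3-ftp-bucket"
        },
        {
          "Effect": "Allow",
          "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
          "Resource": "arn:aws:s3:::my-s3-ftp-bucket/*"
        }
      ]
    }

  • aws iam create-policy --policy-name s3-ftp-access --policy-document file://s3-ftp-policy.json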

3. FTP Server :
Launch an EC2 instance that will be used to host the FTP service.
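A roughly equivalent AWS CLI sketch is shown below; the AMI ID, key pair, subnet, and security group are placeholders, and the security group must allow inbound FTP traffic (port 21 plus the passive port range configured later):

  • aws ec2 run-instances --image-id ami-0123456789abcdef0 --count 1 --instance-type t2.micro --key-name my-key-pair --security-group-ids sg-0123456789abcdef0 --subnet-id subnet-0123456789abcdef0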

4. Setting Up S3FS On FTP Server :
We will configure S3FS on the FTP server in order to mount the S3 bucket as a filesystem. Follow the steps below to configure it.

Step-1 :- If you are using a new CentOS or Ubuntu instance, update the system first.

  • For CentOS or Red Hat
  •     # yum update -y
  • For Ubuntu
  •     # apt-get update

Step-2 :- Install the dependencies.

  • For CentOS or Red Hat
  •     # sudo yum install automake fuse fuse-devel gcc-c++ git libcurl-devel libxml2-devel make openssl-devel
  • For Ubuntu
  •      # sudo apt-get install automake autotools-dev fuse g++ git libcurl4-gnutls-dev libfuse-dev libssl-dev libxml2-dev make pkg-config

Step-3 :- Clone the s3fs-fuse source code from GitHub.

  • git clone https://github.com/s3fs-fuse/s3fs-fuse.git

Step-4 :- Now navigate to the source code directory, then compile and install it with the following commands:

  • cd s3fs-fuse
  • ./autogen.sh
  • ./configure --prefix=/usr --with-openssl
  • make
  • sudo make install

Step-5 :- Use the below command to check where the s3fs binary has been installed in the OS; if it prints a path, the installation is OK.

  • which s3fs
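With the --prefix=/usr used above, the binary is normally placed at /usr/bin/s3fs; printing the version is another quick sanity check:

  • s3fs --version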

5. Configure FTP User Account And Home Directory :
Create an ftptest user account which we will use to authenticate against our FTP service:

  • sudo adduser ftptest 
  • sudo passwd ftptest

Now create the directory structure for the ftptest user account, which we will later configure within our FTP service and which will be mounted using s3fs:

  • sudo mkdir /home/ftptest/ftp
  • sudo chown nfsnobody:nfsnobody /home/ftptest/ftp
  • sudo chmod a-w /home/ftptest/ftp
  • sudo mkdir /home/ftptest/ftp/files
  • sudo chown ftptest:ftptest /home/ftptest/ftp/files

6. Install And Configure vsftpd On The Server :
Now install and configure our FTP service with the vsftpd package; the installation command is below, and a configuration sketch follows it:

  • sudo yum -y install vsftpd
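On Ubuntu the equivalent is sudo apt-get install -y vsftpd. The configuration itself lives in /etc/vsftpd/vsftpd.conf (or /etc/vsftpd.conf on Ubuntu). A minimal sketch of the settings typically used for this kind of chrooted, passive-mode setup is shown below; the passive port range and pasv_address (the server's public IP) are assumptions you should adjust to your own environment:

    anonymous_enable=NO
    local_enable=YES
    write_enable=YES
    chroot_local_user=YES
    user_sub_token=$USER
    local_root=/home/$USER/ftp
    pasv_enable=YES
    pasv_min_port=40000
    pasv_max_port=40100
    pasv_address=<your-ec2-public-ip>

Then restart the service and enable it at boot:

  • sudo systemctl restart vsftpd
  • sudo systemctl enable vsftpd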

7. Start Up S3FS And Mount The Directory :
We will configure S3FS to mount the S3 bucket using the commands below:

  1. Gather IAM credentials (access key and secret key) with the required S3 bucket access / full S3 access.
  2. Create a file in /etc with the name passwd-s3fs and put the access key and secret key in it, in the format shown below.
    • vi /etc/passwd-s3fs
    •        Your_accesskey:Your_secretkey
  3. Change the permissions of the file.
    • sudo chmod 640 /etc/passwd-s3fs
  4. Mount the bucket on the files directory created in Step 5 (the uid value should match the ftptest user's UID, which you can check with id ftptest):
    • sudo s3fs your_bucketname -o use_cache=/tmp -o allow_other -o uid=1001 -o mp_umask=002 -o multireq_max=5 /home/ftptest/ftp/files
  5. To remount the bucket automatically at boot, add the same command to /etc/rc.local:
    • vi /etc/rc.local
    • sudo s3fs your_bucketname -o use_cache=/tmp -o allow_other -o uid=1001 -o mp_umask=002 -o multireq_max=5 /home/ftptest/ftp/files
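Note that on systemd-based distributions, /etc/rc.local only runs at boot if it is executable; on CentOS/RHEL the actual file is /etc/rc.d/rc.local, so you may also need:

    • sudo chmod +x /etc/rc.d/rc.local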

At last, we can test whether the S3 bucket has been mounted successfully on the desired folder on the server:

  •  df -h

We can also connect and test the whole setup through FileZilla or any other FTP client by uploading/downloading files to AWS S3 via S3FS.
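As an additional quick command-line check (assuming the AWS CLI on your workstation is configured with credentials that can read the bucket), drop a test file into the mounted directory and confirm it appears as an object in S3; the file name test-upload.txt is just an example:

  • sudo -u ftptest touch /home/ftptest/ftp/files/test-upload.txt
  • aws s3 ls s3://your_bucketname/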
 

With this walkthrough we have seen how we can leverage S3FS together with both S3 and FTP to build a file transfer solution! If you have any queries, write to us at support@cloud.in
