How to upload to Amazon S3 using ASP.NET Core

Introduction – What is Amazon S3?

S3 stands for Simple Storage Service, the flagship storage service offered by Amazon as part of its AWS Cloud. Developers can use S3 to manage static files for applications running on the AWS Cloud stack.

AWS provides SDKs for all the popular programming stacks, which we can use to integrate the S3 storage service into our applications.

In this article, let’s look at how we can upload a simple file to S3 by implementing a simple form containing a file upload, all using ASP.NET Core.

To keep things simple, we shall reuse the components we previously built for a simple form with local file upload in ASP.NET Core.

Setting things up

Before jumping into the implementation, let’s try to understand how things work from a design perspective. Unlike Azure Storage, which is a pretty straightforward affair, AWS brings in the concept of IAM policies and resource access permissions for fine-grained access management.

This helps in building tightly secured applications and components that have exactly the level of access they need.

In our case, we don’t need to wade into all of this complexity up front; instead, let’s create the right access policy for our S3 storage space, also called an S3 bucket, for our application to use.

Let’s assume we already have an S3 bucket created, and our goal is to write to a specific folder inside that bucket. The IAM policy for this looks like the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListObjectsInBucket",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::myBucket"
            ]
        },
        {
            "Sid": "AllObjectActions",
            "Effect": "Allow",
            "Action": "s3:*Object",
            "Resource": [
                "arn:aws:s3:::myBucket/assets/*"
            ]
        }
    ]
}

What does this policy do?
It allows the bearer to list the contents of the bucket “myBucket” and to perform all object-level operations under the path “myBucket/assets/” (the wildcard s3:*Object covers actions such as s3:GetObject, s3:PutObject and s3:DeleteObject).

We’ll attach this policy to the execution role of the Lambda function under which our ASP.NET Core application will be deployed.
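
For example, if the policy above is saved locally as s3-assets-policy.json, it could be attached as an inline policy on the function's execution role using the AWS CLI; the role and policy names here are placeholders:

> aws iam put-role-policy --role-name my-lambda-role --policy-name s3-assets-access --policy-document file://s3-assets-policy.json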

But that covers the application running in a release environment. How do we debug while developing? For this, we’ll make use of the AWS CLI, which stores a developer credential profile and lets us access cloud resources while working on our development machines.

We’ll also use the AWS Toolkit for Visual Studio extension, which makes development in Visual Studio even easier. It is available for both Visual Studio 2019 and Visual Studio Code.

Once we’ve installed the AWS CLI, we can set up the developer profile that applications use while running on the machine. For this, we require AWS security credentials, which are a combination of an Access Key ID and a Secret Access Key. If you have an AWS account, you can generate these under IAM -> Security Credentials.
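
For reference, the profile setup itself is a single command that prompts for those credentials (the values shown are placeholders):

> aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: ****************************************
Default region name [None]: us-west-2
Default output format [None]: json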

We’re now done with the design and development setup part. Let’s switch to the actual code. Long story short, we’ll get this done in three steps:

  1. Installing the AWSSDK.S3 NuGet package
  2. Implementing the IStorageService interface for S3 Upload
  3. Wiring the implementation – Upload View Form and the Controller

As mentioned before, we’re going to reuse our previous implementation of a local file upload. We previously designed an interface and implemented it to upload files and store them in a local directory.

The interface looks like this:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

namespace ReaderStore.WebApp.Providers.Services
{
    public interface IStorageService
    {
        Task<string> AddItem(IFormFile file, string readerName);
    }
}

We’ll implement this interface and provide functionality to upload to the aforementioned S3 bucket instead of the local server directory.

using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

namespace ReaderStore.WebApp.Providers.Services
{
    public class S3StorageService : IStorageService
    {
        public async Task<string> AddItem(IFormFile file, string readerName)
        {
            // implementation for S3 bucket goes here
        }
    }
}

1. Installing the AWSSDK.S3 package

First, we need to install the AWS SDK package, which contains the necessary libraries to make this happen.

> dotnet add package AWSSDK.S3 --version 3.5.5.2

Once this is installed, let’s fill in our S3StorageService.AddItem() method. We’ll get the uploaded file and a readerName, which we’ll use as a subdirectory inside myBucket/assets/.

2. Implementing IStorageService for S3 Upload

First, we’ll instantiate an AmazonS3Client, which handles the upload operation to the S3 bucket. We’ll also declare some constants for the bucket name and the folder path.

private readonly AmazonS3Client s3Client;
private const string BUCKET_NAME = "myBucket";
private const string FOLDER_NAME = "assets";
// validity of the generated pre-signed URL, in hours
private const double DURATION = 24;

public S3StorageService()
{
    // the region must match the one where the bucket lives
    s3Client = new AmazonS3Client(RegionEndpoint.USWest2);
}

While creating the S3Client, we need to provide the Region where the S3 bucket was created and is available.
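
As an aside, in a real application we'd likely read the region (and the bucket name) from configuration rather than hardcoding them. A minimal sketch, assuming an "AWS:Region" key in appsettings.json:

// sketch: alternative constructor reading the region from configuration;
// IConfiguration comes from Microsoft.Extensions.Configuration, and the
// assumed appsettings.json entry is "AWS": { "Region": "us-west-2" }
public S3StorageService(IConfiguration configuration)
{
    var region = RegionEndpoint.GetBySystemName(configuration["AWS:Region"]);
    s3Client = new AmazonS3Client(region);
}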

Inside the AddItem method, we’ll first open the uploaded file as a stream and then create a PutObjectRequest, which represents an object to be put into the S3 bucket.

    public async Task<string> AddItem(IFormFile file, string readerName)
    {
        string fileName = file.FileName;
        string objectKey = $"{FOLDER_NAME}/{readerName}/{fileName}";

        using (Stream fileToUpload = file.OpenReadStream())
        {
            var putObjectRequest = new PutObjectRequest
            {
                BucketName = BUCKET_NAME,
                Key = objectKey,
                InputStream = fileToUpload,
                ContentType = file.ContentType
            };

            var response = await s3Client.PutObjectAsync(putObjectRequest);
            return GeneratePreSignedURL(objectKey);
        }
    }

Along with the file stream, we also provide the PutObjectRequest with metadata such as the file’s ContentType and the key, that is, the path and filename under which it shall be stored inside the bucket. PutObjectAsync() processes this PutObjectRequest and commits the file to storage. It returns a PutObjectResponse object, which contains the response data.
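
The snippet above doesn't inspect that response. If we want to fail fast on an unsuccessful upload, one option (a small addition, not part of the flow above) is to check its status code before generating the URL; note that the SDK also throws AmazonS3Exception for most failures:

var response = await s3Client.PutObjectAsync(putObjectRequest);

// PutObjectResponse exposes the HTTP status of the underlying S3 call
if (response.HttpStatusCode != System.Net.HttpStatusCode.OK)
{
    throw new InvalidOperationException(
        $"S3 upload failed with status {response.HttpStatusCode}");
}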

But it’d be better if we also got back a URI for the file that has been written to the S3 bucket, right? For this, we ask the S3Client to generate a pre-signed URL for the uploaded object, which is what the GeneratePreSignedURL() method returns.

private string GeneratePreSignedURL(string objectKey)
{
    var request = new GetPreSignedUrlRequest
    {
        BucketName = BUCKET_NAME,
        Key = objectKey,
        Verb = HttpVerb.GET,
        Expires = DateTime.UtcNow.AddHours(DURATION)
    };

    string url = s3Client.GetPreSignedURL(request);
    return url;
}

The complete class looks like this:

using System;
using System.IO;
using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;
using Microsoft.AspNetCore.Http;

namespace ReaderStore.WebApp.Providers.Services
{
    public class S3StorageService : IStorageService
    {
        private readonly AmazonS3Client s3Client;
        private const string BUCKET_NAME = "myBucket";
        private const string FOLDER_NAME = "assets";
        // validity of the generated pre-signed URL, in hours
        private const double DURATION = 24;

        public S3StorageService()
        {
            // the region must match the one where the bucket lives
            s3Client = new AmazonS3Client(RegionEndpoint.USWest2);
        }

        public async Task<string> AddItem(IFormFile file, string readerName)
        {
            string fileName = file.FileName;
            string objectKey = $"{FOLDER_NAME}/{readerName}/{fileName}";

            using (Stream fileToUpload = file.OpenReadStream())
            {
                var putObjectRequest = new PutObjectRequest
                {
                    BucketName = BUCKET_NAME,
                    Key = objectKey,
                    InputStream = fileToUpload,
                    ContentType = file.ContentType
                };

                var response = await s3Client.PutObjectAsync(putObjectRequest);
                return GeneratePreSignedURL(objectKey);
            }
        }

        private string GeneratePreSignedURL(string objectKey)
        {
            var request = new GetPreSignedUrlRequest
            {
                BucketName = BUCKET_NAME,
                Key = objectKey,
                Verb = HttpVerb.GET,
                Expires = DateTime.UtcNow.AddHours(DURATION)
            };

            return s3Client.GetPreSignedURL(request);
        }
    }
}

Finally, we need to put this service to use in our file upload View and Controller, which completes the overall picture. First, we’ll register it as a dependency inside our Startup class so that it is injected wherever an IStorageService is requested.

services.AddSingleton<IStorageService, S3StorageService>();

The other components need not change, because they already work against the IStorageService abstraction, for which we’ve now provided an S3-backed implementation.

3. Wiring the implementation – Upload View Form and the Controller
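
For context, the storage service reaches the controller through constructor injection. A minimal sketch; the controller name ReadersController is assumed here, while IReadersRepository is the repository mentioned in the comments below:

public class ReadersController : Controller
{
    private readonly IStorageService _storage;
    private readonly IReadersRepository _repo;

    // both dependencies are resolved by the DI container
    // based on the registrations in Startup.ConfigureServices
    public ReadersController(IStorageService storage, IReadersRepository repo)
    {
        _storage = storage;
        _repo = repo;
    }
}

The AddReader handler below then hands the uploaded file to the storage service and saves the returned pre-signed URL on the entity: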

private async Task<ReaderResponseModel> AddReader(ReaderRequestModel model)
{
    var res = new ReaderResponseModel();

    // magic happens here
    // check if model is not empty
    if (model != null)
    {
        // create new entity
        var reader = new Reader();

        // add non-file attributes
        reader.Name = model.Name;
        reader.EmailAddress = model.EmailAddress;

        // check if any file is uploaded
        var work = model.Work;
        if (work != null)
        {
            // calls the S3 implementation of the IStorageService
            // writes the uploaded file and returns a presigned url
            // of the asset stored under S3 bucket
            var fRes = await _storage.AddItem(work, model.Name);
            
            // assign the generated filePath to the 
            // workPath property in the entity
            reader.WorkPath = fRes; 
        }

        // add the created entity to the datastore
        // using a Repository class IReadersRepository
        // which is registered as a Scoped Service
        // in Startup.cs
        var created = _repo.AddReader(reader);

        // Set the Success flag and generated details
        // to show in the View 
        res.IsSuccess = true;
        res.ReaderId = created.Id.ToString();
        res.WorkPath = created.WorkPath;
        res.RedirectTo = Url.Action("Index");
    }

    // return the model back to view
    // with added changes and flags
    return res;
}

Conclusion – Taking it further

We’ve seen and implemented uploading a simple file to an Amazon S3 bucket using the AWSSDK.S3 library available for ASP.NET Core. On top of that, the IFormFile interface provided by ASP.NET Core makes file upload even simpler from the client’s standpoint.

We’ve done it in a way that doesn’t depend on what kind of file we’re putting into the S3 bucket; it just works.

While this setup works just fine when the application puts all files into a single location, if we need to place files based on the logged-in user’s session, as is the case for user-centric and SaaS applications, we’ll need to tweak this implementation to accommodate a variable folder path inside the bucket. The generic IAM policy we created no longer fits.
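
On the application side, that tweak is small. A rough sketch, where userId is hypothetical (for example, taken from the authenticated session):

// hypothetical: scope the object key to the logged-in user
string objectKey = $"{FOLDER_NAME}/{userId}/{readerName}/{fileName}";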

We would need to go for a user-session-based IAM role, which is the essence of a role-based resource access model. We shall dig into this design in another article.
