Introduction
S3 stands for Simple Storage Service, Amazon's primary storage offering in the AWS Cloud. Developers can use S3 to store and retrieve files as objects for applications running in the cloud, and Amazon provides SDKs for most popular programming languages that let us add S3 storage capabilities to our apps.
In this article, we'll look at how to upload and retrieve files from S3 using the AWS SDK in ASP.NET Core. To demonstrate uploads, we'll use a basic form that lets users upload a file.
To keep things simple, we'll reuse the components we previously built for a simple form with local file upload in ASP.NET Core.
Getting Started – Creating IAM Policy for S3 Bucket Actions
AWS S3 is a fully managed object storage service offering high availability and low latency. Files uploaded to S3 are stored as objects and placed in containers called buckets. In the AWS Cloud, we use IAM policies and permissions to control who can access what.
This ensures that applications and their components get only the access they actually need. In our scenario, we'll set up the appropriate access rules for our S3 bucket. Let's assume we've already created an S3 bucket and we want to write objects to a specific folder inside it.
The IAM Policy for this looks like below:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListObjectsInBucket",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::myBucket"
      ]
    },
    {
      "Sid": "AllObjectActions",
      "Effect": "Allow",
      "Action": "s3:*Object",
      "Resource": [
        "arn:aws:s3:::myBucket/assets/*"
      ]
    }
  ]
}
What does this policy do? It allows the principal to list the contents of the bucket "myBucket" and to perform all object-level operations (the "s3:*Object" wildcard matches actions such as s3:GetObject, s3:PutObject, and s3:DeleteObject) under the path "myBucket/assets/".
Keep in mind that there is no real concept of folders in an S3 bucket. Although the Console lets you organize objects into folders, every folder and subfolder name is simply part of the object's fully qualified key; for example, an object uploaded to the assets folder is stored under a key like assets/sample.pdf.
When we deploy our application to an AWS compute service such as an AWS Lambda function, we'll attach this policy to the Lambda execution role.
To test on our local machines, we'll use the AWS CLI, which authenticates with a developer profile and lets us access cloud resources while working on our development machines.
We'll also use the AWS Toolkit extension, which makes AWS development in the IDE even easier; toolkits are available for both Visual Studio 2019 and Visual Studio Code.
Once we've installed the AWS CLI, we can set up the developer profile that applications use while running on the machine.
For this, we need AWS security credentials, which are a combination of an Access Key ID and a Secret Access Key. If you have an AWS account, you can generate these under IAM -> Security Credentials.
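For example, running aws configure sets these up interactively (the profile name and key values below are placeholders):

> aws configure --profile dev-profile
AWS Access Key ID [None]: <your-access-key-id>
AWS Secret Access Key [None]: <your-secret-access-key>
Default region name [None]: us-west-2
Default output format [None]: json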
Working with AWS S3 .NET Core Example
To upload a file to an AWS S3 bucket from ASP.NET Core, we follow these three steps:
- Installing the AWSSDK.S3 NuGet package
- Implementing the IStorageService interface for S3 Upload
- Wiring the implementation – Upload View Form and the Controller
As mentioned before, we're going to reuse our previous implementation of a local file upload, where we designed an interface and implemented it to store uploaded files in a local directory.
The interface looks like this:
namespace ReaderStore.WebApp.Providers.Services
{
    public interface IStorageService
    {
        Task<string> AddItem(IFormFile file, string readerName);
        Task<byte[]> GetItem(string objectKey);
        string GeneratePreSignedURL(string objectKey);
    }
}
We’ll implement this interface and provide functionality to upload to the aforementioned S3 bucket instead of the local server directory.
namespace ReaderStore.WebApp.Providers.Services
{
    public class S3StorageService : IStorageService
    {
        public async Task<string> AddItem(IFormFile file, string readerName)
        {
            // implementation for S3 bucket
        }
    }
}
Installing the AWSSDK.S3 package
First, we need to install the AWS SDK which contains the necessary libraries to make this happen.
> dotnet add package AWSSDK.S3 --version 3.5.5.2
Once this is installed, let's fill in our S3StorageService.AddItem() method. We receive the uploaded file and a readerName, which we'll use as a subdirectory under myBucket/assets/.
Implementing IStorageService for S3 Upload
First, we'll instantiate an AmazonS3Client, which handles the upload operation to the S3 bucket. We'll also declare some constants for the bucket name, the folder prefix, and the presigned URL duration.
private readonly AmazonS3Client s3Client;
private const string BUCKET_NAME = "myBucket";
private const string FOLDER_NAME = "assets";
private const double DURATION = 24;

public S3StorageService()
{
    s3Client = new AmazonS3Client(RegionEndpoint.USWest2);
}
While creating the client, we need to provide the Region in which the S3 bucket was created and is available.
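As a side note, instead of hard-coding the region, we could register the client through the DI container using the AWSSDK.Extensions.NETCore.Setup package, which reads the profile and region from configuration. A minimal sketch of that alternative, assuming the usual Startup.ConfigureServices:

// requires the AWSSDK.Extensions.NETCore.Setup NuGet package;
// reads profile/region from appsettings.json or the environment
services.AddDefaultAWSOptions(Configuration.GetAWSOptions());
services.AddAWSService<IAmazonS3>();

S3StorageService would then accept an IAmazonS3 through its constructor rather than creating the client itself. For this article, we'll stick with the simple constructor above.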
Inside the AddItem method, we'll open the uploaded file as a stream and create a PutObjectRequest, which represents an object to be put into the S3 bucket.
public async Task<string> AddItem(IFormFile file, string readerName)
{
    string fileName = file.FileName;
    // the object key is the full "path" of the object inside the bucket
    string objectKey = $"{FOLDER_NAME}/{readerName}/{fileName}";

    using (Stream fileToUpload = file.OpenReadStream())
    {
        var putObjectRequest = new PutObjectRequest
        {
            BucketName = BUCKET_NAME,
            Key = objectKey,
            InputStream = fileToUpload,
            ContentType = file.ContentType
        };

        var response = await s3Client.PutObjectAsync(putObjectRequest);
        return GeneratePreSignedURL(objectKey);
    }
}
Along with the file stream, we provide the PutObjectRequest with metadata such as the file's ContentType and the object key under which it will be stored inside the bucket. PutObjectAsync() processes this PutObjectRequest and commits the file to storage, returning a PutObjectResponse object that contains the response data.
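If we want to verify the outcome, the PutObjectResponse returned above exposes the HTTP status code of the operation. A minimal sketch of a defensive check (this error handling is illustrative, not from the original implementation):

var response = await s3Client.PutObjectAsync(putObjectRequest);

// a defensive check; the SDK normally surfaces failures by
// throwing AmazonS3Exception rather than returning an error code
if (response.HttpStatusCode != System.Net.HttpStatusCode.OK)
{
    throw new InvalidOperationException(
        $"Upload of '{putObjectRequest.Key}' failed with status {response.HttpStatusCode}");
}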
But wouldn't it be better to get back a URL for the file that has just been written to the S3 bucket? For this, we make another call to the S3 client to generate a presigned URL for the uploaded object, which is what the GeneratePreSignedURL() method does.
public string GeneratePreSignedURL(string objectKey)
{
    var request = new GetPreSignedUrlRequest
    {
        BucketName = BUCKET_NAME,
        Key = objectKey,
        Verb = HttpVerb.GET,
        Expires = DateTime.UtcNow.AddHours(DURATION)
    };

    string url = s3Client.GetPreSignedURL(request);
    return url;
}
We can also fetch an object from Amazon S3 using the SDK via the GetObjectAsync method, passing the bucket name and the key of the object we want to read. The result is a GetObjectResponse, from which we can read the bytes of the file. The method implementation looks like this:
public async Task<byte[]> GetItem(string keyName)
{
    GetObjectResponse response = await s3Client.GetObjectAsync(BUCKET_NAME, keyName);
    using (Stream responseStream = response.ResponseStream)
    using (var memoryStream = new MemoryStream())
    {
        // copy the object's contents out of the response stream
        responseStream.CopyTo(memoryStream);
        return memoryStream.ToArray();
    }
}
Although returning the raw bytes works, it is a best practice to return a presigned URL for an object key instead, because it gives the requesting user time-bound access to the object, after which the URL becomes invalid. This is advantageous from a security standpoint.
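Anyone holding the presigned URL can then download the object with a plain HTTP GET until it expires, with no AWS credentials required. A minimal sketch of a consumer, assuming presignedUrl holds the value returned by GeneratePreSignedURL() and the output file name is arbitrary:

using var httpClient = new HttpClient();

// the signature and expiry are embedded as query parameters,
// so no Authorization header or AWS credentials are needed
byte[] fileBytes = await httpClient.GetByteArrayAsync(presignedUrl);
await File.WriteAllBytesAsync("downloaded-work.pdf", fileBytes);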
The complete class looks like below –
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;
using Microsoft.AspNetCore.Http;

namespace ReaderStore.WebApp.Providers.Services
{
    public class S3StorageService : IStorageService
    {
        private readonly AmazonS3Client s3Client;
        private const string BUCKET_NAME = "myBucket";
        private const string FOLDER_NAME = "assets";
        private const double DURATION = 24;

        public S3StorageService()
        {
            s3Client = new AmazonS3Client(RegionEndpoint.USWest2);
        }

        public async Task<string> AddItem(IFormFile file, string readerName)
        {
            string fileName = file.FileName;
            string objectKey = $"{FOLDER_NAME}/{readerName}/{fileName}";

            using (Stream fileToUpload = file.OpenReadStream())
            {
                var putObjectRequest = new PutObjectRequest
                {
                    BucketName = BUCKET_NAME,
                    Key = objectKey,
                    InputStream = fileToUpload,
                    ContentType = file.ContentType
                };

                var response = await s3Client.PutObjectAsync(putObjectRequest);
                return GeneratePreSignedURL(objectKey);
            }
        }

        public string GeneratePreSignedURL(string objectKey)
        {
            var request = new GetPreSignedUrlRequest
            {
                BucketName = BUCKET_NAME,
                Key = objectKey,
                Verb = HttpVerb.GET,
                Expires = DateTime.UtcNow.AddHours(DURATION)
            };

            string url = s3Client.GetPreSignedURL(request);
            return url;
        }

        public async Task<byte[]> GetItem(string keyName)
        {
            GetObjectResponse response = await s3Client.GetObjectAsync(BUCKET_NAME, keyName);
            using (Stream responseStream = response.ResponseStream)
            using (var memoryStream = new MemoryStream())
            {
                responseStream.CopyTo(memoryStream);
                return memoryStream.ToArray();
            }
        }
    }
}
Finally, we need to put this service to use in our file upload View and Controller, which completes the overall picture. First, we'll register it as a dependency inside our Startup class, so it gets injected whenever an instance of IStorageService is requested.
services.AddSingleton<IStorageService, S3StorageService>();
The other components don't need to change, because they already work against the IStorageService abstraction, for which we've now provided an S3-backed implementation.
Integration – Upload View Form and the Controller
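The upload form is the same one from the local file upload article; for reference, here is a minimal Razor sketch of such a form (the property names match the ReaderRequestModel used below, while the action name is hypothetical):

<form asp-action="Create" method="post" enctype="multipart/form-data">
    <input asp-for="Name" />
    <input asp-for="EmailAddress" />
    <!-- binds to the IFormFile property on the request model -->
    <input asp-for="Work" type="file" />
    <button type="submit">Submit</button>
</form>

The controller's AddReader helper then consumes the posted model: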
private async Task<ReaderResponseModel> AddReader(ReaderRequestModel model)
{
    var res = new ReaderResponseModel();

    // check if model is not empty
    if (model != null)
    {
        // create new entity
        var reader = new Reader();

        // add non-file attributes
        reader.Name = model.Name;
        reader.EmailAddress = model.EmailAddress;

        // check if any file is uploaded
        var work = model.Work;
        if (work != null)
        {
            // calls the S3 implementation of the IStorageService,
            // which writes the uploaded file and returns a presigned URL
            // of the asset stored under the S3 bucket
            var fRes = await _storage.AddItem(work, model.Name);

            // assign the generated file path to the
            // WorkPath property of the entity
            reader.WorkPath = fRes;
        }

        // add the created entity to the datastore using a
        // repository class IReadersRepository, which is
        // registered as a scoped service in Startup.cs
        var created = _repo.AddReader(reader);

        // set the success flag and generated details
        // to show in the View
        res.IsSuccess = true;
        res.ReaderId = created.Id.ToString();
        res.WorkPath = created.WorkPath;
        res.RedirectTo = Url.Action("Index");
    }

    // return the model back to the view
    // with the added changes and flags
    return res;
}
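For completeness, here is a minimal sketch of how these dependencies could reach the controller via constructor injection (the controller name is assumed; IReadersRepository comes from the comments above):

public class ReadersController : Controller
{
    private readonly IStorageService _storage;
    private readonly IReadersRepository _repo;

    // both services are resolved by the DI container
    // configured in Startup.ConfigureServices
    public ReadersController(IStorageService storage, IReadersRepository repo)
    {
        _storage = storage;
        _repo = repo;
    }
}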
Conclusion
We've implemented uploading and retrieving a simple file from an Amazon S3 bucket using the AWSSDK.S3 library available for ASP.NET Core. On top of that, the IFormFile abstraction provided by ASP.NET Core makes file upload even simpler from the client standpoint.
We've also done it in a way that doesn't depend on the kind of file we're putting into the S3 bucket; it just works.
While this setup works fine when the application puts all files into a single location, storing files based on the logged-in user's session, as user-centric and SaaS applications do, requires tweaking this implementation to accommodate a variable folder path inside the bucket. The generic IAM policy we created for this purpose no longer fits.
We would need a user-session-based IAM role, which is the essence of a role-based resource access model. We'll dig into this design in another article.