
How to post-process user images programmatically with Rails & Amazon S3 (including testing)

The problem

On our platforms, we allow our users to upload their own images for profile pictures. This results, as you might imagine, in a wide variety of image sizes, quality and formats. We display these images in various ways throughout our platforms.

For the most part, we can avoid sizing issues by manually setting sizes on the image tag. But in one very important place — emails — certain servers ignore our styling and display those images at full size: enormous.

We need a way to reformat user images programmatically. Additionally, since we're going to be messing with the images anyhow, we'd like to auto-rotate, adjust levels and colors, and generally make them as nice and as consistent as we can.

Considerations

  • Our images are stored with Amazon's S3 cloud storage. Fortunately, Amazon offers a relatively easy-to-use API for interacting with their services.
  • Because our images are on S3, I thought it would be excellent to have this service as a Lambda function, triggered when a user uploads a photo. Unfortunately I could not, for the life of me, get anything to print in the CloudWatch console (where the logs should appear). After bashing up against this wall for a day, I decided to take it back in-house.
  • We host on Heroku, which offers a free and simple scheduler to run tasks. It's not critical for us to have these images converted immediately upon upload. We can schedule a task that picks up everything new in the last 10 minutes and converts it.

The Worker

What's needed now is a worker we can call as frequently as Heroku will allow us (10 minutes is the shortest interval).

Gathering the right users

First we'll gather all users that have images that need to be converted. We've been storing user images in a specific pattern in our S3 bucket that includes a files folder. We can just search for users whose profile picture URL matches a regex on files:

                User.where(profilePictureUrl: { '$regex': %r(\/files\/) })              

Your mileage may vary here, search-wise: we use a Mongo database.

Of course, we will be using a different pattern for processed images. This will only pick up those who have uploaded new images since the task last ran. We'll loop through each of these users and perform the following.
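As a plain-Ruby illustration of that selection check (the URLs and bucket name here are made up for the example), raw uploads match the files pattern while processed images do not:

```ruby
# Hypothetical profile-picture URLs; only raw uploads live under /files/.
raw_url       = 'https://s3.amazonaws.com/my-bucket/files/12345/photo.jpg'
processed_url = 'https://s3.amazonaws.com/my-bucket-output/12345.png'

needs_processing = ->(url) { %r{/files/}.match?(url) }

puts needs_processing.call(raw_url)       # => true
puts needs_processing.call(processed_url) # => false
```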

Setting up a temporary file

We'll need somewhere to store the image data we are going to manipulate. We can do that with a tmp folder. We'll use this as a holding place for the image we want to upload to the new S3 location. We'll name it as we'd like our final image to be named. We wanted to simplify and standardize images in our system, so we're using the unique user id as the image name:

                @temp_file_location = "./tmp/#{user.id}.png"              

Getting the raw image and saving it locally

Now we'll talk to our S3 bucket and get the user's raw, giant, unformatted image:

                key = URI.parse(user.profilePictureUrl).path.gsub(%r(\A\/), '')
                s3 = Aws::S3::Client.new
                response = s3.get_object(bucket: ENV['AWS_BUCKET'], key: key)

The key code there is taking the URL string that we've saved as the user's profilePictureUrl and chopping off everything that's not the end path to the picture.

For example, http://images.someimages.com/any/12345/image.png would return any/12345/image.png from that code. That's exactly what S3 wants from us to find the image in our bucket. Here's the handy aws-sdk gem working for us with get_object.
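That key-extraction step can be checked in isolation with just the standard library (using the example URL above):

```ruby
require 'uri'

url = 'http://images.someimages.com/any/12345/image.png'

# Take the path component ("/any/12345/image.png") and strip the
# leading slash to get the key S3 expects.
key = URI.parse(url).path.gsub(%r{\A/}, '')

puts key # => "any/12345/image.png"
```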

Now we can call response.body.read to get a blob of an image (blob is the right word, though it's above my pay grade to actually understand how images are sent back and forth across the web). We can write that blob locally in our tmp folder:

                File.open(@temp_file_location, 'wb') { |file| file.write(response.body.read) }              

If we stop here, you'll see you can actually open up that file in your temp folder (with the name you set above — in our case <user>.png ).
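Here's a minimal sketch of that write-then-read round trip, using a stand-in byte string instead of a real S3 response (the user id in the filename is hypothetical):

```ruby
require 'fileutils'

FileUtils.mkdir_p('./tmp')

# Hypothetical user id; in the worker this is user.id.
temp_file_location = './tmp/example-user-id.png'

# Stand-in for response.body.read; the real worker writes the S3 blob here.
blob = 'fake-image-bytes'.b

File.open(temp_file_location, 'wb') { |file| file.write(blob) }

puts File.binread(temp_file_location) == blob # => true
```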

Process the image

Now we've got the image downloaded from Amazon, we can do whatever we want to it! ImageMagick is an amazing tool freely available to everybody.

We used a pared-down version for Rails called MiniMagick. That gem also has a great API that makes things lickety-split easy. We don't even have to do anything special to pick up the image. The @temp_file_location we used earlier to save the image will work fine to bring it to MiniMagick's attention:

                image = MiniMagick::Image.new(@temp_file_location)

Here are the settings for our photos, but there are tons of options to play with:

                image.combine_options do |img|
                  img.resize '300x300>'
                  img.auto_orient
                  img.auto_level
                  img.auto_gamma
                  img.sharpen '0x3'
                  image.format 'png'
                end

combine_options is a handy way to do a bunch of stuff to an image in one block. When it exits, the image is saved again where it was before. (Image formatting can't be done with the img from combine_options.) Now that image file in our temporary folder is all kinds of post-processed!

Upload back to S3 and save as the user's new profile picture

Now all we have to do is set up another connection to S3 and make the upload:

                Aws.config.update(
                  region: ENV['AWS_REGION'],
                  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY'])
                )

                s3 = Aws::S3::Resource.new
                name = File.basename(@temp_file_location)
                bucket = ENV['AWS_BUCKET'] + '-output'
                obj = s3.bucket(bucket).object(name)
                obj.upload_file(@temp_file_location, acl: 'public-read')

By convention with Lambda, automated tasks will send to a new bucket with the old bucket's name plus "-output" appended, so I stuck with that. All formatted user images will be dumped into this bucket. Since we are naming the images by (unique) user ids, we are sure we'll never overwrite one user's picture with another.

We create a new object with the new file's name, in the bucket of our choice, then we upload_file. It has to be public-read if we want it visible without a lot of headache on our clients (you may choose a different security option).

If that final line returns true (which it will, if the upload goes smoothly), we can update our user record:

                new_url = "https://s3.amazonaws.com/#{ENV['AWS_BUCKET']}-output/#{File.basename(@temp_file_location)}"
                user.update(profilePictureUrl: new_url)
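With a stand-in bucket name and temp file path (the worker reads the real bucket from ENV['AWS_BUCKET']), the URL construction looks like this:

```ruby
# Hypothetical values for illustration only.
bucket             = 'my-app-images'
temp_file_location = './tmp/5f1a2b3c.png'

new_url = "https://s3.amazonaws.com/#{bucket}-output/#{File.basename(temp_file_location)}"

puts new_url # => "https://s3.amazonaws.com/my-app-images-output/5f1a2b3c.png"
```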

And that's it! If we run this guy, we'll auto-format and resize all user images in the system. All the original images will be in place in their old pattern (in case anything goes wrong), and all users' links will point to their new, formatted images.

Testing

We couldn't possibly add a new feature to a Rails application without testing, right? Absolutely. Here's what our tests for this look like:

                RSpec.describe Scripts::StandardizeImages, type: :service do
                  let!(:user) { User.make!(:student, profilePictureUrl: 'https://s3.amazonaws.com/files/some_picture.jpg') }

                  before do
                    stub_request(:get, 'https://s3.amazonaws.com/files/some_picture.jpg')
                      .with(
                        headers: {
                          'Accept' => '*/*',
                          'Accept-Encoding' => 'gzip;q=1.0,deflate;q=0.6,identity;q=0.3',
                          'Host' => 's3.amazonaws.com',
                          'User-Agent' => 'Ruby'
                        }
                      )
                      .to_return(status: 200, body: '', headers: {})
                    allow_any_instance_of(MiniMagick::Image).to receive(:combine_options).and_return(true)
                    allow_any_instance_of(Aws::S3::Object).to receive(:upload_file).and_return(true)
                  end

                  describe '.call' do
                    it 'finds all users with non-updated profile pictures, downloads, reformats then uploads new picture' do
                      Scripts::StandardizeImages.call

                      expect(user.reload.profilePictureUrl)
                        .to eq "https://s3.amazonaws.com/#{ENV['AWS_BUCKET']}-output/#{user.to_param}.png"
                    end
                  end
                end

If you look first at the test itself, you'll see we are testing that our user's new profile picture URL was saved correctly. The rest of it we don't so much care about, since we don't actually want our test downloading anything, and we probably don't want to spend the time for our test to be manipulating images.

But of course the code is going to try to talk to Amazon and spin up MiniMagick. Instead, we can stub those calls. Just in case this is new for you, I'll run through this part.

Stubbing calls

If you aren't mocking calls in your tests, you probably ought to start doing that immediately. All that's required is the Webmock gem. You require it in your rails_helper and that's about it.

When your test tries to make a call to an external source, you'll get a message like this (I've hidden private keys and things with …s):

                WebMock::NetConnectNotAllowedError:
                  Real HTTP connections are disabled. Unregistered request: GET https://...

                  You can stub this request with the following snippet:

                  stub_request(:get, "https://...").
                    with(
                      headers: {
                        'Accept'=>'*/*',
                        'Accept-Encoding'=>'',
                        'Authorization'=>...}).
                    to_return(status: 200, body: "", headers: {})

Just copy the stub_request bit and you're well on your way to stubbing glory. You may need to return something in that body, depending on what you are doing with the external API call.

I found it difficult to get this stubbed response to return something my code would see as an image, so I just stubbed the MiniMagick function as well. This works fine because we are not seeing the output in this test anyway. You'll have to manually test that the image is getting the proper formatting.

Alternatively, you can use Aws.config[:s3] = { stub_responses: true } in your test initializer or maybe in your rails_helper to stub all S3 requests.

One final note: Travis CI

Depending on what options you decide to apply to your image, you may find that Travis' version of ImageMagick is not the same as yours. I tried lots of things to get Travis using the same ImageMagick as I was. In the end, I am stubbing the MiniMagick call, so it's a moot point. But beware: if you don't stub that function, you may find your CI failing because it doesn't recognize a newer option (like intensity).

Thanks for reading!




Source: https://www.freecodecamp.org/news/how-to-post-process-user-images-programmatically-with-rails-amazon-s3-including-testing-c72645536b54/
