WooCommerce + Amazon S3 using website bandwidth

  • SOLVED

I'm using WooCommerce with the Amazon S3 plugin to serve digital downloads. It seems that WooCommerce generates a download link that proxies the connection to Amazon S3 and serves the file through the store's own URL, in order to mask the URL of the file the customer is downloading.

This is causing a problem: although customer downloads are served from Amazon S3, they count against my *hosting* account's bandwidth (750+ GB per month), causing outages when the bandwidth limit is reached.

Naturally the obvious solution is to upgrade the hosting account, but I wondered if there's something I can change to still have masked/secure URLs without my hosting account's bandwidth being consumed by files served from S3?

Answers (2)

2017-07-24

Francisco Javier Carazo Gil answers:

Dan,

Sorry, but if the files are hosted on Amazon S3, then whether or not we change the URL or do anything else, the bandwidth should be counted as consumed by S3.

2017-07-24

Kyle answers:

Dan,

There are quite a few Woo + S3 plugins. Can you confirm you are using the first-party plugin? https://docs.woocommerce.com/document/amazon-s3-storage/


Dan Davies comments:

Yup, that's the one I'm using :-)


Kyle comments:

There are three file download methods for WooCommerce digital products: "redirect only", "force downloads", and "x-accel-redirect/x-sendfile". Have you tried all three, or is there a reason you can only use the one you have selected?


Kyle comments:

You can see this setting under WooCommerce > Settings > Products > Downloadable Products.


Dan Davies comments:

I have "force download" selected. My hosting account doesn't support mod_xsendfile, but they've advised me that it wouldn't resolve the issue anyway. Unfortunately, I think "redirect" exposes the URL of the file.


Kyle comments:

Double-check that "redirect" doesn't generate a secure URL. I can't view the plugin code, but I'd expect Woo to do this by default.

It should look "encoded" and likely will have an expiration built in (not visible), which is the best you'll be able to do. So someone could hypothetically share the link, but it'd be dead within 2-5 minutes.

If not, I can write that for you, but I'd probably ask to raise the pot a bit to get it done.
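For reference, a secure "redirect" link of the kind described above generally has this shape (all values below are placeholders). The X-Amz-Expires parameter is the built-in expiration, in seconds, so an expiry of 300 gives the roughly five-minute lifetime mentioned:

```
https://BUCKET.s3.amazonaws.com/path/to/file.zip
    ?X-Amz-Algorithm=AWS4-HMAC-SHA256
    &X-Amz-Credential=ACCESS_KEY%2F20170724%2Fus-east-1%2Fs3%2Faws4_request
    &X-Amz-Date=20170724T120000Z
    &X-Amz-Expires=300
    &X-Amz-SignedHeaders=host
    &X-Amz-Signature=...
```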


Dan Davies comments:

I chucked the site into maintenance mode and changed WooCommerce to "redirect", then re-sent a completed-order e-mail, which should generate new download links. The download link generated still includes the store URL.


Kyle comments:

Alright, I started writing the new filter; give me a little while.


Kyle comments:

Here is where I'm at https://www.pastiebin.com/597616293b9c1

You'll notice the access keys and secret keys are not there, nor are the bucket and file path; those you will need to update. The path and bucket can ultimately be pulled from wherever your plugin already stores that information, but without access to it I don't know what they would be.

I haven't used the woocommerce_product_file_download_path filter before, so I'm not 100% sure how it will react.

DO NOT post your API Keys in any responses
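For illustration only, here is a minimal, self-contained sketch of the approach being described: generating a short-lived presigned S3 URL (using the current AWS Signature Version 4 scheme) and returning it through the woocommerce_product_file_download_path filter. This is not the pastebin code itself, and the access key, secret key, region, bucket and object path are all placeholders you would need to replace:

```php
<?php
// Sketch: build a time-limited AWS Signature V4 presigned GET URL for an
// S3 object in plain PHP (no SDK). All credentials and paths are placeholders.

function s3_presigned_url( $access_key, $secret_key, $region, $bucket, $object, $expires = 300 ) {
    $host     = "{$bucket}.s3.{$region}.amazonaws.com";
    $amz_date = gmdate( 'Ymd\THis\Z' );
    $date     = gmdate( 'Ymd' );
    $scope    = "{$date}/{$region}/s3/aws4_request";

    // Canonical URI: each path segment percent-encoded, slashes preserved.
    $uri = '/' . str_replace( '%2F', '/', rawurlencode( ltrim( $object, '/' ) ) );

    // Query parameters, already in the alphabetical order SigV4 requires.
    $params = array(
        'X-Amz-Algorithm'     => 'AWS4-HMAC-SHA256',
        'X-Amz-Credential'    => "{$access_key}/{$scope}",
        'X-Amz-Date'          => $amz_date,
        'X-Amz-Expires'       => (string) $expires,
        'X-Amz-SignedHeaders' => 'host',
    );
    $pairs = array();
    foreach ( $params as $k => $v ) {
        $pairs[] = rawurlencode( $k ) . '=' . rawurlencode( $v );
    }
    $query = implode( '&', $pairs );

    $canonical_request = "GET\n{$uri}\n{$query}\nhost:{$host}\n\nhost\nUNSIGNED-PAYLOAD";
    $string_to_sign    = "AWS4-HMAC-SHA256\n{$amz_date}\n{$scope}\n"
                       . hash( 'sha256', $canonical_request );

    // Derive the signing key via the SigV4 HMAC chain.
    $k_date    = hash_hmac( 'sha256', $date, 'AWS4' . $secret_key, true );
    $k_region  = hash_hmac( 'sha256', $region, $k_date, true );
    $k_service = hash_hmac( 'sha256', 's3', $k_region, true );
    $k_signing = hash_hmac( 'sha256', 'aws4_request', $k_service, true );
    $signature = hash_hmac( 'sha256', $string_to_sign, $k_signing );

    return "https://{$host}{$uri}?{$query}&X-Amz-Signature={$signature}";
}

// In WordPress you would hand the URL back through the filter (guarded here
// so the file also runs outside WordPress). Bucket/object are placeholders;
// pull them from wherever your S3 plugin stores that information.
if ( function_exists( 'add_filter' ) ) {
    add_filter( 'woocommerce_product_file_download_path', function ( $path, $product, $download_id ) {
        return s3_presigned_url( 'ACCESS_KEY', 'SECRET_KEY', 'us-east-1', 'BUCKET', 'path/to/file.zip' );
    }, 10, 3 );
}
```

Because the signed URL points straight at the bucket's hostname, the download bypasses the store's server entirely, which is what keeps the hosting account's bandwidth out of the picture.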


Dan Davies comments:

I've added this, but the download links don't seem to have changed. I switched between "redirect" and "force downloads", which didn't seem to impact the URLs generated.


Dan Davies comments:

Ah:

Warning: ltrim() expects parameter 1 to be string


Kyle comments:

The object path is the path to your file within your S3 bucket, so when you replace that up top, for testing you might just want to put a static string until you dig up the custom field key


Dan Davies comments:

OK, set them as static for now and get this error:

<Error>
<Code>InvalidRequest</Code>
<Message>
The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.
</Message>
</Error>

Not sure how to get the keys for the bucket name and object path, because the download file paths are added to WooCommerce products like this:

[amazon_s3 bucket=BUCKET object=OBJECTPATH]
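For reference, the bucket and object path can be recovered from that shortcode string before signing. Inside WordPress, shortcode_parse_atts() would do this job; the standalone sketch below does the equivalent for the simple unquoted form shown above (the shortcode format is assumed to be exactly as pasted):

```php
<?php
// Sketch: extract bucket and object path from the stored
// "[amazon_s3 bucket=... object=...]" download file path.

function parse_amazon_s3_shortcode( $file_path ) {
    if ( preg_match( '/\[amazon_s3\s+bucket=([^\s\]]+)\s+object=([^\s\]]+)\]/', $file_path, $m ) ) {
        return array( 'bucket' => $m[1], 'object' => $m[2] );
    }
    return null; // Not an amazon_s3 shortcode; leave the path untouched.
}
```

The result could then feed whatever presigning code the filter uses instead of hard-coded static values.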


Kyle comments:

I'm in meetings for the next couple hours but can help afterward. It may be best for you to email me, I might need to jump on the site myself.


Dan Davies comments:

Thanks for your help, man. Let me know your e-mail address.


Kyle comments:

ktrusak at gmail dot com