How to use with cloud storage like Cloudflare Stream or S3 #185
Replies: 14 comments 18 replies
-
S3 support is coming, apparently.
-
S3 support is on the way; in the meantime, s3fs can be used to mount an S3 bucket as a local filesystem. It takes only a few commands and it works, though one could argue it's a hacky way to use S3. Proper S3 support should land sometime in the next three months.
-
I currently have a working implementation of S3 (using Storj.io) with MediaCMS for a customer. All media/static files are served from Storj and not from my server, which should help a good bit with CPU/memory constraints. Storj allows you to map a subdomain to their services, so https://media.example.com/static/ and https://media.example.com/site-id/ both work well, and I am able to share static resources among sites. I am actually testing a multi-site setup, about 22 sites or so to be exact, all running MediaCMS via docker-compose with a shared PostgreSQL DB, shared Redis (I think), and remote workers. I have noticed the code is somewhat hit or miss and is not yet friendly on bandwidth to/from external storage or remote workers. I was able to get uploads to work faster by combining the chunks locally and then uploading to S3. I'm currently working on remote workers, where the file is obtained from S3, chunked, manipulated, and all modifications/variants are uploaded back to S3.
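A minimal sketch of the "combine the chunks locally, then upload" idea described above (the function name and the boto3 call are illustrative assumptions, not MediaCMS code):

```python
import shutil

def combine_chunks(chunk_paths, out_path):
    """Concatenate uploaded chunk files into a single file on local disk,
    so the result can be pushed to S3 in one request instead of many."""
    with open(out_path, "wb") as out:
        for chunk in chunk_paths:
            with open(chunk, "rb") as part:
                shutil.copyfileobj(part, out)
    return out_path

# After combining locally, a single upload avoids per-chunk S3 round trips,
# e.g. with boto3 (bucket and key names are placeholders):
#   import boto3
#   boto3.client("s3").upload_file(out_path, "my-bucket", "media/video.mp4")
```

The point is that only the finished file crosses the wire to object storage; the chunk bookkeeping stays on local disk.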
-
Any details on using s3fs or any other mounting approach that works? I tried mounting with rclone and goofys and neither worked. I'm not a developer, so I'm not sure what to do to get it working; any guidance would be appreciated. Trying to mount Backblaze or S3 at least. Also, any update on S3 integration? I see that the uploader used by MediaCMS has released some S3 capabilities. Thanks.
-
I got s3fs working, but it's really slow compared to a regular upload. Not sure why? For others wanting to try mounting, I followed this guide: https://help.backblaze.com/hc/en-us/articles/360047773653-Using-S3FS-with-B2

sudo s3fs xxxxxx /home/mediacms.io/mediacms/media_files -o passwd_file=/etc/passwd-s3fs -o url=https://s3.us-west-001.backblazeb2.com -o allow_other
-
This project looks interesting - https://github.com/juicedata/juicefs
-
I decided not to use MediaCMS and coded my own using uppy.io, a tusd server, and Backblaze. These could be integrated with MediaCMS, and they support both local and CDN storage.
-
Hi guys. I just want to know the current status of the S3 video storage feature.
-
@mgogoulos Any plans on this?
-
I finally managed to use S3 and CloudFront with MediaCMS. It is not a final solution, but I can still use CloudFront to play HLS media.
-
I did a PR a long time ago that was messy, still full of the print statements I was using to determine what was happening and when.
The upload process isn't ready for S3 buckets. It needs to do local processing and then dump the final product into S3; otherwise your S3 bandwidth will be double if not triple, because it has to pull the files again to re-encode them.
If it's still there and someone wants to clean it up, I had it functional using Django's S3 support, with no filesystem mount needed.
On Thu, Jul 20, 2023, 2:16 PM Ricardo Teixeira ***@***.***> wrote:
I finally managed to use S3 and CloudFront with MediaCMS. It is not a final solution, but I can still use CloudFront to play HLS media.
I've mounted the media_files directory to an S3 bucket using s3fs and created a CloudFront distribution with the bucket as origin. With that, I set MEDIA_URL to the CloudFront URL.
Some images are not working properly yet, but almost everything is good.
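The "Django's S3 support" mentioned above usually means the django-storages package. A minimal settings sketch might look like this (the bucket name, region, and CloudFront domain are placeholder assumptions, not MediaCMS defaults):

```python
# settings.py - hedged sketch using django-storages' S3Boto3Storage backend;
# every value below is an illustrative placeholder.
INSTALLED_APPS = [
    # ...existing MediaCMS apps...
    "storages",
]

# Route Django's default file storage to an S3 bucket, so media writes
# go to S3 instead of the local disk - no s3fs mount required.
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-mediacms-bucket"   # placeholder
AWS_S3_REGION_NAME = "us-east-1"                 # placeholder

# Serve media through a CDN by pointing MEDIA_URL at a CloudFront
# distribution that has the bucket as its origin.
MEDIA_URL = "https://dxxxxxxxxxxxx.cloudfront.net/"
```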
-
MediaCMS serves files in HLS format, and actually at multiple resolutions, so playback can adapt to the internet connection being used. I personally turn off everything but 720p in the MediaCMS admin.
Basically, the files need to be in this format to be properly streamed, and it also reduces the number of users downloading your videos, should you disable that feature.
The problem arises with how those files are temporarily stored. Yes, you can mount an S3 bucket on the media folder of the filesystem, but the number of reads/writes will be expensive, as it reads for each re-encode.
The proper setup would be to cache to local disk outside of Django, and then make the final destination the Django media storage, which can be set to an S3 bucket in Django's configuration - no need to use s3fs.
I may revisit this if I have time, try to re-sort where the files are stored and when, and then commit it. Hopefully I still have my pull request somewhere and can just clean it up.
MediaCMS needs better debug logging too, so that it's easier to trace what's going on in the background. I had print statements everywhere trying to sort it out myself, which is why the pull request was rejected - but I was also in a hurry.
On Mon, Jul 31, 2023, 3:11 PM Ricardo Teixeira ***@***.***> wrote:
I don't know Django or the MediaCMS source code well, but I cannot understand the bandwidth problem. Why does it have to re-encode the files? Can you explain?
-
Yeah, that makes sense then with Amazon S3. I was using Storj, which also had an S3 bucket option but bills for outbound bandwidth (inbound is free), which wasn't feasible. I started noticing the bandwidth cost skyrocketing rather quickly.
I'm not sure how the speed is with Amazon S3. s3fs actually supports a local cache that should alleviate the bandwidth problems (if anything, to speed things up), but it can fill up storage rather quickly.
A script that manages the s3fs cache from cron may do what you need at scale. If it's just a demo I don't see you uploading too much, but if you have thousands of videos it would matter.
On Mon, Jul 31, 2023, 4:29 PM Ricardo Teixeira ***@***.***> wrote:
Thanks for your explanation. Indeed, a native implementation without workarounds like the s3fs I'm using would be great.
About the S3 cost to re-encode: I think it will not occur the way you pointed out, because the architecture uses both S3 and EC2 in the same region, and AWS says:
"You pay for all bandwidth into and out of Amazon S3, *except* for the following:
- Data transferred from an Amazon S3 bucket to any AWS service(s) within the same AWS Region as the S3 bucket (including to a different account in the same AWS Region)."
So when my EC2 instance in a private subnet gets a file from S3, it should not be charged, because the file does not pass through the public address. I must pay only for the GET, PUT, HEAD, etc. requests.
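The cron-driven cache-management script suggested above might look like this minimal sketch (the cache path and age threshold are assumptions; s3fs's local cache is the one enabled with its `-o use_cache=DIR` mount option):

```python
import time
from pathlib import Path

def prune_cache(cache_dir, max_age_days=7):
    """Delete cached files not accessed in `max_age_days`, so an s3fs
    local cache (mounted with -o use_cache=DIR) doesn't fill the disk.
    Returns the paths that were removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for path in Path(cache_dir).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            path.unlink()
            removed.append(str(path))
    return removed

# Intended to be run from cron, e.g. nightly (paths are placeholders):
#   0 3 * * * /usr/bin/python3 /opt/scripts/prune_cache.py
```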
-
I relied on s3fs and ultimately abandoned it in favor of eventual consistency: a cron job that syncs the media files directory to S3. I found this approach much faster and more reliable than s3fs. Using the MEDIA_URL setting in Django, you can even point this to a CloudFront distribution, and you can avoid the need to serve files from the local filesystem at all once things are synced to S3.
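The cron-plus-sync approach described above could be sketched like this (the bucket name, local path, and schedule are placeholder assumptions; it shells out to the AWS CLI's `aws s3 sync`):

```python
import subprocess

def build_sync_cmd(media_dir, bucket, prefix="media_files"):
    """Build the AWS CLI command that mirrors the local media directory
    to S3; --delete removes remote objects that no longer exist locally."""
    return [
        "aws", "s3", "sync",
        media_dir,
        f"s3://{bucket}/{prefix}",
        "--delete",
    ]

def run_sync(media_dir, bucket):
    # Intended to be invoked from cron, e.g. every 5 minutes:
    #   */5 * * * * /usr/bin/python3 /opt/scripts/sync_media.py
    subprocess.run(build_sync_cmd(media_dir, bucket), check=True)
```

With MEDIA_URL pointed at a CloudFront distribution in front of the bucket, the local copy becomes a staging area and S3 is eventually consistent with it.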