Reputation: 943
I am struggling to find out how to set a limit on the amount of storage each user can upload to my app's storage.
I found a method online, Storage.storageLimitInBytes,
but I don't see this method even mentioned in the Firebase docs, let alone instructions on how to set it.
In general, how do startups monitor how many images a user uploads? Would they have a field in the user's document, such as amountOfImagesUploaded,
and every time a user uploads an image increment that count, so that I could see who abuses the storage that way?
Or would I have to keep a similar document that tracks a user's uploads per day, and when the count reaches 100 or so, take action on that user?
I would really appreciate your help with this issue.
Upvotes: 6
Views: 1755
Reputation: 3597
Limits in Cloud Storage for Firebase security rules apply to each file/object separately; they don't apply to an entire operation. You can limit what a user can upload through Firebase Storage's security rules. For example, this (from the linked docs) is a way to limit the size of uploaded files:
service firebase.storage {
  match /b/<your-firebase-storage-bucket>/o {
    // Only allow uploads of any image file that's less than 5MB
    match /images/{imageId} {
      allow write: if request.resource.size < 5 * 1024 * 1024
                   && request.resource.contentType.matches('image/.*');
    }
  }
}
But there is currently no way in these rules to limit the number of files a user can upload. Some options to consider:
If you hardcode the names of the files that the user uploads (which also implies you limit the number of files they can upload), and create a folder for each specific user's files, then the per-file size limit multiplied by the fixed number of allowed file names gives you an upper bound on a user's total usage, and you can limit the sum in that way.
For example: if you fix the file names and only allow names numbered 1..5, the user can only ever have five files in storage:
match /public/{userId}/{imageId} {
  allow write: if imageId.matches("[1-5]\\.txt");
}
Alternatively, you can ZIP all files together on the client, and then upload the resulting archive. In that case, the security rules can enforce the maximum size of that file.
And of course you can include client-side JavaScript code to check the maximum size of the combined files in both of these cases. A malicious user can bypass this JavaScript easily, but most users aren't malicious and will thank you for saving their bandwidth by preventing the upload that will be rejected anyway.
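A minimal sketch of such a client-side pre-check, assuming a 5 MB per-file cap (matching the rule above) and an illustrative 20 MB combined cap; the function name and limits are assumptions, and a malicious user can still bypass this entirely:

```javascript
// Advisory client-side check before uploading: flags files that exceed the
// per-file cap and rejects batches whose combined size exceeds the total cap.
// The security rules remain the real enforcement.
const MAX_FILE_BYTES = 5 * 1024 * 1024;   // per-file cap (matches the rule above)
const MAX_TOTAL_BYTES = 20 * 1024 * 1024; // combined cap (illustrative)

function checkUploadBatch(files) {
  // `files` is any array of { name, size } objects (e.g. from <input type="file">)
  const total = files.reduce((sum, f) => sum + f.size, 0);
  const oversized = files.filter((f) => f.size >= MAX_FILE_BYTES).map((f) => f.name);
  return {
    ok: oversized.length === 0 && total <= MAX_TOTAL_BYTES,
    totalBytes: total,
    oversizedFiles: oversized,
  };
}
```

You would call this with the selected FileList before starting the upload, and only proceed when `ok` is true.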
You can also use an HTTPS Cloud Function as your upload target, and only pass the files on to Cloud Storage if they meet your requirements. Alternatively, you can use a Cloud Function that triggers on the upload from the user and validates that user's files after the change. In either case the files pass through code you control, so you can keep track of the total size each user has uploaded.
An easier alternative is to use a Cloud Storage trigger, which runs a Cloud Function every time a new file is uploaded. You can read the object's size from its metadata and keep a running total in the database. In this case, you can store the total storage used by a user (in bytes) in a custom claim.
exports.updateTotalUsage = functions.storage.object().onFinalize(async (object) => {
  // 1. Look up the total storage the user currently uses (e.g. from their custom claims)
  // 2. Add the size of the new object to it (object.size, in bytes)
  // 3. Write the new total back to the "size" custom claim
});
Then you can write a security rule that checks that the size of the new object plus the total storage already being used does not exceed 150 MB:

allow write: if request.resource.size + request.auth.token.size < 150 * 1024 * 1024;
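The check that rule performs can be sketched as plain logic; the 150 MB quota matches the example rule, and the function name is an assumption:

```javascript
// Mirrors the storage rule: an upload is allowed only if the new object's
// size plus the user's running total (e.g. from the "size" custom claim)
// stays under the quota. Assumed quota: 150 MB, as in the rule above.
const QUOTA_BYTES = 150 * 1024 * 1024;

function uploadAllowed(newObjectBytes, usedBytes, quotaBytes = QUOTA_BYTES) {
  return newObjectBytes + usedBytes < quotaBytes;
}
```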
You can also have a look at this article if you need per-user storage validation. The solution is a little tricky, but doable: https://medium.com/@felipepastoree/per-user-storage-limit-validation-with-firebase-19ab3341492d
Upvotes: 5
Reputation: 75745
Google Cloud (or the Firebase environment) doesn't know your users. It knows your application, and your application does.
If you want statistics per user, you have to log that data somewhere and perform sums/aggregations to compute your metrics.
A usual way is to use Firestore to store that information and to increment the number of files or the total space used as uploads happen.
A less usual solution is to log each action to Cloud Logging and create a sink from Cloud Logging to BigQuery, then compute your metrics with aggregations directly in BigQuery (the latency is higher; it all depends on whether you need a synchronous or asynchronous check of those metrics).
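The per-user aggregation described above can be sketched as plain logic, independent of whether the upload records live in Firestore or BigQuery; the record shape and function name are assumptions:

```javascript
// Aggregates upload log records into per-user usage metrics.
// Each record is assumed to look like { userId, sizeBytes }.
function usagePerUser(records) {
  const usage = new Map();
  for (const { userId, sizeBytes } of records) {
    const u = usage.get(userId) || { files: 0, bytes: 0 };
    u.files += 1;
    u.bytes += sizeBytes;
    usage.set(userId, u);
  }
  return usage;
}
```

In Firestore you would typically maintain such counters incrementally with FieldValue.increment() on each upload rather than re-aggregating the whole log; in BigQuery you would express the same thing as a GROUP BY query.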
Upvotes: 0