Reputation: 11
I am doing GPG/PGP encryption/decryption on Spark. I have some large files, up to 30 GB in size. I am using a Spark cluster with the following configuration -
My question is: how does Spark handle encryption/decryption when the file is large and cannot fit in a single node's memory? Spark is optimized for parallel processing, especially for tasks that can be divided into chunks, whereas encryption/decryption tends to be a sequential operation that cannot be parallelized. So how does Spark manage to decrypt a file larger than its memory?
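For context on the premise of the question: "sequential" does not have to mean "whole file in memory". Symmetric ciphers like the ones OpenPGP uses process data as a stream, so a decryptor can hold only one chunk at a time regardless of file size. The following is a toy sketch of that principle only; it uses a made-up XOR keystream, not real GPG, and the function names (`xor_keystream`, `crypt_stream`) are illustrative assumptions, not any Spark or GnuPG API.

```python
# Toy illustration (NOT real GPG/OpenPGP): a sequential stream cipher can
# still be applied in constant memory by reading the input chunk by chunk.
import io

def xor_keystream(key: bytes):
    # Infinite repeating keystream; stands in for a real cipher's keystream.
    while True:
        yield from key

def crypt_stream(src, dst, key: bytes, chunk_size: int = 8192):
    # Encrypt/decrypt sequentially, holding only one chunk in memory at a time.
    ks = xor_keystream(key)
    while chunk := src.read(chunk_size):
        dst.write(bytes(b ^ next(ks) for b in chunk))

plaintext = b"a 30 GB file would flow through here chunk by chunk" * 100
key = b"secret"

enc, dec = io.BytesIO(), io.BytesIO()
crypt_stream(io.BytesIO(plaintext), enc, key)          # "encrypt"
crypt_stream(io.BytesIO(enc.getvalue()), dec, key)     # "decrypt" (XOR is symmetric)
assert dec.getvalue() == plaintext
```

So the sequential nature of decryption limits parallelism across one file, but not the ability to process a file larger than memory: memory use stays bounded by the chunk size.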
Upvotes: 1
Views: 52