Is there a limit on the layers of encryption a file can have?
Consider symmetric GPG encryption of a given file my_file.txt. Something like (on the command line):
gpg --symmetric --cipher-algo AES256 my_file.txt
After supplying a new password at the prompt, the above produces my_file.txt.gpg. I could then encrypt again:
gpg --symmetric --cipher-algo AES256 my_file.txt.gpg
(where you would want to set a different password)
And so on. Is there a limit on how many iterations of the above I can do? It seems to me there isn't, as symmetric encryption just takes a piece of text and transforms it into another, without ever asking what the piece of text is in the first place. Is this true?
encryption gnupg
Assuming a mode that requires an IV, you will eventually run out of space, since every pass produces an output of increased size. The same consideration applies if padding is used.
– kelalaka
13 hours ago
No, there isn't; not until you run out of hard drive space from the encryption size overhead, anyway.
– user1067003
11 hours ago
However, be warned that, all other things being equal, encrypting a file twice with the same algorithm (and different passwords) will not significantly improve its security: if someone can break the first layer of encryption, they will most probably be able to break the second layer with the same amount of effort.
– A. Hersean
8 hours ago
@A.Hersean presumably it still protects against an attacker who has managed to capture one of the passwords?
– Chris H
6 hours ago
@VLAZ that depends on whether the passwords are ever kept together, or even in the hands of the same individual.
– Chris H
5 hours ago
3 Answers
Theoretically, there's no limit on the number of times you can encrypt a file. The output of an encryption process is again a file, which you can pass on to a different algorithm and get yet another output.
The catch is that, on the decryption side, it will have to be decrypted in LIFO (last in, first out) order, with the proper passwords.
For example, if your file was first encrypted with algo1 using password abcde, and then encrypted with algo2 using password vwxyz, it will have to be decrypted first with algo2 (with password vwxyz), and then with algo1 (with password abcde).
This method makes sense if you're sending the keys through different media or channels; it would be of little use if all the passwords are sent through the same channel.
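For instance, with GnuPG the round trip could look roughly like this (a sketch; the output file names inner.gpg and recovered.txt are chosen purely for illustration, and each invocation prompts for its passphrase):

gpg --symmetric --cipher-algo AES256 my_file.txt            # password 1 -> my_file.txt.gpg
gpg --symmetric --cipher-algo AES256 my_file.txt.gpg        # password 2 -> my_file.txt.gpg.gpg

# Decrypt in reverse (LIFO) order:
gpg --output inner.gpg     --decrypt my_file.txt.gpg.gpg    # asks for password 2 (outer layer)
gpg --output recovered.txt --decrypt inner.gpg              # asks for password 1 (inner layer)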
Real-world example: You are downloading a password-protected ZIP archive containing DRM-protected media files from an HTTPS URL through an IPsec-encrypted network using Tor Browser.
– Philipp
6 hours ago
LIFO may not be necessary. I'm sure you can design algorithms 1 and 2 where the order of encryptions does not change the result (so encryptA(encryptB(m)) == encryptB(encryptA(m))). I don't think this would necessarily decrease the encryption strength. AFAIK current algorithms do not support this.
– Giacomo Alzetta
5 hours ago
@GiacomoAlzetta most stream ciphers actually work this way (if you extract stuff like initialization vectors, padding and MACs out), because they are just XORing the keystream with the plain text.
– Paŭlo Ebermann
33 mins ago
It's correct that there's no limit on the number of times you can encrypt a file, but it's not necessarily the case that you must decrypt in LIFO order.
You can always be sure that LIFO decryption will work, but certain multiply encrypted files can be decrypted out of order without affecting the result (depending on which algorithms were used for encryption):
Consider encrypting the same file twice using a one-time pad (XOR) with different keys. You can decrypt in either order, because (A xor B) xor C == (A xor C) xor B for every bit.
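This is easy to check on a single byte (a minimal bash-arithmetic sketch; the byte values are arbitrary and this is not a real one-time-pad implementation):

A=0xA5; B=0x3C; C=0x77                    # plaintext byte and two key bytes
CT=$(( A ^ B ^ C ))                       # apply both XOR layers
printf '0x%x\n' $(( (CT ^ B) ^ C ))       # peel B first, then C -> 0xa5
printf '0x%x\n' $(( (CT ^ C) ^ B ))       # peel C first, then B -> 0xa5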
(This would be a comment if I had 50 rep, feel free to edit the other answer and delete this one.)
EDIT: See this question for more details on this edge case.
The thing is, XOR (or addition mod n) with a one-time pad is pretty much the only algorithm that can be decrypted in an arbitrary order like this.
– Martin Bonner
8 hours ago
@MartinBonner What about the algorithms used for Diffie-Hellman key exchange?
– Solomon Ucko
8 hours ago
@SolomonUcko Key Exchange is not encryption. What did you have in mind?
– Martin Bonner
5 hours ago
@MartinBonner Good point, never mind. I wasn't paying enough attention.
– Solomon Ucko
5 hours ago
Arbitrary encryption order works for "plain encryption" with any XOR-based stream cipher. However, most practical cryptosystems add additional data (IV, integrity checks, etc.) to the ciphertext, which would break out-of-order decryption.
– Peter Green
2 hours ago
GnuPG uses the CFB mode of operation for symmetric encryption (defined in RFC 4880). For AES, CFB mode requires a 128-bit IV and does not need padding.
While theoretically there is no limit, as pointed out in the other answers, there is a practical limit due to the growth in file size. For example, I encrypted a file of 163 bytes and the result was 213 bytes; re-encrypting the previous output gave 295 bytes, then 382 bytes, 473 bytes, ...
These sizes also include GnuPG's packet overhead. So, sooner or later you will run out of space.
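The growth can be reproduced with a short loop (a sketch; the layer/pass names are illustrative, on GnuPG 2.1+ you may need --pinentry-mode loopback for --passphrase to be honoured in batch mode, and the exact sizes will vary with your options and version):

cp my_file.txt layer0
for i in 1 2 3 4 5; do
  gpg --batch --pinentry-mode loopback --passphrase "pass$i" \
      --symmetric --cipher-algo AES256 \
      --output "layer$i" "layer$((i-1))"
done
ls -l layer*    # each layer is larger than the one before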