Discussion in 'Article Discussion' started by CardJoe, 13 Jan 2010.
If it's man-made, it's breakable: if an algorithm was created by people, even one run on a computer, it can be reversed. Also, 1,500 years? Quite crazy honestly, with so many people poking around trying to break codes.
WHAT 9000 ?!?!?!
Compared to the recently discovered exploitable flaw in SSL, this isn't actually the biggest threat to SSL security. A hacker is rarely going to brute-force an attack if there is an easier alternative.
=>Everything<= is crackable with brute force.
Given time and loads of computational power.
A man-in-the-middle attack seems very far-fetched given the routing involved on the internet (you would imagine it would require physical access to a 'secure' datacentre/edge router).
This is true, but your statement is misleading. If it takes 1,000 years to brute-force a code, then it's a little unfeasible for a hacker to want to do so, wouldn't you say?
Interesting article, though I wonder if it's more a case of holes and exploits in software being the likelier threat nowadays, as mentioned by a poster above. What's the point in breaking down the front door when the window is unlocked?
I'm pretty sure the majority of 'hackers' don't have access to a supercomputer cluster, and any that do aren't likely to waste their time intercepting my SSL traffic (It makes much more sense to target a larger repository of data than a single web transaction). More likely they'll be hacking Google.
That depends. 1,000 years with an 'average' computer isn't off-putting if you have a botnet.
Dirty maths: 1000 x 365 x 24 = 8.76 million = the number of 'average computers' required to crack a "1000-year" encryption in an hour - not out of the realm of feasibility I would posit.
EDIT: Someone correct me if I'm over-simplifying please.
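The arithmetic above can be sketched out directly (the 1,000-year single-machine figure is this thread's hypothetical, not a measured number, and the 300,000-machine botnet is an illustrative size):

```python
# Back-of-envelope: how many machines compress a "1000-year"
# single-machine brute force into one hour?
YEARS = 1000
HOURS_PER_YEAR = 365 * 24              # ignoring leap years, as above

total_machine_hours = YEARS * HOURS_PER_YEAR
print(total_machine_hours)             # 8,760,000 machine-hours,
                                       # i.e. 8.76M machines for one hour

# The same work spread across a hypothetical botnet:
botnet_size = 300_000
hours_needed = total_machine_hours / botnet_size
print(round(hours_needed, 1))          # 29.2 hours - a bit over a day
```

So the simplification holds up: the total work is fixed at 8.76 million machine-hours, and you just divide by however many machines you control.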
Botnets are ordinarily in the range of hundreds of thousands of machines tops, not millions. You're looking at days or weeks even with a botnet, I'd guess.
Between them, the sieving and matrix runs took 15 days on the array (10 days sieving and 5 on the matrix), at least according to the PDF. 1024-bit is still out of reach for the next 5 years; I think they want to double-check in future and go from there.
That's pretty cool research =] These guys are really into it.
How many souped-up PCs is a "cluster"?
Tens, hundreds or thousands?
Especially now, when you have machines like the FASTRA. Get a few of those and you can do a lot of kinky things with pi. Just not while you're playing Crysis......
This is a non sequitur. Public key algorithms are valuable because they allow you to verify the security of a communication without exchanging keys in advance. This is important because transmitting keys over an insecure medium (such as the net, phone system, postal service or any situation where you cannot guarantee privacy) is dangerous, and because maintaining separate keys for every party you wish to communicate with is impractical.
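A toy sketch of the idea, with deliberately tiny textbook primes (real RSA uses enormous random primes plus padding; numbers this small are trivially factorable and for illustration only):

```python
# Toy RSA: no shared secret ever has to travel in advance.
# The pair (n, e) is public; only d stays private.
p, q = 61, 53                       # private primes
n = p * q                           # 3233, the public modulus
phi = (p - 1) * (q - 1)             # 3120
e = 17                              # public exponent, coprime to phi
d = pow(e, -1, phi)                 # private exponent (modular inverse,
                                    # three-argument pow, Python 3.8+)

message = 65
ciphertext = pow(message, e, n)     # anyone can encrypt with (n, e)
recovered = pow(ciphertext, d, n)   # only the holder of d can decrypt
assert recovered == message
```

The factoring result in the article matters precisely because recovering p and q from a published n lets an attacker recompute d, with no help from the user at all.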
Two-factor authentication simply means replacing a password with, for example, a password and an iris scan. "Something you have, something you know". The main point of this is to make it harder for idiots to compromise their own authentication credentials (by writing them down) or to prevent people from intentionally sharing their credentials.
This bears no relevance to the security of RSA, because the attack does not rely on the user giving an attacker access to their key; rather, it allows the private key to be recovered by factoring the public modulus.
This also has nothing to do with the security of RSA and is a pretty dubious claim anyway.
Andy Cordial is either being misquoted in the original article or is completely clueless about cryptography. Origin Storage sell the sort of pin-protected hard drives he is recommending, so take what you will from that.
In regards to the actual news story, while academically interesting it isn't a big deal from a security standpoint. 1024 bits was a short key in 1995. Anyone actually using a 768 bit key doesn't care about security.
RSA has always been about keeping the key-length in the area between what's feasible to compute for encryption/decryption and what's feasible to brute-force. As computing technology gets faster and faster (see Moore's law), small keys will continue to get broken - it's the natural evolution of technology.
Note, however, that nobody has found a silver bullet for factoring: there is (so far) no known efficient algorithm, and even the number field sieve used here is sub-exponential but still far from polynomial time. Double your key size and you'll be fine for the next several years. When they get close to cracking that, double it again, and you'll be fine again.
Tl;dr: Meh. RSA is still good, just increase your key length.
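One way to see why adding bits buys so much headroom: the standard heuristic running-time estimate for the general number field sieve (the algorithm used in this result) grows very fast in the bit-length. This is an order-of-magnitude comparison using the textbook L-notation constants, not a benchmark:

```python
import math

def gnfs_cost(bits):
    """Heuristic GNFS complexity L_n[1/3, (64/9)^(1/3)] for a `bits`-bit modulus."""
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Relative effort versus the 768-bit modulus just factored
print(f"1024-bit: ~{gnfs_cost(1024) / gnfs_cost(768):,.0f}x harder")
print(f"2048-bit: ~{gnfs_cost(2048) / gnfs_cost(768):.2e}x harder")
```

By this estimate, 1024-bit is roughly a thousand times harder than 768-bit, and 2048-bit is many orders of magnitude beyond that - which is why "just double the key length" stays ahead of hardware gains for a long time.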
This is pretty neat. They actually attacked the maths, not the programming.
While it is pretty amazing that they found a fast way to factor RSA, I agree with both flax and techno-dan: two-factor authentication has nothing to do with data encryption, and when your RSA key gets too short, just make it longer.
I am so disturbed
Whoa, now that is big news! +1 rep
I was thinking along the same lines. I wonder what they mean by "average" computers these days and how they would compare to the Bt folding rig, for example.
If all the folders in the world went on a brute-force attack, they could probably do it in a couple of days or so.
+1 to SSL having bigger problems than 768 bit encryption.
So in 2030, SSL should be at, what, 16,000-bit?
Meh - 365,000 computers for a day beats 8.76 million computers for an hour. (Same total work: 8.76 million machine-hours is 365,000 machine-days.)