
1024 is for RSA-1024, which is believed to be broken by classical means at this point. Everyone doing anything with RSA is on 4k or larger.


2048. There is no plausible conventional attack on 2048; whatever breaks 2048 is probably going to break 4096, as I understand it.

https://crypto.stackexchange.com/questions/1978/how-big-an-r...
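For intuition on why a break of 2048 would likely imply a break of 4096: the best known classical attack (GNFS) scales sub-exponentially, so each doubling of the modulus buys far fewer than double the security bits. A rough back-of-the-envelope sketch using the heuristic GNFS L-notation (the constant is asymptotic, so treat the outputs as ballpark figures, not a security proof):

```python
import math

def gnfs_log2_cost(bits):
    # Heuristic GNFS complexity L_n[1/3, (64/9)^(1/3)]:
    #   exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3))
    # returned as a base-2 logarithm ("security bits") for readability.
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

for bits in (1024, 2048, 3072, 4096):
    print(bits, round(gnfs_log2_cost(bits)))
```

This lands near the commonly quoted equivalences (roughly 87, 117, 139, and ~156 bits respectively): going from 2048 to 4096 adds only about 40 bits of classical security, which is why an attack cheap enough to kill 2048 would put 4096 under serious pressure too.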


> Everyone doing anything with RSA is on 4k or larger.

The Let's Encrypt intermediate certificates R10 and R11 seem to be only 2048-bit.


A signature (not encryption) with short-term valid life is fine at 2048 still.


They are? The short-term recommendation is 3072, and I still see lots of 2048. Actually, it's mostly 2048.


Reminder: if your DKIM keys haven't been rotated in a while, they might still be 1024-bit. E.g., older Google Workspace keys, though newly generated keys are 2048-bit now.


I took this conversation to be about ECC, not RSA.


My completely unfounded tin-foil-hat theory at the moment is that ECC was pushed as a standard not because it is faster/smaller, but because the smaller key size makes it less quantum-resistant and more prone to be broken first (if not already) via quantum supremacy.


ECC is easier to implement right. Even the older curves that are full of footguns have orders of magnitude fewer footguns than RSA.

So don't fear them due to unfounded theories.


Is there consensus on what signature validation is correct? https://hdevalence.ca/blog/2020-10-04-its-25519am


IMO, that article has a clear conclusion that you should aim your software at the libsodium-1.0.16 pattern (no edge cases).

The problem it's presenting is more about software in the wild having different behavior... And if "some people connect to the internet and use software that behaves differently from mine" is a showstopper for you, I have some really bad news.


An interesting read. So the general idea, as I understand it, is that one can craft signatures that pass or fail validation depending on the implementation, while all implementations still protect against forgery.

In my opinion the correct approach here is the most liberal one for the Q, R points: check each point's cofactor component at parsing (whether 8P = 0), then use the unbatched equation for verification. This way implementations can be made group-agnostic.

Having group-agnostic implementations is important, as it creates a proper separation of concerns between curve implementations and the code that uses them. For instance, if we were to accept strict validation as the ground truth and best practice, one would have an enormously hard time specifying verifiers for zero-knowledge proofs, and it would also double the time and code for implementers without any effect on soundness.
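To make the distinction concrete, here is a minimal pure-Python sketch of Ed25519 (RFC 8032 parameters, affine arithmetic; illustrative only — slow, not constant-time, and missing the strict encoding checks a real library would do) showing both the cofactored ("multiply everything by 8") equation and the unbatched cofactorless one. On honestly generated signatures the two agree; they can disagree only on pathologically crafted inputs:

```python
import hashlib

# Ed25519 parameters from RFC 8032 (sketch only, not production code)
p = 2**255 - 19
L = 2**252 + 27742317777372353535851937790883648493
d = (-121665 * pow(121666, p - 2, p)) % p
B = (15112221349535400772501151409588531511454012693041857206046113283949847762202,
     46316835694926478169428394003475163141307993866256225615783033603165251855960)

def edwards_add(P, Q):
    # Unified addition on -x^2 + y^2 = 1 + d*x^2*y^2 (also handles doubling)
    (x1, y1), (x2, y2) = P, Q
    t = d * x1 * x2 * y1 * y2 % p
    x3 = (x1 * y2 + x2 * y1) * pow(1 + t, p - 2, p) % p
    y3 = (y1 * y2 + x1 * x2) * pow(1 - t, p - 2, p) % p
    return (x3, y3)

def scalar_mult(k, P):
    Q = (0, 1)  # neutral element
    while k:
        if k & 1:
            Q = edwards_add(Q, P)
        P = edwards_add(P, P)
        k >>= 1
    return Q

def encode_point(P):
    x, y = P
    return (y | ((x & 1) << 255)).to_bytes(32, "little")

def decode_point(s):
    n = int.from_bytes(s, "little")
    y, sign = n & ((1 << 255) - 1), n >> 255
    x2 = (y * y - 1) * pow(d * y * y + 1, p - 2, p) % p
    x = pow(x2, (p + 3) // 8, p)
    if (x * x - x2) % p:
        x = x * pow(2, (p - 1) // 4, p) % p  # multiply by sqrt(-1)
    if (x * x - x2) % p:
        raise ValueError("not on curve")
    if x & 1 != sign:
        x = p - x
    return (x, y)

def Hint(m):
    return int.from_bytes(hashlib.sha512(m).digest(), "little")

def keypair(seed):
    h = hashlib.sha512(seed).digest()
    a = int.from_bytes(h[:32], "little")
    a &= (1 << 254) - 8      # clamp: clear low 3 bits and top bit
    a |= 1 << 254            # set bit 254
    return a, h[32:], encode_point(scalar_mult(a, B))

def sign(seed, msg):
    a, prefix, A = keypair(seed)
    r = Hint(prefix + msg) % L
    R = encode_point(scalar_mult(r, B))
    k = Hint(R + A + msg) % L
    return R + ((r + k * a) % L).to_bytes(32, "little")

def verify_cofactored(A_bytes, msg, sig):
    # Liberal / batch-compatible equation: [8][s]B == [8]R + [8][k]A
    R, A = decode_point(sig[:32]), decode_point(A_bytes)
    s = int.from_bytes(sig[32:], "little")
    if s >= L:
        return False
    k = Hint(sig[:32] + A_bytes + msg) % L
    lhs = scalar_mult(8 * s, B)
    rhs = scalar_mult(8, edwards_add(R, scalar_mult(k, A)))
    return lhs == rhs

def verify_cofactorless(A_bytes, msg, sig):
    # Strict / unbatched equation: [s]B == R + [k]A
    R, A = decode_point(sig[:32]), decode_point(A_bytes)
    s = int.from_bytes(sig[32:], "little")
    if s >= L:
        return False
    k = Hint(sig[:32] + A_bytes + msg) % L
    return scalar_mult(s, B) == edwards_add(R, scalar_mult(k, A))
```

The cofactored check is the one that stays consistent with batch verification, which is why it is the friendlier target for group-agnostic specs.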


You're right, that's pretty unfounded.


Elliptic-curve discrete log uses a different variant of Shor's algorithm than factoring does, and appears to need more qubits to break (per key bit).
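The qubit-count comparison can be made concrete with commonly cited circuit estimates: Beauregard's ~2n+3 logical qubits for factoring an n-bit RSA modulus, and Roetteler et al.'s (2017) ~9n + 2⌈log₂ n⌉ + 10 for n-bit EC discrete log. These are rough literature figures for one circuit design each, not hard requirements:

```python
import math

def rsa_qubits(n):
    # Beauregard-style factoring circuit: ~2n + 3 logical qubits
    return 2 * n + 3

def ecc_qubits(n):
    # Roetteler et al. (2017) EC discrete log estimate:
    # ~9n + 2*ceil(log2 n) + 10 logical qubits
    return 9 * n + 2 * math.ceil(math.log2(n)) + 10

# Comparable classical security levels: RSA-3072 vs a 256-bit curve
rsa, ecc = rsa_qubits(3072), ecc_qubits(256)
print(rsa, ecc)                  # total logical qubits
print(rsa / 3072, ecc / 256)     # qubits per key bit
```

So per key bit ECC is the more expensive target (~9 qubits/bit vs ~2), but because its keys are so much smaller, the total machine needed is smaller — which is why ECC is usually expected to fall to a quantum computer before RSA at equivalent classical strength.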


Is it broken, though? It seems no one has solved the RSA-1024 factoring challenge yet.


The state actors aren't going to reveal their capabilities by solving an open challenge.



