I wonder whether everyone who works on this kind of research is pro-DRM/anti-freedom. I've seen plenty of DRM-breaking papers that paint their own findings in a very negative light (this one included), but I can't recall a single one taking the opposite view: that this is another step forward for freedom and right-to-repair. Do the researchers genuinely believe it's a bad thing, or are they afraid to take that position because others might disapprove and reject their paper?
It helps:
1. Security researchers, so they can see what malware may be lurking in FPGA bitstreams.
2. Open source developers working on FPGA bitstream compilers.
3. People who want to steal proprietary IP cores.
It hurts:
1. People who chose the part specifically because of the closed bitstream, in part because they made security decisions on the assumption that the bitstream couldn't be read.
2. Anyone who bought products based on the vendor's marketed security claims (hospitals, the DoD, etc.).
Whoa, from what I understand, the bitstream is the FPGA equivalent of compiler output? Then selling something as "secure" because nobody can know what code it is running would be security through obscurity, no? How do vendors get away with that sort of BS?
To be clear: just talking about the confidentiality requirement here. Authenticity (ie code signing, right?) is obviously something very useful especially in these cases.
This is about bitstream encryption, so there is an expectation of confidentiality. The keys needed to decrypt the bitstream are stored in nonvolatile memory on the FPGA itself. Assuming that it is implemented correctly (evidently not in this case), it is impossible to decrypt the bitstream without analyzing the FPGA die itself, using tools that are usually beyond what a casual attacker might have. It probably won't stop a nation-state from figuring out how to read out your FPGA design, but it will probably slow down your competitors.
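To make the confidentiality/authenticity split concrete, here's a toy sketch of the device-side loading model described above: a secret key lives in on-chip nonvolatile memory, and the loader checks the image before configuring the fabric. This is an illustration only, not any vendor's actual scheme; real parts use AES-GCM or similar for both encryption and authentication, while this sketch shows just the authenticity half with an HMAC.

```python
import hmac
import hashlib
import secrets

# Assumed/hypothetical model: the "device key" stands in for the key
# stored in the FPGA's nonvolatile memory; sign_bitstream is the vendor
# tool, load_bitstream is the on-chip configuration loader.
DEVICE_KEY = secrets.token_bytes(32)  # provisioned into on-chip NVM

TAG_LEN = 32  # HMAC-SHA256 output size

def sign_bitstream(bitstream: bytes, key: bytes) -> bytes:
    """Vendor side: append an HMAC-SHA256 tag to the bitstream image."""
    tag = hmac.new(key, bitstream, hashlib.sha256).digest()
    return bitstream + tag

def load_bitstream(image: bytes, key: bytes) -> bytes:
    """Device side: verify the tag before configuring the fabric."""
    bitstream, tag = image[:-TAG_LEN], image[-TAG_LEN:]
    expected = hmac.new(key, bitstream, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("bitstream authentication failed")
    return bitstream

image = sign_bitstream(b"fabric-config-data", DEVICE_KEY)
assert load_bitstream(image, DEVICE_KEY) == b"fabric-config-data"
```

The point of the sketch: authenticity (rejecting tampered images) works even if the bitstream itself is published, which is why code signing is useful to the customer. Confidentiality is a separate property layered on top, and that's the part whose value to the customer is being debated here.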
Yes, for IP protection I get why that's interesting. But crucially, that's the vendor's interest. For a hospital or similar, the interest is actually the opposite: they should be looking for secure software that is as open as possible, to allow auditing and servicing if needed. So selling DRM as something that somehow makes the customer more secure is BS.
Circumvention of DRM is illegal or at least very much frowned-upon in many jurisdictions, while research on how to produce better DRM is not.
Framing it as "we broke DRM, isn't that great?" would be like framing a paper about a more effective silencer as "we made it easier to get away with murder, isn't that great?" (when instead it could be about "we made our special forces' jobs safer").
Plus, research papers in technical fields should try to be neutral, not a place for political activism.