OpenTitan Shipping in Production

126 points by rayhaanj a day ago on hackernews | 26 comments

yjftsjthsd-h | a day ago

Clicking through links eventually led to https://lowrisc.org/ibex/ -

> Ibex® is a small and highly configurable open-source RISC-V embedded processor available under an Apache 2.0 licence. It is formally verified and very well validated, and it has excellent toolchain integration, which has led many companies to use it in their commercial SoCs.

> [...]

> Ibex is the main CPU in the OpenTitan® root of trust, which has brought the quality of the design and documentation to new heights.

So that's neat.

gchadwick | a day ago

I worked on OpenTitan for around 5 years at lowRISC. It certainly had its ups and downs, but it's generated some great stuff and I'm very glad to see it hit proper volume production like this. Whilst there are definitely open-source chips out there, and lots more chips using bits of open source that don't actually advertise the fact, I believe this is the first chip with completely open RTL that's in a major production-volume use case.

One of the highlights of working on OpenTitan was the amount of interest we got from the academic community. Work they did could actually get factored into the first-generation silicon, making it stronger. Ordinarily chips like that are kept deeply under wraps, and by the time the wider security community can take a look at them, development has long completed, so anything they might find could only affect generation 2 or 3 of the device.

Academic collaboration also helped us get ahead in post-quantum crypto. This first-generation chip has limited capabilities there, but thanks to multiple academics using the design as a base for their own PQC work, there was lots to draw on for future designs.

I'm no longer at lowRISC so I don't know where OpenTitan is going next but I look forward to finding out.

IshKebab | a day ago

This is really great. OpenTitan has some useful IP components that can definitely be reused, and it's really cool that this is open. Nice one Google. I have one minor nitpick though:

> both individual IP blocks and the top-level Earl Grey design have functional and code coverage above 90%—to the highest industry standards—with 40k+ tests running nightly

This is definitely not "to the highest industry standards". I've worked on projects where we got to 100% on both for most of the design. It's definitely a decent commercial standard though - way above most open source verification quality.

LoganDark | a day ago

I think they're saying the coverage they have is to the highest industry standards, not that 90% is a high standard.

gchadwick | 21 hours ago

You can see the latest nightly results here: https://opentitan.org/dashboard/index.html. Note there are some 100% figures.

Having spent several years working on OT, I can tell you that most of the gaps are things that should be waived anyway. Getting waiver files reliably integrated into that flow has been problematic: those files are fragile. Alter the RTL and they typically break, as they refer to things by line number or expect a particular expression to be identical to when you wrote the waiver.

This has all been examined and the holes have been deemed unconcerning. Yes, ideally there'd be full waivers documenting this, but as with any real-life engineering project you can't do everything perfectly! There is internal documentation explaining the rationale for why the holes aren't a problem, but it's not public.

IshKebab | 20 hours ago

> Getting waiver files reliably integrated into that flow has been problematic: those files are fragile. Alter the RTL and they typically break, as they refer to things by line number or expect a particular expression to be identical to when you wrote the waiver.

Yeah, last time I did this we used regexes, but I really don't like that solution. I think the waiver should go in the RTL itself. I don't know why nobody does that; it's standard practice in software. SV even supports attributes for exactly this sort of thing. The tools don't support it, but you could make a tool to parse the files and convert them to TCL. I've done something like that using the Rust sv-parser crate before. Tedious but not impossible.
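Putting the waiver in the RTL and regenerating the tool input on every run sidesteps the line-number fragility, since the line numbers are recomputed from the current source each time. A toy Python sketch of the idea (the `coverage_off` attribute and the `coverage_exclude` TCL command are invented for illustration; a real flow would use a proper parser such as sv-parser rather than a regex):

```python
import re

# Hypothetical in-RTL waiver attribute, e.g.:
#   (* coverage_off = "debug-only path" *)
ATTR_RE = re.compile(r'\(\*\s*coverage_off\s*=\s*"([^"]*)"\s*\*\)')

def waivers_to_tcl(sv_source: str, module: str) -> list[str]:
    """Scan SystemVerilog text for waiver attributes and emit
    tool-style TCL exclusions (the command name is illustrative)."""
    tcl = []
    for lineno, line in enumerate(sv_source.splitlines(), start=1):
        m = ATTR_RE.search(line)
        if m:
            tcl.append(f'coverage_exclude -module {module} -line {lineno} '
                       f'-comment "{m.group(1)}"')
    return tcl

sv = '''module m;
  (* coverage_off = "debug-only path" *)
  logic dbg;
endmodule'''
print(waivers_to_tcl(sv, "m"))
```

Because the TCL is regenerated from the source on every run, editing the RTL moves the waiver along with the code instead of silently invalidating it.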

Also we found the formal waiver analysis tools to be very effective for waiving unreachable code, in case you aren't using those.

Congrats on the silicon anyway!

gchadwick | 18 hours ago

> Also we found the formal waiver analysis tools to be very effective for waiving unreachable code, in case you aren't using those.

Yes, we had used them, just never got them slickly integrated into the verification dashboard. We had used this kind of analysis for internal sign-off. You could generate the waivers manually and check them in, but that suffers from the problem discussed above. Plus, as OpenTitan was a cross-company project, you run into EDA licensing issues where not everyone has access to the same set of tools, and a UNR flow could be running fine on one partner's infrastructure but isn't workable everywhere for a multitude of reasons.

Ideally the nightly regression would run the UNR flow to generate the waivers and apply them when generating coverage, but as ever there's only so much engineering time to go around and always other priorities.

ggm | a day ago

I'm not seeking to criticise this product, I think this is a great development.

But for almost all people this is shifting from one kind of "trust me bro" to... another. We're not going to be able to formally prove the chip conforms to some (Verilog?) model, has no backdoors, side channels, you-name-it. We're in the same place we were, with the same questions. Why do we trust this and the downstream developments? Because we do.

I know people who worked on cryptech, and I definitely had trust in their work and their personal commitment to what they did, but that's "who you know" trust. The non-transitive quality of this kind of trust is huge.

To be more critical, my primary concern is how deployment of this hardware will be joined by significantly less benign design choices like locked bootloaders and removal of sideloading. To be very clear, that's a quite distinct design choice, but I would expect to see it come along for the ride.

To be less critical, will this also mean we get good persistent on-device credentials, and so can do things like X.509 certs for MAC addresses and have device assurance on the wire? Knowing you are talking to the chipset which signed the certificate request you attested to before shipping is useful.

michaelt | a day ago

> To be more critical, my primary concern is how deployment of this hardware will be joined by significantly less benign design choices like locked bootloaders and removal of sideloading. To be very clear, that's a quite distinct design choice, but I would expect to see it come along for the ride.

A justifiable concern, given sentences like "strongest possible security guarantees that the code being executed is authorized and verified" and "can be used across the Google ecosystem and also facilitates the broader adoption of Google-endorsed security features across the industry"

benlivengood | a day ago

Take a look at how Matter handles this: a manufacturer certificate vouches for hardware integrity, and it gets superseded by the fabric's root CA on commissioning (enrollment in the fabric).
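A toy model of that hand-off, using Matter's terms (DAC = Device Attestation Certificate from the factory, NOC = Node Operational Certificate issued by the fabric); the CA names and the trust check are invented for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cert:
    subject: str
    issuer: str  # the CA vouching for the subject

@dataclass
class Device:
    dac: Cert                   # factory-installed attestation cert
    noc: Optional[Cert] = None  # operational cert, set on commissioning

    def commission(self, fabric_ca: str) -> None:
        # The fabric checks the manufacturer's vouching, then issues its
        # own operational identity, which supersedes the DAC in daily use.
        if self.dac.issuer != "AcmeMfgCA":  # hypothetical manufacturer CA
            raise ValueError("untrusted manufacturer")
        self.noc = Cert(subject=self.dac.subject, issuer=fabric_ca)

d = Device(dac=Cert("sensor-123", "AcmeMfgCA"))
d.commission("HomeFabricRootCA")
print(d.noc.issuer)  # after commissioning, the fabric vouches for the device
```

The point is that the manufacturer's certificate is only needed once, at enrollment; day-to-day trust is rooted in a CA the device owner's fabric controls.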

This is basically the best we can hope for until we get nanofabs at home and can build our own secure enclaves in our garages.

Trust decision theory goes like this: if it were possible for the manufacturer to fully control the device, then competitors would not use it, so e.g. wide industry adoption of OpenTitan would be evidence of its security in that respect. Finally, if devices had flaws that allowed them to be directly hacked or their keys stolen, then demonstrating it would be straightforward and egg on the face of the manufacturer who baked their certificate into the device.

Final subject: 802.1X and other port-level security is mostly unnecessary if you can use mTLS everywhere, which is what ubiquitous hardware roots of trust allow. Clearly it will take a while for the protocol side to catch up, but I hope that eventually we'll be running SPIFFE or something like it at home.

octagons | a day ago

This is the general premise behind Ken Thompson’s “Reflections on Trusting Trust” and I highly recommend you read it if this is something that interests you.

fc417fc802 | a day ago

> not going to be able to formally prove the chip conforms to some (verilog?) model

Sure you can. Get together as a group. Purchase a large lot of chips. Select several at random. Shave them down layer by layer, imaging them with an SEM. You now have an extremely high level of confidence that all the chips in the lot are good.

Physical security aside, I share your concerns about the abusive corporate behavior that widespread deployment of such hardware might enable.

> Knowing you are talking to the chipset which signed the certificate request you asserted to before shipping is useful.

Can't an fTPM with a sealed secret already provide that assurance? Or at least the assurance that you actually care about: that the software you believe to be running actually is. At least assuming we stop getting somewhat regular exploits against the major CPU vendors.
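A minimal sketch of the sealed-secret attestation idea, using a symmetric MAC from the standard library (a simplifying assumption: real TPM quotes sign with an asymmetric attestation key, and the measurement and provisioning steps are much richer than shown):

```python
import hashlib
import hmac
import os

def measure(firmware: bytes) -> bytes:
    # Boot-time measurement of the firmware image.
    return hashlib.sha256(firmware).digest()

def quote(sealed_key: bytes, nonce: bytes, measurement: bytes) -> bytes:
    # Toy "quote": MAC over the verifier's nonce plus the measurement.
    return hmac.new(sealed_key, nonce + measurement, hashlib.sha256).digest()

def verify(key: bytes, nonce: bytes, expected_fw: bytes, q: bytes) -> bool:
    # Verifier recomputes the quote for the firmware it expects.
    return hmac.compare_digest(q, quote(key, nonce, measure(expected_fw)))

key = os.urandom(32)    # provisioned at manufacture, sealed in the device
nonce = os.urandom(16)  # verifier's freshness challenge
q = quote(key, nonce, measure(b"firmware-v1"))
print(verify(key, nonce, b"firmware-v1", q))  # True: software is as expected
print(verify(key, nonce, b"tampered", q))     # False: measurement mismatch
```

The nonce is what makes the quote fresh: without it, a recorded quote from a good boot could be replayed by compromised software.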

nerdsniper | 19 hours ago

These things should be manufactured to be IRIS-compatible. IRIS is the "Infra-Red, In Situ" technique which lets you image the silicon of a chip through the packaging to verify that you don't have a counterfeit.

https://arxiv.org/pdf/2303.07406

Like, for example, the Boachip-1x MCU.

https://www.cnx-software.com/2026/03/04/dabao-board-features...

PunchyHamster | 18 hours ago

I'm more worried by the motivation for the whole secure chain. We will not own our devices, and the encryption keys will be stored in the vault of <ecosystem provider> like MS or Google, free to peruse by the government.

The entire push seems to be motivated by actors that want to deny users access to their own devices under thinly veiled promises of "security".

It's basically a company asking you to give them your house keys on nothing more than "trust me bro".

And it's completely the opposite of how it should be: it should be my device, on which I can then give the vendor app a limited sandbox that I can access fully, not the other way around.

aappleby | a day ago

Fiiiiinally! Yay!

Worked with the OT team at Google years ago and am glad to see this stuff finally taped out.

octagons | a day ago

Are there any generally available microcontrollers with this block inside?

“Open source” has a very different meaning when it comes to silicon.

aappleby | a day ago

OpenTitan _is_ a microcontroller, just one with a _lot_ of security hardware (and security proofs).

It's intended to be integrated into a larger SoC and used for things like secure boot, though you could certainly fab it with its own RAM and GPIO and use it standalone.
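A minimal sketch of what "used for things like secure boot" means: the ROM stage refuses to hand off to a next stage that doesn't match an authorized reference. Here the reference is a pinned hash for simplicity; a real root of trust verifies a signature against keys in OTP so images can be updated without changing the ROM:

```python
import hashlib

# Digest of the authorized next boot stage, fixed at manufacture.
# (In reality: a public key in OTP plus a signature check.)
PINNED_DIGEST = hashlib.sha256(b"stage1-image-v1").hexdigest()

def secure_boot(next_stage: bytes) -> bool:
    """Hand off to the next stage only if it matches the pinned digest."""
    return hashlib.sha256(next_stage).hexdigest() == PINNED_DIGEST

print(secure_boot(b"stage1-image-v1"))  # True: authorized image boots
print(secure_boot(b"evil-image"))       # False: execution is refused
```

Each stage repeats this check on the next one, which is how a small hardware anchor extends trust all the way up to application code.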

ottah | a day ago

Ah, I see. It's just another fucking TPM, which lets vendors approve or deny execution of signed binaries. So more infrastructure to attack general computing.

UltraSane | 19 hours ago

No, TPMs and HSMs are fundamentally nothing more than secure hardware dedicated to storing private keys in a way that makes accessing the plaintext incredibly hard. All of modern computer security is based on them.

pabs3 | 16 hours ago

... and usually deployed in a user-hostile manner.

UltraSane | 12 hours ago

Any evidence of this? Computer security was a complete disaster before hardware roots of trust became standard.

NewJazz | 6 hours ago

Both things can be true.

pabs3 | 23 hours ago

Whose keys does this thing trust by default?

PunchyHamster | 18 hours ago

I'd imagine whatever was loaded at the factory.

So Google, Samsung, take your pick. User ones? Nah, we can't trust the user.

pabs3 | 23 hours ago

> will support ... secure boot and attestation.

Not something I would want to touch.

karlkloss | 20 hours ago

I read until Google.