Is this the first sum-type-style option-choosing facility for C++ unions? I've been waiting for this feature since 1978.
Still, it's a wasted opportunity not to have a language-level extension of the `switch` statement that allows nice pattern matching. Even with std::is_within_lifetime, C++ unions are error-prone and hard to work with.
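For what it's worth, std::variant with std::visit is the closest the current standard library gets to pattern matching. A minimal sketch of the usual idiom (the `overloaded` helper is a well-known convention, not part of the standard library):

```cpp
// Approximating pattern matching over a sum type with std::variant and
// std::visit, the closest the library currently gets to a language-level
// `switch` over alternatives. Requires C++17.
#include <cassert>
#include <string>
#include <variant>

// Conventional "overloaded" helper: inherits the call operator of each lambda.
template <class... Ts> struct overloaded : Ts... { using Ts::operator()...; };
template <class... Ts> overloaded(Ts...) -> overloaded<Ts...>;

std::string describe(const std::variant<int, std::string>& v) {
    return std::visit(overloaded{
        [](int i)                { return "int: " + std::to_string(i); },
        [](const std::string& s) { return "string: " + s; },
    }, v);
}
```

The visitor is checked exhaustively at compile time (missing an alternative is a compile error), which is the main thing a language-level `match` would also give you, just with nicer syntax.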
The problem pointed out in the article seems a little silly. We're adding an entire language feature because someone wanted an optional bool class? Why not just create a uint8_t with three values: OPTIONAL_BOOL_FALSE, OPTIONAL_BOOL_TRUE, OPTIONAL_BOOL_UNDEFINED?
Doing so takes the same space as a bool, and could be wrapped in a class if desired to provide a nicer interface.
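A minimal sketch of that wrapper (names are illustrative, not from any standard or proposal); the point is that the three-state byte costs no more space than a plain bool:

```cpp
#include <cstdint>

// Three-state "optional bool" in a single byte, as the comment suggests.
class OptionalBool {
public:
    constexpr OptionalBool() : state_(kUndefined) {}
    constexpr OptionalBool(bool b) : state_(b ? kTrue : kFalse) {}

    constexpr bool has_value() const { return state_ != kUndefined; }
    constexpr bool value() const { return state_ == kTrue; }

private:
    static constexpr std::uint8_t kFalse = 0, kTrue = 1, kUndefined = 2;
    std::uint8_t state_;
};

// Same footprint as bool (both one byte on mainstream ABIs).
static_assert(sizeof(OptionalBool) == sizeof(bool), "no size overhead");
```

The trade-off the article is actually about is different, though: this wrapper can't hand out a `bool&` to its payload, which is what makes a true `Optional<T>` specialization for bool hard to write generically.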
The language is the problem, and WG21 hates fixing the language. The really bone-headed stuff like "UB? in my Lexer?" got through as a language change, and Barry has also had some success with "just fix it" language changes; that's what happened so that C++ can attempt what Rust's MaybeUninit<T> does. But mostly WG21 will take weird new intrinsics in the standard library, as here, over just fixing the programming language.
The whole problem only arises because accessing the union member as a character is allowed at runtime, but disallowed in constexpr. If that restriction were relaxed to be the same in both cases, the entire motivating problem would disappear...
As Barry explains, this transmutation is outlawed at compile time in C++. They could have removed the prohibition, but they did not.
Notice that e.g. Rust doesn't prohibit compile-time transmutation: the provided core::mem::transmute literally does this operation, and it's const. The result is, as with similar situations in C++, that if at compile time the byte is a 2, the compilation fails; 2 is not a boolean. You don't need all this bother because Rust's type system is better, so Option<bool> already has the same size as bool, but that's beside the point.
I would think it would have to work the same in both since otherwise C code using that behavior would not compile in C++, right?
I am not a C++ expert, but I'm surprised to hear that it is considered UB to access the other member since as far as I can tell a union is just a chunk of memory that can be interpreted differently depending on which union member you access?
So, if you had a union { bool b; char c; }, and you set b = false, then I would think that accessing c would predictably give you a value of '\0'.
Granted, you probably shouldn't go around accessing an inactive union member like that, but when people say it's UB they make it sound like it's impossible to guarantee what data will be inside that byte, and it seems to me like that isn't true.
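For completeness, there is a well-defined way to look at that byte without reading the inactive member: copy the object representation out with memcpy. A small sketch (it assumes false is represented as byte 0 and true as byte 1, which holds on mainstream ABIs but is not promised by the standard):

```cpp
#include <cstring>

// Examine the byte behind a bool without touching an inactive union member:
// memcpy of the object representation is always well-defined at runtime.
inline unsigned char byte_of(bool b) {
    unsigned char byte;
    std::memcpy(&byte, &b, sizeof b);
    return byte;
}
```

C++20's std::bit_cast does the same job and is even usable in constant evaluation, which the memcpy version is not; that gap between runtime and constexpr rules is exactly the tension the article is circling.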
Actually I can't think of a single realistic use case for this; despite the constant trumpeting of C++ folks on how constexpr will save C++, constant expressions are just too limited for this to matter.
Mature languages like C++ should stop adding more features to the language and standard library. Adding features to the language just makes it more complex, and adding to the standard library just adds more overhead, and maybe even security issues.
What no one wants to hear is rust is destined for the same fate. If you want to see the future of rust, look at C++.
Rust has a much better initial state, but the rules evolving the system (the governance model, the kinds of developers that work on it, etc.) are the same as C++ and so we should expect the same trajectory.
Unless you have a system that says "no" a lot, and occasionally removes features, programming languages decay, and the game has been (historically, before LLMs) to pick a language that would be in the sweet spot for the time that you need to use it, while keeping your eye out for something else to switch to once it becomes sufficiently unusable.
Is there ever a successful programming language that occasionally removes features? Like, not just a big, one-time backward-incompatible upgrade, but occasional feature removal?
Python removes features all the time in 3.x releases. For example, I was not a fan of the distutils removal in 3.12 which broke many legacy but otherwise functional packages. Deprecated functions and classes are also removed from packages regularly.
Java: since Java 9, "deprecated for removal" really means it.
.NET: the whole migration from .NET Framework to modern (Core) .NET left several parts behind. C# also changed the `foreach` loop capture semantics, and introduced the `field` keyword and `with` expressions.
Rust intentionally keeps its std library small and makes no promises about ABI; it seems to have resisted a lot of pressure to do the opposite from C++ fanatics. I don't agree that the C++ path is inevitable.
Even Go is encountering the same fate, albeit slower. It’s nearly impossible to remove a feature once it has seen adoption, especially without an alternative; whereas there are always patterns that are greatly simplified by some new feature, and when a language becomes large, these patterns become common enough that the absence of said feature becomes annoying.
Suspiciously, after Rob Pike retired from the project, the amount of language and standard library changes skyrocketed.
Many people are now trying to get their thing into the language so they can add it to their list of accomplishments.
Clear evidence that you need someone saying "no" often.
> What no one wants to hear is rust is destined for the same fate. If you want to see the future of rust, look at C++. Rust has a much better initial state, but the rules evolving the system (the governance model, the kinds of developers that work on it, etc.) are the same as C++ and so we should expect the same trajectory.
Dear lord that is not the case. The C++ standardization process is extremely different from Rust's specification process, and the resulting pathologies are extremely dissimilar. Hell, C is fairly close to C++ in terms of process, and yet it still has its own set of pathologies.
The C++ committee is dominated not by experts on compiler implementation, but by people looking to get their own proposals incorporated into the standard, and is structurally organized in such a way that it can be difficult for any group to feel empowered to actually reject a feature. It should be noted that in the most recent batch of C++ papers, there was effectively an implementers' revolt: https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2026/p39....
The Rust proposal process is much more ponderous, and when you take into account the lag between an accepted RFC and implementation and stabilization (and the fact that some accepted RFCs turn out to be unworkable and effectively get backed out without ever being stabilized), it's pretty clear that the actual development process is night-and-day different. For example, the Try trait in Rust still has yet to be stabilized, despite the RFC proposing it being introduced over nine years ago and a v2 RFC being accepted five years ago.
This kind of "but for us it's different" thinking is a little amusing.
I don't care about the implementation process or the RFCs or what-have-you.
If there is a democratic committee of humans that decides what goes in, and there is no bias for minimalism (e.g. 1/3 could strike down a proposal instead of 1/2) then the process will tend towards bloat.
The Rust RFC process requires essentially unanimous consent: there's no formal voting procedure, but the various teams can block any feature from going in.
But sure, keep on saying they're basically the same thing.
You can just set -std=c++03 and program like the language never evolved if that's your personal preference.
Other than that, there's always an interesting psychology at play in software engineering but it really seems to come out when people talk about C++ for some reason. Complexity is just needless bloat when it's a feature you aren't using, and it's an essential part of the language when it's a feature you are.
This particular feature impacts only compiler writers, unless you choose to use it. You can write C++ till the end of time and never use std::is_within_lifetime and it will have zero impact on you or your code. If it gets used in stdlib, there's presumably a reason, and if there isn't, then that's worth criticizing. But adding the feature has no impact on 99% or more of all developers.
you really need to hate yourself to still pay attention to such horrible stuff in 2026.
41 years after its invention, C++ still doesn't have networking support in its stdlib. Excuse after excuse; they have a million justifications for why the stdlib doesn't need networking. But at the same time, some bureaucratic "committee members" struggling with their midlife crisis want you to waste your life on stuff like std::is_within_lifetime in the era of AI.
What a bloody joke!
Can't wait to see some highly accurate coding agents start being able to port C++ code to Rust with minimal human intervention, to liberate people from the most bureaucratic nonsense in CS history. Some AI-native language incorporating concepts that were too complicated for humans would be even better.
It has never been a better time to deprecate dinosaurs like C++!
Oh here we go again, someone demanding networking (of all things) in the standard library. Are you next going to demand a GUI toolkit too? Maybe an entire game engine and Vulkan/WebGPU implementation too while we're at it? Just because other languages do it does not mean it is a wise idea for C++ to follow suit. I mean, do I really need to point you to std::regex as an example of what happens when we try to add extraneous, hard-to-define problems to the STL? Do you really want to add something way more complicated than a regular expression engine to C++ (networking), with all that entails? Because I certainly don't.
I'm not a C++ programmer, and so in a sense I don't care whether they get networking, but:
1: Some of networking is vocabulary and so it obviously should live in your stdlib, and indeed for C++ it should be in what they call "freestanding", like Rust's core, where core::net::Ipv4Addr lives. It is very silly if the software in this cheap embedded device and this PC web browser can't even agree on what an IP address is in 2026.
2: In practice the C++ stdlib is used as a dumping ground for stuff that ought to live in a package manager if C++ was a good language. That was true when networking was first proposed and it's still true now. It's why RCU and Hive are both in C++ 26. Those aren't vocabulary, and they aren't needed by the vast majority of programmers, but their proponents wanted them to be available out of the box and in C++ that means they must live in the stdlib.
> someone demanding networking (of all things) in the standard library
Networking is de facto in the standard library, because the C++ standard library is almost always supplemented by whatever C functionality is lying around, and POSIX networking exists.
That they haven't felt the need to provide a more useful abstraction on top of the POSIX layer (and hey, maybe abstract over Microsoft's variant in the process) in the past three decades does seem like a miss.
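The layer in question is the raw POSIX API. A minimal sketch of what C++ programmers actually write today (this is POSIX, not ISO C++; it compiles on Linux/macOS, and Windows needs the Winsock variant):

```cpp
// The de-facto networking layer: plain POSIX C calls, no C++ abstraction.
// This only creates and configures a socket; no traffic is sent.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

int make_tcp_socket() {
    int fd = socket(AF_INET, SOCK_STREAM, 0);  // POSIX, not ISO C++
    if (fd < 0) return -1;

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(8080);                       // illustrative port
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);   // error handling omitted
    // A real client would now call connect(fd, ...); omitted here.
    return fd;
}
```

Everything here is untyped ints and manually byte-swapped fields, which is precisely the kind of thing a standard abstraction would clean up.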
This continues the trend that the C++ language spec is too large for any person to understand, full of opaquely named things for obscure use cases. Maybe when most code is written by LLMs this kind of extension will be appreciated? Because the LLM can manage to get its large head around all of these obscure functionalities and apply them in the appropriate situations?
Since the birth of ChatGPT, people have been talking about whether one day LLMs will be trained to write bytecode or even machine code directly, making future code incomprehensible to humans.
C source code => Traditional UNIX C compiler => ASM => object file
Now everyone is doing
AI tooling => C source code => Traditional UNIX C compiler => ASM => object file
For all practical purposes, just like using a language like Nim, the workflow exposed to the user can hide the middle steps.
Then there is the other take: if you start using agents that can be configured to do tool calling, it is hardly any different from low-code applications doing REST/GraphQL/gRPC calls orchestrated via flow charts, which is exactly what iPaaS tooling offers nowadays, like Workato, Boomi, ...
This is silly; we already have an algorithm for generating very efficient assembly/machine code from source code. It's like saying maybe one day LLMs will be able to replace sin() or an OS kernel (I vaguely remember someone prominent claiming this absurdity). Yes, maybe they could, but it would be super slow and inefficient. We already know a very (most?) efficient algorithm, so what are we doing?
The more I interact with consteval and the whole template metaprogramming and codegen paradigm, the more I think it's completely inappropriate to shovel into stdlib. I don't think this should even be part of the language itself, but something more like a linter on top of the C++ language.
For most of us it seems you can get good at C++ or metaprogramming. But unless you want to make it your entire career you can't really do both with the same degree of effectiveness.
I really like C++, and I will probably continue using it forever. But really only the very small subset of the language that applies to my chosen field. I'm a "C with classes" kind of guy and templates and constexpr are pretty rare. Hell, half the time I don't even have stdlib on embedded platforms. It's kind of nice, actually.
We find constexpr (and associated templates) essential for when we need to avoid branch penalties. It makes the code so much simpler and cleaner than the alternative. I'm glad the language caters to the needs of everyone, even if any individual person (self included) only uses a little bit of it.
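The pattern presumably being described: hoist the condition into a template parameter so each instantiation is compiled with the branch already resolved. A minimal sketch (function and parameter names are made up for illustration):

```cpp
#include <cstddef>

// Hoisting a condition into a template parameter: each instantiation is
// compiled with the branch already resolved, so the hot loop contains no
// runtime test, at the cost of two copies of the code. Requires C++17.
template <bool ClampNegative>
long long sum(const int* data, std::size_t n) {
    long long total = 0;
    for (std::size_t i = 0; i < n; ++i) {
        if constexpr (ClampNegative) {
            total += data[i] < 0 ? 0 : data[i];  // only in the <true> copy
        } else {
            total += data[i];                    // only in the <false> copy
        }
    }
    return total;
}
```

Callers pick `sum<true>` or `sum<false>` once, outside the loop; whether the duplicated instantiations pay for their i-cache footprint is exactly the question debated below.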
The cardinal question: is the benefit of removing that branch worth the increase in i-cache footprint? I think it depends quite a bit... but also, the speed increases IME from doing this kind of thing can result not merely from the branch removal, but from the code duplication itself. Even if the contents of the branch don't directly mention the condition, duplication allows the surrounding code to be separated in the branch predictor, and it's quite common that these conditions will correlate with different branch frequencies in the surrounding code.
I work on a codebase where I am slowly detangling all of the tens of thousands of lines of `if constexpr` templates that were written by a guy who doesn't know how a modern CPU works. It's a bad meme with a very narrow field of beneficial applications. People who think a mispredicted branch is costly are never gonna believe the cost of a page table walk caused by an iTLB miss.
Narrow indeed. If the function is small enough for the i-cache pressure to not matter, it's probably going to get inlined and the condition gets optimized out anyway. If it's big enough, then it's unclear, but microbenchmarks will give you misleading results.
> is the benefit of removing that branch worth the increase in i-cache footprint?
Like everything else in the world, in general, it depends.
That said, among the limited number of times I've tried this, I don't recall a single case where I felt it would be worth it and it turned out to be detrimental.
1. this is being used in a method that is widely used in both the <true> and <false> contexts, which I believe means that branch prediction would not be great if it was simply the same instruction sequence in both contexts. I could be wrong about that.
2. the major benefit that I saw was not duplicating the much more substantial code in the "..." sections (before and after the constexpr) and thus making maintenance and continued evolution of the code more reliable.
I am actually glad that more and more of the metaprogramming techniques are built into the language itself, because people are going to try metaprogramming anyways, and their attempts at it are generally less readable without proper compiler support.
Anecdotally, I remember having to review a library written in C++98. It actually worked as promised, but it also did a lot of extremely clever things that were sort of unnecessary if we had just waited for C++11 with type_traits. We got rid of that library later after rewriting all the downstream dependencies.
> It is — and it totally makes sense.

Average C++ enjoyer coping mechanism for the C++ trend of naming things in the most convoluted (and often wrong) way possible: std::vector, std::monostate, std::unit, etc.
sebtron | 7 hours ago
https://en.cppreference.com/w/cpp/utility/variant.html
ephou7 | 7 hours ago
I'm (fairly) sure there's a good reason for that language feature, but the justification the blog article gives is super weak.
DaiPlusPlus | 5 hours ago
(for those wanting context, it's from this post from 2005: https://thedailywtf.com/articles/Classic-WTF-What-Is-Truth )
cataphract | 6 hours ago
What surprised me was that their union code was valid. I thought accessing a union member that was not active was valid in C, but not in C++.
lights0123 | 6 hours ago
They do publish removal plans years in advance, e.g. see Python 3.17's plans: https://docs.python.org/3/deprecations/pending-removal-in-3....
jasode | 6 hours ago
https://en.wikipedia.org/wiki/C%2B%2B11#Features_removed_or_...
https://en.wikipedia.org/wiki/C%2B%2B17#Removed_features
https://en.wikipedia.org/wiki/C%2B%2B20#Removed_and_deprecat...
https://en.wikipedia.org/wiki/C%2B%2B23#Removed_features_and...
raincole | 7 hours ago
It'd be funny if it ends up being just C++35.
cbm-vic-20 | 6 hours ago
https://shakespearelang.com/