From the article:

Josh Aas, co-founder and executive director of the Internet Security Research Group (ISRG), which oversees a memory safety initiative called Prossimo, last year told The Register that while it’s theoretically possible to write memory-safe C++, that’s not happening in real-world scenarios because C++ was not designed from the ground up for memory safety.
That baseless claim doesn’t pass the smell test. Just because a feature wasn’t rolled out in the mid-90s, does that mean it isn’t available today? Utter nonsense.

If your paycheck is highly dependent on pushing a specific tool, of course you have a vested interest in diving head-first into a denial pool.

But cargo cult mentality is here to stay.
I think your take is a bit extreme.

Currently their statement (regardless of the questionable justification) is largely correct: no major C++ projects have been written in a safe subset, and no real work on that has started yet. It isn’t practical.

I do agree with you that a safe form of C++, once fully implemented and not frustrating to use, could easily become viable; the feature can be added. But that’s still years away from practical use in large projects, and even when it’s done, many projects will stick to the older forms, making the transition slow and frustrating.

The practical result is that he’s sort of right, if you just add the word “currently” to his statement.

On the other hand, I do agree with you that Rust cannot be the sole answer to this problem either; it’s almost as impractical to rewrite codebases in Rust as in an as-yet-unfinished safe form of C++. Only time and a lot of effort can fix this problem.
The only (arguably*) baseless claim in that quote is this part:

Maybe try to write more humbly and less fanatically, since you don’t seem to be that knowledgeable about anything (experienced in other threads too).

* It’s “theoretically possible” to write memory-safe assembly if we bend contextual meanings enough.
You do understand you’re making that claim on the post discussing the proposal of Safe C++?

And to underline the absurdity of your claim, would you argue that it’s impossible to write a memory-safe “hello, world” program in C++? From that point onward, what would it take to make it violate any memory constraints? Are those things avoidable? Think about it for a second before saying nonsense about impossibilities.
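For what it’s worth, that line of argument can be made concrete with a short, hypothetical sketch (mine, not from the thread): a trivial C++ program with no memory-safety issue at all, the kind of change it would take to introduce one, and the bounds-checked alternative that avoids it.

```cpp
#include <cstdio>
#include <vector>

int main() {
    std::puts("hello, world");      // nothing here can violate memory safety

    // The kind of change it would take to break that:
    int raw[3] = {1, 2, 3};
    // raw[3] = 4;                  // one added line: out-of-bounds write, undefined behavior

    // And the avoidable part: the bounds-checked alternative already exists.
    std::vector<int> v = {1, 2, 3};
    // v.at(3) = 4;                 // throws std::out_of_range instead of corrupting memory
}
```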
A fictional safe C++ that would inevitably break backwards compatibility might as well be called Noel++, because it’s not the same language anymore.

If that proposal ever gets implemented (it won’t), neither will the promise of guaranteed memory safety hold up, nor will any big C++ project adopt it. Big projects don’t even adopt the (rollingly defined) so-called modern C++, and that is something that is part of the language proper, standardized, and available via multiple implementations.
would you argue that it’s impossible to write a memory-safe “hello, world” program in C++
bent as expected
This proposal is just a part of a damage control campaign. No (supposedly doable) implementation will ever see the light of day. Ping me when this is proven wrong.
Just because a feature wasn’t rolled out in the mid-90s, does that mean it isn’t available today?
Adding a feature is one thing; C++ has added a lot of memory safety features over the years. The problem with C++ is it still allows a lot of unsafe ways of working with memory that previous projects used and people still use now. Removing support for these features will break existing code and piss a lot of people off in the process. It is not adding new features but removing the existing unsafe features that they are talking about here.
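To illustrate what that means in practice (my sketch, not the commenter’s): the legacy idioms and the newer, safer tools compile side by side in the same translation unit, and nothing in the language forces a codebase off the former.

```cpp
#include <cstring>
#include <string>

// Legacy style: still perfectly legal C++ today, and still easy to get wrong.
char* copy_old(const char* src) {
    char* buf = new char[8];
    std::strcpy(buf, src);    // silently overflows buf if src holds more than 7 characters
    return buf;               // the caller must remember to delete[] it, or it leaks
}

// Modern style: the safer tool exists, but using it is entirely optional.
std::string copy_new(const char* src) {
    return std::string(src);  // bounds and lifetime handled by the library
}
```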
The problem with C++ is it still allows a lot of unsafe ways of working with memory that previous projects used and people still use now.
Why do you think this is a problem? We have a tool that gives everyone the freedom to manage resources in whatever way suits their own needs. It even went as far as explicitly supporting garbage collectors right up to C++23. Some frameworks adopted and enforced their own memory management systems, such as Qt.
Tell me, exactly why do you think this is a problem?
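Qt’s model, for example, sketched from memory and simplified here (it needs Qt to build, and the widget names are just for illustration): objects created with a parent are owned and eventually deleted by that parent, so the framework, not the individual programmer, decides when they die.

```cpp
#include <QApplication>
#include <QLabel>
#include <QWidget>

int main(int argc, char** argv) {
    QApplication app(argc, argv);

    QWidget window;                              // stack-owned root widget
    auto* label = new QLabel("hello", &window);  // heap-allocated, but owned by `window`
    (void)label;                                 // note: no delete anywhere below

    window.show();
    return app.exec();
    // When `window` is destroyed, Qt's parent-child mechanism deletes `label` for us.
}
```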
All the use after free and buffer overflow bugs that plague our key infrastructure.

It’s not just that. Debugging segfaults and UB can be an absolute nightmare.
The C++ committee still haven’t learnt their lesson. I recently learnt about C++20 coroutines, which are pretty neat, if complex (there are pretty much no good learning resources about them). However, they are still putting unnecessary UB footguns into them.
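One example of the kind of footgun meant here (my illustration, using C++23’s std::generator rather than a hand-rolled coroutine type): reference parameters are stored in the coroutine frame as references, and nothing stops the referenced object from dying before the lazily started body ever runs.

```cpp
#include <generator>   // C++23
#include <print>       // C++23
#include <vector>

// std::generator starts suspended, so none of this body runs until iteration begins.
std::generator<int> values(const std::vector<int>& v) {
    for (int x : v)
        co_yield x;
}

int main() {
    // The temporary vector dies at the end of this statement,
    // while the coroutine has not executed a single line yet.
    auto gen = values(std::vector<int>{1, 2, 3});

    for (int x : gen)          // resuming the coroutine reads through a dangling reference: UB
        std::print("{} ", x);  // may appear to work, crash, or print garbage
}
```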
Maybe a “pragma strict” where every deprecated feature is an error and not a warning?

Reminds me of the safety measures I saw in China some years back: basically, signs saying “plz don’t fall to your death, if you do it’s your fault”.
If you could reliably write memory-safe code in C++, why do devs put memory safety issues into their code bases then?

Even highly paid (and probably skilled) devs in the IT industry manage to mess that up pretty regularly. And even if they could: devs using memory-safe languages make far fewer mistakes when managing memory… so that tooling does seem to help them, at least more than the C++ tooling helps the C++ devs.
If you could reliably write memory-safe code in C++, why do devs put memory safety issues into their code bases then?
That’s a question you can ask the guys promoting the adoption of languages marketed on memory safety arguments. I mean, even Rust has its fair share of CVEs whose root cause is unsafe memory management.