

The teeth regrowth thing was only two years ago.


Whatever percentage of people “fail” this test, it is much higher than the 0% of people who would do so using Rust’s compiler.
Of course, programs that don’t pass the borrow checker can be totally memory safe, but that would need to be analyzed on a case-by-case basis.
Programs that do pass the borrow checker aren’t guaranteed to be totally memory safe, so the number isn’t actually 0% for Rust either: https://github.com/Speykious/cve-rs


Rust forces you to do this until you have to use unsafe, after which it doesn’t. That is not so different from C++ guaranteeing your safety until you start using raw pointers.
It is not the compiler’s job to stop the programmer from shooting themselves in the foot if they want to. It’s the compiler’s job to make it clear to the programmer when they disable the safety, put their finger on the trigger, and aim the gun at their foot. Modern C++ does this, and if you still inadvertently shoot yourself in the foot in spite of the warnings, you brought it on yourself.
Regular old C, on the other hand, gives you a 9mm when you’re in grade 7, safety: always off.
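A minimal sketch of that distinction, using nothing but standard library facilities: bounds-checked access is the default spelling in modern C++, and the unchecked version is visibly different at the call site.

    #include <array>
    #include <iostream>
    #include <stdexcept>

    int main() {
        std::array<int, 3> a{1, 2, 3};

        // Safety on: .at() bounds-checks and throws std::out_of_range
        // instead of silently corrupting memory.
        try {
            std::cout << a.at(5) << '\n';
        } catch (const std::out_of_range& e) {
            std::cout << "caught: " << e.what() << '\n';
        }

        // Safety off, and visibly so: a raw pointer and an unchecked
        // subscript. Finger on the trigger.
        int* p = a.data();
        // p[5] = 42;  // out-of-bounds write: undefined behavior
        (void)p;
    }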


If you follow modern C++ best practices, memory unsafety will not happen by accident. The dodgy stuff in modern, idiomatic C++ is immediately obvious.
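For example (a hedged sketch; the function names are invented for illustration): the idiomatic version manages ownership automatically, while the risky version announces itself at a glance.

    #include <memory>
    #include <string>

    // Idiomatic modern C++: ownership lives in unique_ptr, cleanup is
    // automatic, and there is no delete to forget.
    std::unique_ptr<std::string> make_name() {
        return std::make_unique<std::string>("example");
    }

    // The dodgy equivalent is immediately obvious: a naked new, a raw
    // owning pointer, and a delete the caller must remember.
    std::string* make_name_risky() {
        return new std::string("example");
    }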


It’s not a joke. What was described above is pretty much C++'s RAII pattern, which Rust evangelists love to present as a revolutionary Rust invention. Used with smart pointers, it will help avoid use-after-frees. What it doesn’t avoid is null pointer dereferences (you can std::move a unique_ptr and still access it, it’ll just be nullptr), but those will typically “just” be a crash rather than a gaping security hole.
That is not to say Rust doesn’t have its own innovations on top of that (notably that the compiler stringently enforces this pattern), and C++ does give you many more ways to break the rules and shoot yourself in the foot than Rust does.
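A minimal sketch of both halves of that claim (the Resource type is made up for illustration): RAII frees the object automatically, and a moved-from unique_ptr still compiles but holds nullptr.

    #include <iostream>
    #include <memory>

    struct Resource {
        ~Resource() { std::cout << "released\n"; }  // RAII: runs automatically
        void use() { std::cout << "using\n"; }
    };

    void consume(std::unique_ptr<Resource> r) {
        r->use();
    }   // Resource is freed here; no manual delete, no use-after-free

    int main() {
        auto r = std::make_unique<Resource>();
        consume(std::move(r));
        // `r` is still in scope and this compiles, but it now holds nullptr:
        std::cout << (r ? "still valid" : "null after move") << '\n';
        // r->use();  // would dereference nullptr: a crash, not a silent
        //            // use-after-free
    }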


They will probably have a version newer than 5.15.149.


It looks to be on 6.1.153 currently, which is much newer than 6.1.76.


With the uname -a command (uname -r prints just the kernel release).
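For example (hypothetical output; the exact string varies by distribution and build):

    $ uname -r
    6.1.153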


This only affects positively ancient kernels:
From (including) 3.15, up to (excluding) 5.15.149
From (including) 6.1, up to (excluding) 6.1.76
From (including) 6.2, up to (excluding) 6.6.15
From (including) 6.7, up to (excluding) 6.7.3


C++ would also solve this for the same reason!!


I mean, the number of logical qubits has gone from basically zero not too long ago to what it is now. The whole error-correction thing has really only taken off in the past ~5 years. That Microsoft computer you mentioned, which got 4 logical qubits out of 30 physical qubits, represents a roughly 3-fold improvement in encoding efficiency over the apparently previous best of 12 logical qubits from 288 physical ones (published earlier the same year), which itself was undoubtedly a big improvement over whatever came before.
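The arithmetic behind that factor, using the numbers above:

    288 physical / 12 logical = 24  physical qubits per logical qubit
     30 physical /  4 logical = 7.5 physical qubits per logical qubit
     24 / 7.5 = 3.2, i.e. roughly a 3-fold gain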
And then the question is FOR WHAT? Dead people can’t make use of quantum computers, and dead people are what we will be if we don’t figure out solutions to some much more imminent, catastrophic problems in the next 10 years.
Strange thing to say. There are enough people on the planet to work on more than one problem at a time. Useful quantum computing will probably help solve many problems in the future too.


Even if it’s 8 physical qubits to 1 logical qubit, 6100 qubits would get you 762 logical qubits.
All I’m saying is that the technology seems to be on a trajectory of the number of qubits improving by an order of magnitude every few years, and as such it’s plausible that in another 5-10 years it could have the necessary thousands of logical qubits to start doing useful computations. A mere 5 years ago, the most physical qubits in a quantum computer was still measured in the tens rather than the hundreds, and 10 years ago I’m pretty sure they hadn’t even broken ten.


We can only hope that Bitcoin gets pwned by quantum computers. It would be absolutely glorious.


There was a paper recently about a stable 6100-qubit system, so the trajectory is plausible. If 1399 qubits are needed for 2048-bit Shor’s, this would already meet that by a wide margin, though obviously this is a research system that AFAIK cannot do actual computations.
Thank you Mr ChatGPT