I don’t think that casting a range of bits as some other arbitrary type “is a bug nobody sees coming”.
C++ compilers also warn you that this is likely an issue and will fail to compile if configured to do so. But it will let you do it if you really want to.
That’s why I love C++
Why use a strongly typed language at all, then?
Sounds unnecessarily restrictive, right? Just cast whatever as whatever and let future devs sort it out.
```php
$myConstant = '15';
$myOtherConstant = getDateTime();
$buggyShit = $myConstant + $myOtherConstant;
```

Fuck everyone who comes after me for the next 20 years.
The problem is that it’s undefined behavior. Quake’s fast inverse square root only works because the types just happen to look that way, because the floats just happen to have that bit arrangement. It could look very different on other machines! Never mind that it’s essentially always exactly the same on most architectures. So yeah. Undefined behavior is there to keep your code usable even if our assumptions about types and memory change completely one day.
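To make that concrete, here is a minimal sketch of the difference between the pun that’s formally UB and the spellings the standard does bless (`std::bit_cast` is C++20):

```cpp
#include <bit>      // std::bit_cast (C++20)
#include <cstdint>
#include <cstring>  // std::memcpy

uint32_t pun_ub(float f) {
    // "Casting a range of bits as some other arbitrary type":
    // compiles, usually works, but violates strict aliasing (UB).
    return *reinterpret_cast<uint32_t*>(&f);
}

uint32_t pun_defined(float f) {
    // Same bits, but well-defined: the compiler copies the
    // object representation instead of aliasing it.
    return std::bit_cast<uint32_t>(f);
}

uint32_t pun_defined_pre20(float f) {
    // The classic pre-C++20 spelling; optimizers turn this into a no-op.
    uint32_t u;
    std::memcpy(&u, &f, sizeof u);
    return u;
}
```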
Aand what is wrong with that?
“C++ compilers also warn you…”
Ok, quick question here for people who work in C++ with other people (not personal projects). How many warnings does the code produce when it’s compiled?
I’ve written a little bit of C++ decades ago, and since then I’ve worked alongside devs who worked on C++ projects. I’ve never seen a codebase that didn’t produce hundreds if not thousands of lines of warnings when compiling.
My team uses the -Werror flag, so our code won’t compile if there are any warnings at all.
Production code should never have any warnings left. This is a simple rule that will save a lot of headaches.
Neither should your development code, except for the part you’re working on.
I put -Werror at the end of my makefile cflags so it actually treats warnings as errors now.
I mostly see warnings when compiling source code of other projects. If you get a warning as a dev, it’s your responsibility to deal with it. But also your risk, if you don’t. I made it a habit to fix every warning in my own projects. For prototyping I might ignore them temporarily. Some types of warnings are unavoidable sometimes.
If you want to make yourself not ignore warnings, you can compile with `-Werror` if using GCC/G++ to make the compiler a pedantic asshole that doesn’t compile until you fix every fucking warning. Not advisable for drafting code, but definitely if you want to ship it.

Except when you have to cast size_t to int and vice versa (for “small” numbers). I hate that warning.
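A hedged sketch of that size_t/int dance (function name made up): under `g++ -Wall -Wextra -Wconversion -Werror` the implicit narrowing kills the build, so you end up writing the cast by hand.

```cpp
#include <vector>

int count_items(const std::vector<int>& v) {
    // int n = v.size();                // -Wconversion: size_t -> int may lose data
    int n = static_cast<int>(v.size()); // explicit narrowing, fine for "small" numbers
    return n;
}
```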
0 in our case, but we are pretty strict. Same at the first place I worked too. Big tech companies.
You shouldn’t have any warnings. They can be totally benign, but when you get used to seeing warnings, you will not see the one that does matter.
I know, that’s why it bothered me that it seemed to be “policy” to just ignore them.
Ideally? Zero. I’m sure some teams require “warnings as errors” as a compiler setting for all work to pass muster.
In reality, there’s going to be odd corner-cases where some non-type-safe stuff is needed, which will make your compiler unhappy. I’ve seen this a bunch in 3rd party library headers, sadly. So it ultimately doesn’t matter how good my code is.
There’s also a shedload of legacy things going on a lot of the time, like having to just let all warnings through because of the handful of places that will never be warning-free. IMO it’s a way better practice to turn a warning off for a specific line, as sketched below. Sad thing is, it’s newer than C++ itself and implementation-dependent, so it probably doesn’t get used as much.
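A sketch of that per-line suppression in the GCC/Clang dialect (MSVC spells it `#pragma warning(push/disable/pop)` instead); the deprecated function is a made-up stand-in:

```cpp
[[deprecated("use the new thing instead")]] void legacy_call();
void legacy_call() {}

int main() {
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wdeprecated-declarations"
    legacy_call();  // would trip -Wdeprecated-declarations (and -Werror) otherwise
#pragma GCC diagnostic pop
}
```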
I’ve seen this a bunch in 3rd party library headers, sadly. So it ultimately doesn’t matter how good my code is.
Yeah, I’ve seen that too. The problem is that once the library starts spitting out warnings it’s hard to spot your own warnings.
Yuuup. Makes me wonder if there’s a viable “diaper pattern” for this kind of thing. I’m sure someone has solved that, just not with the usual old-school packaging tools (e.g. automake).
deleted by creator
Depends on the age of the codebase, the age of the compiler and the culture of the team.
I’ve arrived into a team with 1000+ warnings, no const correctness (code had been ported from a C codebase) and nothing but C style casts. Within 6 months, we had it all cleaned up but my least favourite memory from that time was “I’ll just make this const correct; ah, right, and then this; and now I have to do this” etc etc. A right pain.
So, did you get it down to 0 warnings and manage to keep it there? Or did it eventually start creeping up again?
I’m not the person you’re asking but surely they just told the compiler to treat warnings as errors after that. No warnings can creep in then!
Ignoring warnings is really not a good way to deal with them. If a compiler is bitching about something, there is a reason for it.
A lot of times the devs are too overworked or a little underloaded in the supply of fucks to give, so they ignore them.
In some really high quality codebases, they turn on “treat warnings as errors” to ensure better code.
I know that should be the philosophy, but is it? In my experience it seems to be normal to ignore warnings.
I work on one of the larger C++ projects out there (in the 20 to 50 million lines range) and though I don’t see the full build logs, I’ve yet to see a component that has a warning.
I’m all for having the ability to do these shenanigans in principle, but prefer if they are guarded in an `unsafe` block.

C lets you shoot yourself in the foot.
C++ lets you reuse the bullet.
C is dangerous like your uncle who drinks and smokes. Y’wanna make a weedwhacker-powered skateboard? Bitchin’! Nail that fucker on there good, she’ll be right. Get a bunch of C folks together and they’ll avoid all the stupid easy ways to kill somebody, in service to building something properly dangerous. They’ll raise the stakes from “accident” to “disaster.” Whether or not it works, it’s gonna blow people away.
C++ is dangerous like a quiet librarian who knows exactly which forbidden tomes you’re looking for. He and his… associates… will gladly share all the dark magic you know how to ask about. They’ll assure you that the power cosmic would never, without sufficient warning, pull someone inside-out. They don’t question why a loving god would allow the powers you crave. They will show you which runes to carve, and then, they will hand you the knife.
There are no medals waiting for you by writing overly clever code. Trust me, I’ve tried. There’s no pride. Only pain.
Debugging code is always harder than writing it in the first place. If you make it as clever as you can, you won’t be clever enough to debug it.
Some junior will call it “skill issues” and then write the most unreadable code ever.
It really depends on your field. I’m doing my master’s thesis in HPC, and there, clever programming is really worth it.
Well as long you know what you’re doing and weigh the risks with the benefits you’re probably ok.
In my experience in the industry, there’s little benefit in pretending you’re John Carmack writing fast inverse square root. Understanding what you wrote 6 months ago outweighs most else.
Clever as in elegantly and readable or clever as in a hack that abuses a bug/feature and you need to understand the intricacies to understand half of it?
Honestly, also the latter. If you are using hundreds of thousands of cores for over 100h, every single second counts.
Not only that, but everyone who sees that code later is going to waste so much time trying to understand it. That includes future you.
That what comments and documentation are for.
A yes, comments.
```cpp
int flubTheWozat(void *) {
  for (int i=0; i<4; i++) {
    lfens += thzn[i] % ugy; // take mod of thnz[i] with ugy and add to lefens.
  }
  return (lfens % thzn[0]) == 4; // return if it's 4ish
}
```
Haha, meaningful, informative comments that make it easier to understand the code of course. ;)
But I must o p t i m i z e! ó_ò
Yes, let’s spend two hours on figuring out optimal values for preallocating a vector for your specific use-case. It’s worth the couple of microseconds saved! Every little bit helps.
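For the record, the two-hour micro-optimization being mocked is usually just this (a minimal sketch):

```cpp
#include <cstddef>
#include <vector>

std::vector<int> squares(std::size_t n) {
    std::vector<int> v;
    v.reserve(n);  // one upfront allocation instead of O(log n) regrowths
    for (std::size_t i = 0; i < n; ++i)
        v.push_back(static_cast<int>(i * i));
    return v;
}
```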
As it should be. Airbags should go off when you crash, not when you drive near the edge of a cliff.
What do you mean I’m not supposed to subtract a float casted as a long, bitshifted right by 1, from 0x5f3759df?
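For reference, the snippet being invoked, near-verbatim from the widely circulated Quake III Arena source (comments included):

```cpp
float Q_rsqrt(float number) {
    long i;
    float x2, y;
    const float threehalfs = 1.5F;

    x2 = number * 0.5F;
    y  = number;
    i  = *(long *) &y;                    // evil floating point bit level hacking
    i  = 0x5f3759df - (i >> 1);           // what the fuck?
    y  = *(float *) &i;
    y  = y * (threehalfs - (x2 * y * y)); // 1st iteration
    return y;
}
```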
//what the fuck?
They know. It’s a comment from the code.
But it will let you do it if you really want to.
Now, I’ve seen this a couple of times in this post. The idea that the compiler will let you do anything is so bizarre to me. It’s not a matter of being allowed by the software to do anything. The software will do what you goddamn tell it to do, or it gets replaced.
WE’RE the humans, we’re not asking some silicon diodes for permission. What the actual fuck?!? We created the fucking thing to do our bidding, and now we’re all oh pwueez mr computer sir, may I have another ADC EAX, R13? FUCK THAT! Either the computer performs like the tool it is, or it goes the way of broken hammers and lawnmowers!
New copypasta just dropped
Yeah, but there’s some things computers are genuinely better at than humans, which is why we code in the first place. I totally agree that you shouldn’t be completely controlled by your machine, but strong nudging saves a lot of trouble.
I understand the idea. But many people have hugely mistaken beliefs about what the C[++] languages are and how they work. When you write ADC EAX, R13 in assembly, that’s it. But C is not a “portable assembler”! It has its own complicated logic. You might think that by writing ++i, you are writing just some INC [i] or whatnot. You are not. To make a silly example, by writing `int i=INT_MAX; ++i;` you are not telling the compiler to produce INT_MIN. You are just telling it complete nonsense. And it would be better if the compiler “prevented” you from doing it, forcing you to explain yourself better.

I get what you’re saying. I guess what I’m yelling at the clouds about is the common discourse more than anything else.
If a screw has a slotted head, and your screwdriver is a torx, few people would say that the screwdriver won’t allow them to do something.
Computers are just tools, and we’re the ones who created them. We shouldn’t be submissive, we should acknowledge that we have taken the wrong approach at solving something and do it a different way. Just like I would bitch about never having the correct screwdriver handy, and then go look for the right one.
Yup, I am with you on this one
Soldiers are supposed to question potentially-illegal orders and refuse to execute them if their commanding officer can’t give a good reason why they’re justified. Being in charge doesn’t mean you’re infallible, and there are plenty of mistakes programmers make that the compiler can detect.
I get the analogy, but I don’t think that it’s valid. Soldiers are, much to the chagrin of their commanders, sentient beings, and should question potentially illegal orders.
Where the analogy doesn’t hold is, besides my computer not being sentient, what I’m prevented from doing isn’t against the law of man.
I’m not claiming to be infallible. After all to err is human, and I’m indeed very human. But throw me a warning when I do something that goes against best practices, that’s fine. Whether I deal with it is something for me to decide. But stopping me from doing what I’m trying to do, because it’s potentially problematic? GTFO with that kinda BS.
Ok gramps now take your meds and off you go to the retirement home
Stupid cloud, who’s laughing now?
I will bottom for my Rust compiler, I’m not going to argue with it.
when life gives you restrictive compilers, don’t request permission from them! make life take the compilers back! Get mad! I don’t want your damn restrictive compilers, what the hell am I supposed to do with these? Demand to see life’s manager! Make life rue the day it thought it could give BigDanishGuy restrictive compilers! Do you know who I am? I’m the man who’s gonna burn your house down! With the compilers! I’m gonna get my engineers to invent a combustible compiler that burns your house down!
This comment makes me want to reformat every fucking thing i use and bend it to -my- will like some sort of technomancer
I actually do like that C/C++ let you do this stuff.
Sometimes it’s nice to acknowledge that I’m writing software for a computer and it’s all just bytes. Sometimes I don’t really want to wrestle with the ivory tower of abstract type theory mixed with vague compiler errors, I just want to allocate a block of memory and apply a minimal set of rules on top.
100%. In my opinion, the whole “build your program around your model of the world” mantra has caused more harm than good. Lots of “best practices” seem to be accepted without any qualitative measurement to prove it’s actually better. I want to think it’s just the growing pains of a young field.
Even with qualitative measurements they can do stupid things.
For work I have to write code in C# and Microsoft found that null reference exceptions were a common issue. They actually calculated how much these issues cost the industry (some big number) and put a lot of effort into changing the language so there’s a lot of warnings when something is null.
But the end result is people just set things to an empty value instead of leaving it as null to avoid the warnings. And sure great, you don’t have null reference exceptions because a value that defaulted to null didn’t get set. But now you have issues where a value is an empty string when it should have been set.
The exception message would tell you exactly where in the code there’s a mistake, and you’ll immediately know there’s a problem, and it’s more likely to be discovered by unit tests or QA. An empty value that’s supposed to be set may not be noticed for a while and is difficult to track down.
So their research indicated a costly issue (which is ultimately a dev making a mistake) and they fixed it by creating an even more costly issue.
There’s always going to be things where it’s the responsibility of the developer to deal with, and there’s no fix for it at the language level. Trying to fix it with language changes can just make things worse.
For this example, I feel that it is actually fairly ergonomic in languages that have an `Option` type (like Rust), which can either be `Some` value or no value (`None`), and don’t normally have `null` as a concept. It normalizes explicitly dealing with the `None` instead of having `null` or hidden empty strings and such.

I just prefer an exception be thrown if I forget to set something, so it’s likely to happen as soon as I test it and will be easy to find where I missed something.
I don’t think a language is going to prevent someone from making a human error when writing code, but it should make it easy to diagnose and fix it when it happens. If you call it null, “”, empty, None, undefined or anything else, it doesn’t change the fact that sometimes the person writing the code just forgot something.
Abstracting away from the problem just makes it more fuzzy on where I just forgot a line of code somewhere. Throwing an exception means I know immediately that I missed something, and also the part of the code where I made the mistake. Trying to eliminate the exception doesn’t actually solve the problem, it just hides the problem and makes it more difficult to track down when someone eventually notices something wasn’t populated.
Sometimes you want the program to fail, and fail fast (while testing) and in a very obvious way. Trying to make the language more “reliable” instead of having the reliability of the software be the responsibility of the developer can mean the software always “works”, but it doesn’t actually do what it’s supposed to do.
Is the software really working if it never throws an exception but doesn’t actually do what it’s supposed to do?
It is fair to have a preference for exceptions. It sounds like there may be a misunderstanding of how `Option` works.

Have you used languages that didn’t have `null` and had `Option` instead? If we look at Rust, you can’t forget to check it: it is impossible to get the `Some` of an `Option` without dealing with the `None`. You can’t forget this. You can mess up in a lot of other ways, but you explicitly have to decide how to handle that potential `None` case.

If you want it to fail fast and obvious, there are ways to do this. For example, you can use the `unwrap()` method to get the contained `Some` value or panic if it is `None`, `expect()` to do the same but with a custom panic message, the `?` operator to get the contained `Some` value or return from the function with `None`, etc. Tangentially, these also work for `Result`, which can be `Ok` or `Err`.

It is pretty common to use these methods in places where you always want to fail, somewhere that you don’t expect should have a `None`, or where you don’t want your code to deal with the consequences of something unexpected. You have decided this and live with the consequences, instead of it implicitly happening/you forgetting to deal with it.
People just think that applying arbitrary rules somehow makes software magically more secure, like with Rust, as if the compiler won’t just “let you” do the exact same fucking thing if you type the `unsafe` keyword.

deleted by creator
That’s not what I meant. I understand that Rust forces things to be more secure. It’s not like there’s some guarantee that Rust is automatically safe, and C++ is automatically unsafe.
deleted by creator
No there is not. Borrow checking and RAII existed in C++ too and there is no formal axiomatic proof of their safety in a general sense. Only to a very clearly defined degree.
In fact, someone found memory bugs in Rust, again, because it is NOT soundly memory safe.
Dart is soundly null-safe, meaning it is mathematically guaranteed never to compile null-unsafe code unless you explicitly say you’re OK with it. Kotlin is simply null-safe, meaning it can run into bullshit null conditions.
The same thing with Rust: don’t let it lull you into a sense of security that doesn’t exist.
Borrow checking…existed in C++ too
Wat? That’s absolutely not true; even today lifetime-tracking in C++ tools is still basically a research topic.
…someone found memory bugs in Rust, again, because it is NOT soundly memory safe.
It’s not clear what you’re talking about here. In general, there are two ways that a language promising soundness can be unsound: a bug in the compiler, or a problem in the language definition itself permitting unsound code. (`unsafe` changes the prerequisites for unsoundness, placing more burden on the user to ensure that certain invariants are upheld; if the code upholds these invariants, but there’s still unsoundness, then that falls into the “bug in Rust” category, but unsoundness of incorrect `unsafe` code is not a bug in Rust.)

Rust has had both types of bugs. Compiler bugs can be (and are) fixed without breaking (correct) user code. Bugs in the language definition are, fortunately, fixable at edition boundaries (or in rare cases by making a small breaking change, as when the behavior of `extern "C"` changed).

Have you heard about cve-rs?
https://github.com/Speykious/cve-rs
Blazingly fast memory failures with no unsafe blocks in pure Rust.
Edit: also I wish whoever designed the syntax for rust to burn in hell for eternity
Edit 2: Before the Cult of Rust™ sends their assassins to take out my family, I am not hating on Rust (except the syntax) and I’m not a C absolutist, I am just telling you to be aware of the limitations of your tools
I want you to imagine that your comments in this thread were written by an engineer or a surgeon instead of a programmer.
Imagine an engineer saying “Sure, you can calculate the strength of a bridge design based on known material properties and prove that it can hold the design weight, but that doesn’t automatically mean that the design will be safer than one where you don’t do that”. Or “why should I have to prove that my design is safe when the materials could be defective and cause a collapse anyway?”
Or a surgeon saying “just because you can use a checklist to prove that all your tools are accounted for and you didn’t leave anything inside the patient’s body doesn’t mean that you’re going to automatically leave something in there if you don’t have a checklist”. Or “washing your hands isn’t a guarantee that the patient isn’t going to get an infection, they could get infected some other way too”.
A doctor or engineer acting like this would get them fired, sued, and maybe even criminally prosecuted, in that order. This is not the mentality of a professional, and it is something that programming as a profession needs to grow out of.
“washing your hands isn’t a guarantee that the patient isn’t going to get an infection, they could get infected some other way too”.
Every single doctor should know this yes.
It seems people are adding a sentence I didn’t say “rust can be unsafe and thus we shouldn’t try” on top of the one I did say “programmers should be aware that rust doesn’t automatically mean safe”.
You didn’t say “programmers should be aware that rust doesn’t automatically mean safe”. You said:
People just think that applying arbitrary rules somehow makes software magically more secure…
You then went on to mention `unsafe`, conflating “security” and “safety”; Rust’s guarantees are around safety, not security, so it sounds like you really mean “more safe” here. But Rust does make software more safe than C++: it prohibits memory safety issues that are permitted by C++.

You then acknowledged:

I understand that rust forces things to be more secure

…which seems to be the opposite of your original statement that Rust doesn’t make software “more secure”. But in the same comment:

It’s not like there’s some guarantee that rust is automatically safe…

…well, no, there IS a guarantee that Rust is “automatically” (memory) safe, and to violate that safety, your program must either explicitly opt out of that “automatic” guarantee (using `unsafe`) or exploit (intentionally or not) a compiler bug.

…and C++ is automatically unsafe.

This is also true! “Safety” is a property of proofs: it means that a specific undesirable thing cannot happen. The C++ compiler doesn’t provide safety properties[1]. The opposite of “safety” is “liveness”, meaning that some desirable thing does happen, and C++ does arguably provide certain liveness properties, in particular RAII, which guarantees that destructors will be called when leaving a call-stack frame.

[1] This is probably over-broad, but I can’t think of any safety properties C++ the language does provide. You can enforce your own safety properties in library code, and the standard library provides some; for instance, mutexes have safety guarantees.
Then you should probably be a little more explicit about that, because I have never, not once in my life, heard someone say “well you know wearing a seatbelt doesn’t guarantee you’ll survive a car crash” and not follow it up with “that’s why seatbelts are stupid and I’m not going to wear one”.
We need to stop attaching shit someone doesn’t say to something they did. It makes communicating hostile and makes you an asshole.

Edit: okay, that was a bit rude. But it’s so frustrating to say something and then have other people go “that means <this other thing you didn’t say>!!!11!”
It’s neither arbitrary nor magic; it’s math. And `unsafe` doesn’t disable the type system, it just lets you dereference raw pointers.

You don’t even need `unsafe`, you can just take user input and execute it in a shell, and Rust will let you do it. Totally insecure!

Rust isn’t memory safe because you can invoke another program that isn’t memory safe?
My comment is sarcastic, obviously. The argument Kairos gave is similar to this. You can still introduce vulnerabilities. The issue is normally that you introduce them accidentally. Rust gives you safety, but does not put your code into a sandbox. It looked to me like they weren’t aware of this difference.
You don’t need `unsafe` to write vulnerable code in Rust.

Yes, I know there are other ways to do it. That’s one way.
I don’t know rust, but for example in Swift the type system can make things way more difficult.
Before they added macros, if you wanted to write ORM code on a SQL database it was brutal, and if you need to go into raw buffers it’s generally easier to just write C/ObjC code and a bridging header. The type system can make it harder to reason about performance too, because you lose some visibility into what actually gets compiled.
The Swift type system has improved, but I’ve spent a lot of time fighting with it. I just try to avoid generics and type erasure now.
I’ve had similar experiences with Java and Scala.
That’s what I mean about it being nice to drop out of setting up some type hierarchy and interfaces and just working with a raw buffers or function pointers.
Did you know 100% of C programmers were sex offenders?
I used to love C++ until I learned Rust. Now I think it is obnoxious, because even if you write modern C++, without raw pointers, casting and the like, you will be constantly questioning whether you’re doing stuff right. The spec is just way too complicated at this point and it can only get worse, unless they choose to break backwards compatibility and throw out the pre-C++11 bullshit.
Depending on what I’m doing, sometimes Rust will annoy me just as much. Often I’m doing something I know is definitely right, but I have to go through so much ceremony to get it to work in Rust. The most common annoying example I can think of is trying to mutably borrow two distinct fields of a struct at the same time. You can’t do it. It’s the worst.
I suppose it’s a matter of experience and practice. The more you work with it, the better you get. As usual with all things one can learn.
The question becomes, then, if I spend 5 years learning and mastering C++ versus rust, which one is going to help me produce a better product in the end?
Structs with union members that allow the same place in memory to be accessed either word-wise, byte-wise, or even bit-wise are a godsend for everyone who needs to access IO spaces, and I’m happy my C compiler lets me do it.
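A minimal sketch of that idiom (register and field names made up; reading a union member other than the one last written is well-defined in C, but formally UB in ISO C++ even though GCC/Clang document it as OK, and byte/bitfield order is implementation-defined either way):

```cpp
#include <cstdint>

union IoReg {
    uint32_t word;       // word-wise access
    uint8_t  bytes[4];   // byte-wise access
    struct {             // bit-wise access
        uint32_t enable   : 1;
        uint32_t irq_mask : 3;
        uint32_t baud_sel : 4;
        uint32_t reserved : 24;
    } bits;
};

int main() {
    IoReg r;
    r.word = 0;            // clear the whole register in one store
    r.bits.enable   = 1;   // poke a single bit
    r.bits.baud_sel = 0xA;
    return r.bytes[0];     // peek at the low byte (on a little-endian target)
}
```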
#pragma pack(push)
Context?
You use it to “pack” bitfields, bytes, etc. together in structs/classes (without functions); otherwise the compiler usually aligns every variable on a 32-bit boundary for speed.
You don’t need that pragma to pack bitfields.
With, say, a 3-bit int, then a 2-bit int, and various char, int, etc., you did have to use the pragma with GCC and Visual C++, around 2012 at least.
OK, I use the Keil ARM compiler, and never needed to push anything.
Then I’d make a unit test; the compiler isn’t required to lay it out that way (not even the order).
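Putting those two comments together, a hedged sketch of what the pragma does, plus the build-time “unit test” that pins the layout assumption down (the sizes shown are typical on mainstream compilers, not guaranteed):

```cpp
#include <cstdint>

struct Unpadded {
    uint8_t  tag;    // typically followed by 3 bytes of padding...
    uint32_t value;  // ...so value can sit on a 4-byte boundary
};

#pragma pack(push, 1)   // supported by GCC, Clang and MSVC
struct Packed {
    uint8_t  tag;
    uint32_t value;     // no padding in front of it now
};
#pragma pack(pop)

// Fail the build if the packing assumption is ever violated.
static_assert(sizeof(Packed) == 5, "packing assumption violated");

int main() {
    return static_cast<int>(sizeof(Unpadded) - sizeof(Packed));  // typically 8 - 5 = 3
}
```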
C++: all the footguns you need plus a lot more that you never imagined in a single language