• solrize@lemmy.world · 2 months ago

    It’s very hard for “Safe C++” to exist when integer overflow is UB. Rust also gets it wrong, though not quite in the same way. Ada gets it right.

    • BatmanAoD@programming.dev · 2 months ago

      By Ada getting it right, I assume you mean that it throws an exception on any overflow? (Apparently this behavior was optional in older versions of GNAT.) Why is Ada’s behavior preferable to Rust’s?

      In Rust, integer overflow panics by default in debug mode but wraps silently in release mode. Optionally, though, you can specify wrapping, checked (returning an Option rather than panicking), saturating, or unchecked behavior for a specific operation, so that the optimization level doesn’t affect the behavior. This makes sense to me: the unoptimized default behaves like Ada, the optimized default is at least not UB, and you can control the behavior explicitly when necessary.
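
      To make that concrete, here’s a minimal sketch of those per-operation alternatives to the plain + operator, assuming only the standard library (all of the methods shown exist on Rust’s built-in integer types):

      ```rust
      fn main() {
          let x: u8 = 250;

          // The plain operator: panics in debug builds, wraps in release builds
          // (unless the build profile overrides the default).
          // let _ = x + 10;

          assert_eq!(x.wrapping_add(10), 4);            // always wraps (modular)
          assert_eq!(x.checked_add(10), None);          // None on overflow, never panics
          assert_eq!(x.overflowing_add(10), (4, true)); // wrapped value plus an overflow flag
          assert_eq!(x.saturating_add(10), u8::MAX);    // clamps at the type's bound

          // The unchecked variant is `unsafe` precisely because overflow would be UB:
          // let _ = unsafe { x.unchecked_add(2) };
      }
      ```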

      • solrize@lemmy.world · 2 months ago

        In Ada, the overflow behaviour is determined by the type declaration. You can also sometimes use SPARK to statically guarantee the absence of overflow in a program. In Rust, as I understand it, you control the overflow behaviour of a particular arithmetic operation by calling a method such as wrapping_add instead of using the plain operator, but that is ugly and too easy to omit.

        For ordinary integers, an arithmetic overflow is similar to an OOB array reference and should be trapped, though you might sometimes choose to disable the trap for better performance, similar to how you might disable an array subscript OOB check. Wraparound for ordinary integers is simply incorrect. You might want it for modular arithmetic, and that is fine, but in Ada you get that by specifying it in the type declaration. Also in Ada, you can specify the min and max bounds, or the modulus in the case of modular arithmetic. For example, you could have a “day of week as integer” ranging from 1 to 7 that traps whenever a value outside that range is assigned.
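
        As a rough Rust analogue of that day-of-week example (Rust has no built-in range types, so this hypothetical Ranged wrapper does the check by hand at run time, which is roughly the check an Ada compiler would emit for you):

        ```rust
        // Hypothetical sketch: a runtime-checked bounded integer, imitating an Ada
        // declaration like `type Day is range 1 .. 7`. The bounds are const generics;
        // the check itself still happens at run time, as in Ada.
        #[derive(Debug, Clone, Copy)]
        struct Ranged<const MIN: i32, const MAX: i32>(i32);

        impl<const MIN: i32, const MAX: i32> Ranged<MIN, MAX> {
            fn new(value: i32) -> Self {
                // Where Ada would raise Constraint_Error, this sketch panics.
                assert!(
                    value >= MIN && value <= MAX,
                    "value {} out of range {}..={}",
                    value, MIN, MAX
                );
                Ranged(value)
            }

            fn get(self) -> i32 {
                self.0
            }
        }

        type DayOfWeek = Ranged<1, 7>;

        fn main() {
            let friday = DayOfWeek::new(5);
            println!("day {}", friday.get());
            // DayOfWeek::new(9) would panic here, analogous to Ada's range check.
        }
        ```

        The check here is still a runtime check; a static guarantee is what SPARK (or dependent types, which come up below) would add on top.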

        GNAT imho made an error of judgment by disabling the overflow check by default, but at least you can turn it back on.

        The RISC-V architecture designers made a harder-to-fix error by making every integer operation wrap around, with no flags or traps to catch unintentional overflow, so you have to generate extra code for every arithmetic op you want checked.
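
        For the unsigned case, the “extra code” is essentially a compare-and-branch after each add. A hedged sketch of what the compiler (or a careful programmer) has to do on an ISA with no overflow flag or trap (the function name here is made up for illustration):

        ```rust
        // After a plain wrapping add, unsigned overflow occurred exactly when the
        // result is smaller than one of the operands; that comparison and branch is
        // the extra per-operation cost on a flag-less ISA such as RISC-V.
        fn add_checked_by_hand(a: u32, b: u32) -> Option<u32> {
            let sum = a.wrapping_add(b);           // the plain hardware add, which wraps
            if sum < a { None } else { Some(sum) } // the extra compare-and-branch
        }

        fn main() {
            assert_eq!(add_checked_by_hand(1, 2), Some(3));
            assert_eq!(add_checked_by_hand(u32::MAX, 1), None);
        }
        ```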

        • BatmanAoD@programming.dev · 2 months ago

          It sounds like you’re talking about dependent typing, then, at least for integers? That’s certainly a feature Rust lacks that seems like it would be nice, though I understand it’s quite complicated to implement and would probably make Rust compile times much slower.

          > For ordinary integers, an arithmetic overflow is similar to an OOB array reference and should be trapped, though you might sometimes choose to disable the trap for better performance, similar to how you might disable an array subscript OOB check.

          That’s exactly what I described above. By default, trapping on overflow/underflow is enabled for debug builds and disabled for release builds. As I said, I think this is sensible behavior. But in addition to per-operation explicit handling, you can explicitly turn global trapping on or off in your build profile.
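
          Concretely, the profile switch I mean is the overflow-checks key in Cargo.toml; a minimal sketch:

          ```toml
          # Keep overflow panics even in optimized builds:
          [profile.release]
          overflow-checks = true

          # Possible, but not recommended: wrap silently even in debug builds.
          [profile.dev]
          overflow-checks = false
          ```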

          • solrize@lemmy.world · 2 months ago

            In Ada? No dependent types; you just declare how overflow should be handled, much as you would declare int16 vs int32 or similar. Dependent types mean something entirely different, and they are checked at compile time. SPARK uses something more like Hoare logic. Regular Ada uses runtime checks.

            • BatmanAoD@programming.dev · 2 months ago

              Whatever you want to call them, my point is that most languages, including Rust, don’t have a way to define new integer types that are constrained by user-provided bounds.

              Dependent types, as far as I’m aware, aren’t defined in terms of “compile time” versus “run time”; they’re just types that depend on a value. It seems to me that constraining an integer type to a specific range of values is a clear example of that, but I’m not a type theory expert.

              • solrize@lemmy.world · 2 months ago

                Dependent types only make sense in the context of static typing, i.e. compile time. In a dependently typed language, if you have a term with type {1,2,3,4,5,6,7} and the program typechecks at compile time, you are guaranteed that there is no execution path through which that term takes on a value outside that set. You may need to supply a complicated proof to help the compiler.
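
                A tiny illustration of that guarantee, in Lean 4 as a stand-in for any dependently typed language (the name friday is just for the example; here the proof is trivial enough for the decide tactic):

                ```lean
                -- The type itself says the value is between 1 and 7; the proof obligation
                -- is discharged at compile time, so no runtime check is needed.
                def friday : { n : Nat // 1 ≤ n ∧ n ≤ 7 } := ⟨5, by decide⟩

                -- This would be rejected by the typechecker, since no proof exists:
                -- def nope : { n : Nat // 1 ≤ n ∧ n ≤ 7 } := ⟨9, by decide⟩

                #eval friday.val  -- 5
                ```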

                In Ada you can define an integer type of range 1 .. 7 and it is no big deal. There is no static guarantee like dependent types would give you. Instead, the runtime raises an exception if an out-of-range value is assigned. It’s simply a matter of the compiler generating extra code to do these checks.

                There is a separate Ada-related tool called SPARK that can let you statically guarantee that the value stays in range. The verification method doesn’t involve dependent types and you’d use the tool somewhat differently, but the end result is similar.

                • BatmanAoD@programming.dev · 2 months ago

                  For what it’s worth, Ada and SPARK are listed separately in the Wikipedia article on dependent typing. Again, though, I’m not a language expert.

                  • solrize@lemmy.world · 2 months ago

                    I’ll look at the wiki article again, but I can pretty much promise that Ada doesn’t have dependent types. They are very much a bleeding-edge language feature (Haskell will supposedly get them soon, so I will try them then), and Ada is quite an old-fashioned language, derived from Pascal. SPARK is basically an extra-safe subset of Ada, with various features disabled, designed to work with verification tools that prove properties of programs. My understanding is that the proof methods don’t involve dependent types, but maybe in some sense they do.

                    Dependent types require a type-level language expressive enough to state arbitrary properties of values, so you can have a type like “prime number” and prove number-theoretic properties of functions that operate on them. C++ template metaprogramming is (unintentionally) Turing-complete, which is apparently why C++ is listed in the article, but actually trying to use C++ that way is totally insane and impractical.

                    I remember looking at the wiki article on dependent types a few years ago and finding it pretty bad. I’ve been wanting to read “The Little Typer” (thelittletyper.com) which is supposed to be a good intro. I’ve also played with Agda a little bit, but not used it for real.

    • lysdexic@programming.dev · 2 months ago

      > It’s very hard for “Safe C++” to exist when integer overflow is UB.

      You could simply state you did not read the article and decided to comment out of ignorance.

      If you spent one minute skimming through the article, you would have stumbled upon the section on undefined behavior. Instead, you opted to post ignorant drivel.