• palordrolap@fedia.io · 2 days ago

    Something somewhere was definitely doing the conversion for you, but it could have been your editor, the compiler, or something in between, like a C preprocessor directive getting loaded in by your configuration.

        • mkwt@lemmy.world · edited · 1 day ago

          In C and C++, the source character set is implementation-defined. This means that each compiler sets its own rules about which characters are accepted. For example, a compiler could choose to accept ASCII, EBCDIC, Unicode, or some combination, etc.

          So the ISO standard says that the ; character is the end-of-statement punctuation, but it is up to the compiler to say which source character(s) or code point(s) represent that ISO ;.

          The ISO standards also require compilers to define a separate execution character set to specify values that can be stored in char and used with the string library functions. The execution character set doesn’t have to be the same as the source character set.

          Edit: I should also mention that the rules for this stuff change a lot in ISO C23 and C++23 (standards I haven’t yet personally adopted). Basically, the 2023 ISO standards require compilers to support UTF-8 source files, and they map every source character in the ISO standard to its corresponding Unicode character.

          • raspberriesareyummy@lemmy.world · 1 day ago

            Mhh, today I learned. That’s wild. I would have thought that any sane person would allow only 7-bit ASCII for the source code, and forward-compatible character sets in strings (each standard iteration being allowed to add characters, but not remove them).