How to implement strings
The C programming language defines a string as "a contiguous sequence of characters terminated by and including the first null character". Because the character \0 marks the end, this representation is commonly called zero- or null-termination. Other programming languages, however, often use different representations. What else is possible?