JSON has no native binary type, so binary data must be embedded as a string. Base64 is the ubiquitous solution, but it introduces a ~33% size overhead. In contexts where bandwidth or payload size is critical (e.g., high-frequency APIs, mobile applications, or cloud storage specifications such as CDMI), this overhead is significant.
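To make the ~33% figure concrete: Base64 maps every 3 input bytes to 4 output characters. A minimal check using only the Python standard library (the payload here is just random bytes for illustration):

```python
import base64
import os

payload = os.urandom(300)            # 300 raw bytes, divisible by 3 (no padding)
encoded = base64.b64encode(payload)  # every 3 bytes -> 4 ASCII characters
print(len(payload), len(encoded))    # 300 400 -> ~33% larger
```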
Given the constraints of the JSON string format, what standardized, text-safe encoding schemes are more compact than Base64?
I am specifically looking for a comparison of standardized methods (e.g., formally defined in an RFC, an IETF draft, or by a major standards body) based on the following criteria (a rough measurement sketch follows the list):
- Encoding efficiency: the resulting string length compared to the original binary data.
- JSON compatibility: assurance that the encoded output is a valid JSON string without requiring additional escaping.
- Processing overhead: computational complexity relative to Base64.
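As an illustration of how I am measuring these three criteria, here is a rough sketch (assuming Python's standard library, with `base64.b85encode` standing in for a denser encoding; the exact candidates are the point of the question):

```python
import base64
import json
import os
import timeit

payload = os.urandom(60 * 1024)  # arbitrary test payload

for name, encode in [("base64", base64.b64encode),
                     ("base85", base64.b85encode)]:
    text = encode(payload).decode("ascii")
    embedded = json.dumps(text)  # any required escaping inflates the JSON size
    secs = timeit.timeit(lambda: encode(payload), number=100)
    print(f"{name}: encoded={len(text)}  as JSON string={len(embedded)}  "
          f"100 encodes={secs:.2f}s")
```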
For example, I am aware of encodings like Base85 (Ascii85) and Base91, which claim better density. However, I am seeking answers that detail:
- Their standardized specifications.
- Quantifiable size reduction in a JSON context.
- Practical trade-offs, such as character-set suitability and real-world library support (the escaping concern is sketched below).
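Regarding character-set suitability specifically: classic Ascii85 uses ASCII codes 33-117, which include both `"` and `\`, so its output generally needs escaping inside a JSON string, whereas the git-style Base85 alphabet (what Python's `base64.b85encode` produces) avoids both. A small sketch of that check, again standard library only:

```python
import base64
import os

payload = os.urandom(4096)

ascii85 = base64.a85encode(payload).decode("ascii")  # classic Ascii85 alphabet
base85 = base64.b85encode(payload).decode("ascii")   # git-style Base85 alphabet

must_escape = set('"\\')  # characters a JSON string cannot contain unescaped
print("Ascii85 needs escaping:", any(c in must_escape for c in ascii85))
print("Base85 needs escaping:", any(c in must_escape for c in base85))
```

With random input the first check is virtually guaranteed to report True; a JSON serializer would then add backslash escapes, eating into Ascii85's density advantage.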
This question is not seeking opinions or library recommendations, but a factual comparison of established encoding standards suitable for embedding binary data within a JSON string.