I have an array of color codes whose size is known at compile time. I want to declare another array of the same size, but the code below fails to compile.
I can, of course, declare the size as a global constant and then use that in the declaration of both arrays. But I don't want to keep adjusting the size constant when I add new colors. Is there a way to do this? (The variables are global.)
static const char *colors[] = {"#0000ff",
"#00ff00",
"#ff0000",
"#ffff00",
"#ff00ff",
"#00ffff",
"#ffffff",
"#000000",
"#ff8040",
"#c0c0c0",
"#808080",
"#804000"};
static const int NUM_COLORS = sizeof(colors) / sizeof(colors[0]);
static ColorButtons color_buttons[NUM_COLORS];
NUM_COLORS must be known at compile time. Even though it is declared static const, in C NUM_COLORS is still an integer variable, not a true compile-time constant that can be relied upon to allocate storage. You will get either a "storage size of ‘color_buttons’ isn’t constant" complaint or a refusal to declare a variably sized array in that context.

Use the preprocessor instead. You can define a generic helper such as

#define ARRAY_SIZE(X) (sizeof X / sizeof X[0])

or make the count itself a macro:

#define NUM_COLORS (sizeof colors / sizeof colors[0])
static ColorButtons color_buttons[NUM_COLORS];

Writing the expression directly in the declaration also works:

static ColorButtons color_buttons[sizeof(colors) / sizeof(colors[0])];

If you would rather write the color list exactly once and derive everything else from it, the X-macro technique (or, better, the undef-free variant) is another option.
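For completeness, here is a minimal compilable sketch of the macro-based fix. The ColorButtons struct body below is only a placeholder, since the question does not show the real type, and the color list is shortened.

#include <stdio.h>

typedef struct { const char *hex; } ColorButtons;   /* placeholder for the real type */

static const char *colors[] = {"#0000ff", "#00ff00", "#ff0000"};

/* The macro is expanded before compilation, so the array size is a true constant expression. */
#define NUM_COLORS (sizeof colors / sizeof colors[0])

static ColorButtons color_buttons[NUM_COLORS];

int main(void)
{
    color_buttons[0].hex = colors[0];   /* the two arrays stay the same length automatically */
    printf("%zu color buttons, first is %s\n", NUM_COLORS, color_buttons[0].hex);
    return 0;
}

Adding a new color to colors now changes the size of color_buttons as well, with nothing to adjust by hand.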
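And here is a sketch of the X-macro idea mentioned above, in case you want the list defined in a single place; again the ColorButtons definition is a stand-in and the color list is shortened.

#include <stdio.h>

typedef struct { const char *hex; } ColorButtons;   /* placeholder for the real type */

/* The list is written once; each use re-expands it with a different meaning of X. */
#define COLOR_LIST \
    X("#0000ff")   \
    X("#00ff00")   \
    X("#ff0000")

#define X(hex) hex,
static const char *colors[] = { COLOR_LIST };        /* expands to the string literals */
#undef X

#define X(hex) +1
enum { NUM_COLORS = 0 COLOR_LIST };                  /* expands to 0 +1 +1 +1 */
#undef X

static ColorButtons color_buttons[NUM_COLORS];       /* NUM_COLORS is a true constant */

int main(void)
{
    color_buttons[NUM_COLORS - 1].hex = colors[NUM_COLORS - 1];
    printf("%d colors, last is %s\n", NUM_COLORS, color_buttons[NUM_COLORS - 1].hex);
    return 0;
}

The undef-free variant does the same thing, but makes COLOR_LIST a function-like macro that takes the per-entry macro as a parameter, so you don't have to define and #undef X around every use.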