I'm working on an embedded system that requires some pre-determined data to be stored in memory for use at run-time. The data is essentially a large state transition table.
example: {1,2},{1,3},{2,3},{2,4},{2,5},{3,1}
Where: State 1 can go to state 2 or 3. State 2 can go to states 3, 4, or 5. State 3 only goes to state 1.
I'm currently using an "N" x 2 array for each state, where "N" is the maximum number of transitions any single state has (in this case, 3).
This isn't space efficient; a lot of the memory goes unused. From the example above, State 1 wastes 2 allocations (one unused pair) and State 3 wastes 4 (two unused pairs). The problem gets much more pronounced if one state has far more transitions than the rest.
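To make the waste concrete, here is roughly what that padded layout looks like for the 3-state example (the name example_stt and the {0,0} placeholder for unused slots are my own convention, not from the real code):

    #include <stdint.h>

    /* Padded "N" x 2 layout for the 3-state example, N = 3. */
    static const uint16_t example_stt[3][3][2] = {
        {{1,2},{1,3},{0,0}},    /* State 1: one unused pair  = 2 wasted uint16_t */
        {{2,3},{2,4},{2,5}},    /* State 2: fully populated                      */
        {{3,1},{0,0},{0,0}},    /* State 3: two unused pairs = 4 wasted uint16_t */
    };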
My current implementation stores information as an array of structs, where the struct is of the form:
    #include <stdint.h>

    #define MAX_TRANSITIONS 9
    ...
    struct state_table {
        uint16_t number;                   /* current state ID            */
        uint8_t  len;                      /* number of valid transitions */
        uint16_t stt[MAX_TRANSITIONS][2];  /* transition pairs            */
    };
I then go on to define a pile of data:
    #define MAX_STATES 619

    static const struct state_table SUPERVISOR[MAX_STATES] = {
        {1, 6, {{301,2},{410,3},{411,4},{500,5},{501,6},{604,7}}},
        ...
        {619, 5, {{301,611},{401,297},{500,619},{501,619},{602,514}}}
    };
The first element is the current state, the second is the number of valid transitions, and the third is the transition array itself.
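For reference, a row can be searched at run-time with a simple linear scan; a minimal sketch, assuming each pair is {event, next state} and that the table is ordered so SUPERVISOR[s - 1].number == s:

    /* Sketch only: returns the next state for (current, event), or
       the current state unchanged if no matching transition exists. */
    static uint16_t next_state(uint16_t current, uint16_t event)
    {
        const struct state_table *row = &SUPERVISOR[current - 1];
        for (uint8_t i = 0; i < row->len; i++) {
            if (row->stt[i][0] == event)
                return row->stt[i][1];
        }
        return current;
    }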
Within the 619 states, there are about 10 states that have 9 transitions. The average number of transitions per state is 5, so this creates a huge memory waste with my current implementation.
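To put rough numbers on it (assuming a 2-byte uint16_t and typical alignment): each struct is 2 + 1 + 1 padding + 9*2*2 = 40 bytes, so the whole table is 619 * 40 ≈ 24 KiB, while an average of 5 transitions means only about 23 bytes per state carry real data. Roughly 40% of the table is wasted.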
Question: I'm looking for guidance on how I can make this more space efficient.
Thanks in advance,