I'm building a Python Sudoku solver for classic and non-classic Sudokus using the CpModel from Google's OR-Tools library. Specifically, I'm trying to write a method that solves a Sudoku variant in which every row/column/subgrid contains 9 unique numbers between 0 and 11. For instance, if row 1 of the Sudoku has the numbers [1, 2, 3, 4, 5, 6, 7, 8, 9], rows 2-9 cannot have the same numbers in any permutation. However, row 2 can have the numbers [0, 1, 2, 3, 4, 5, 6, 7, 8], since this is a distinct set of 9 numbers. Likewise, this distinction applies to all rows.
Here is the challenge: since each row has 9 entries, any given set of 9 numbers can appear in a row in 9! different permutations.
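To put a number on that, a quick check in plain Python (independent of the solver) confirms the permutation count:

```python
from itertools import permutations
from math import factorial

# One row has 9 entries, so a fixed set of 9 numbers can appear
# in 9! different orders within that row.
row = [1, 2, 3, 4, 5, 6, 7, 8, 9]
num_orderings = len(list(permutations(row)))
print(num_orderings)        # 362880
print(factorial(len(row)))  # 362880
```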
So I decided to create a unique signature for each row: take all the elements of a row, sort them in ascending order, and concatenate them into an 18-digit signature. For example, for row 1 with elements [1, 2, 3, 4, 5, 6, 7, 8, 9], the unique signature would be 010203040506070809. If some other row has the same 9 numbers in some other permutation, it will also have the signature 010203040506070809. The signature differs only if the row contains a different set of 9 numbers, regardless of which of the 9! permutations appears. Hence, if I add a constraint that all the signatures must be different, I should get what I want.
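On concrete integer values (not CP-SAT variables), the signature idea can be sketched as follows; `sorted_signature` is just an illustrative helper name, and note that as a plain integer the leading zero of a two-digit group is dropped, which does not affect uniqueness:

```python
def sorted_signature(row):
    """Concatenate the sorted elements as two-digit groups into one integer."""
    sig = 0
    for v in sorted(row):
        sig = sig * 100 + v
    return sig

# Two permutations of the same set collapse to the same signature:
a = sorted_signature([1, 2, 3, 4, 5, 6, 7, 8, 9])
b = sorted_signature([9, 8, 7, 6, 5, 4, 3, 2, 1])
assert a == b == 10203040506070809  # "010203040506070809" without the leading zero

# A different set of nine numbers yields a different signature:
c = sorted_signature([0, 1, 2, 3, 4, 5, 6, 7, 8])
assert c != a
```

The problem, as described below, is that this only works on concrete integers; `sorted()` cannot be applied to CP-SAT decision variables.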
Unfortunately, this isn't working, since sorted() (and list sorting in general) does not work on OR-Tools variables. So I fell back on a crude approach: generate all 9! signatures (I am aware that this is computationally expensive) and pick the smallest one as the unique signature, using the code snippet below.
def DistinctRowColSubGridConstraints(self):
    '''
    Collect all rows, create a unique lexicographical signature for each row,
    and ensure that all rows are unique.
    '''
    # Helper to encode a row as a lexicographical signature based on permutation indices
    def encode_as_lexicographical_index(row):
        # Create a list of indices: [0, 1, 2, ..., n-1] where n is the row length
        n = len(row)
        indices = list(range(n))
        # Generate all permutations of the indices list
        all_permutations = list(permutations(indices))
        # Generate all signatures
        all_signatures = []
        for perm in all_permutations:
            signature = 0
            for idx in perm:
                # Concatenate each element to form the signature
                signature = signature * 100 + row[idx]
            all_signatures.append(signature)
        return all_signatures

    # Collect all rows and create a lexicographical signature for each
    row_signatures = []
    for i in range(self.Rows):
        row = [self.Cells[i][j] for j in range(self.Cols)]     # Get the row
        all_signatures = encode_as_lexicographical_index(row)  # Get all possible signatures
        RowUniqueSignature = self.Model.NewIntVar(lb=10**17, ub=10**18 - 1, name=f"RowUniqueSignature_{i}")
        self.Model.Add(RowUniqueSignature == min(all_signatures))  # <-- this line fails
        row_signatures.append(RowUniqueSignature)              # Add signature to list

    # Ensure all rows have unique lexicographical signatures
    self.Model.AddAllDifferent(row_signatures)
    print("Distinct row constraints added.")
Unfortunately, this also fails: Python's built-in min() cannot operate on CP-SAT bounded linear expressions.
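For what it's worth, the equivalence this approach relies on, namely that the minimum over all permutation signatures equals the ascending-sorted signature, does hold on plain integers; the failure is specific to applying Python's min() to CP-SAT expressions. A small check with a 3-element row (to keep the permutation count manageable):

```python
from itertools import permutations

def concat_signature(values):
    """Concatenate values in the given order as two-digit groups."""
    sig = 0
    for v in values:
        sig = sig * 100 + v
    return sig

row = [4, 0, 7]  # small example instead of 9 elements
all_sigs = [concat_signature(p) for p in permutations(row)]

# The smallest permutation signature is exactly the ascending-sorted one:
assert min(all_sigs) == concat_signature(sorted(row))  # both are 407
```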
I would appreciate any help or guidance on fixing this issue, and I would be grateful if someone could point me to a more efficient solution. Thank you.
Comment: Given 1, 5, 3, you want to output 010305? And you need to collect all such signatures, with the input being 9 distinct numbers from 0-10?

Reply: Yes. I tried the sorted() method to bypass all permutations, but it is not working. That's why I tried the sub-optimal min() approach, so that I can produce a unique signature for each row. And as you can see, even min() is throwing errors.