I have an array of ~5,000 unique IDs loaded from a CSV file:
Dim wb As Workbook
Dim idRng As Variant

Set wb = Workbooks.Open(Filename:=ThisWorkbook.Path & "\DataSource\ID.csv")
With wb.Sheets(1)
    ' Read the whole ID column into a 2D Variant array (1 To n, 1 To 1)
    idRng = .Range("A2:A" & .Range("A" & .Rows.Count).End(xlUp).Row).Value2
End With
wb.Close SaveChanges:=False ' close without prompting to save
Alongside this, I also load ~100,000 rows of data containing non-unique IDs with numerous possible duplicates. My aim is to loop through the 100,000 rows, check whether each row's ID is contained in the smaller array, and if so, add the row's data to a collection. Both IDs are stored as Longs. I have implemented this as follows:
Dim dataRng As Variant
Dim i As Long

Set wb = Workbooks.Open(Filename:=ThisWorkbook.Path & "\DataSource\data.csv")
With wb.Sheets(1)
    ' Read all eight data columns into a 2D Variant array (1 To n, 1 To 8)
    dataRng = .Range("A2:H" & .Range("A" & .Rows.Count).End(xlUp).Row).Value2
End With
wb.Close SaveChanges:=False

For i = LBound(dataRng, 1) To UBound(dataRng, 1)
    If mUtil.IsInArray(dataRng(i, 1), idRng) Then
        'Add object to collection
    End If
Next
'mUtil
Public Function IsInArray(v As Variant, arr As Variant) As Boolean
    ' Linear scan down the first column of a 2D array
    Dim i As Long
    For i = LBound(arr, 1) To UBound(arr, 1)
        If arr(i, 1) = v Then
            IsInArray = True
            Exit Function
        End If
    Next
    IsInArray = False
End Function
This works, but as you can imagine, scanning up to 5,000 unique IDs for each of the 100,000 rows (up to 500 million comparisons in the worst case) takes a fair amount of time, and the larger file can end up being much bigger still.
Is there a more efficient way of performing this task, with the ultimate aim of reducing the run time?
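For example, would replacing the linear scan with a hash-based lookup be the right tool here? A minimal sketch of what I have in mind (untested at scale; it assumes idRng and dataRng are the 2D arrays loaded above, and uses late binding so no reference to Microsoft Scripting Runtime is needed):

' Sketch: build a Scripting.Dictionary of the ~5,000 unique IDs once,
' then probe it per data row instead of rescanning the array
Dim ids As Object
Dim i As Long

Set ids = CreateObject("Scripting.Dictionary")

' One entry per unique ID; CLng normalises the key type, since .Value2
' returns numeric cells as Doubles even though the IDs are Longs
For i = LBound(idRng, 1) To UBound(idRng, 1)
    ids(CLng(idRng(i, 1))) = True
Next

' Each Exists call is a constant-time hash probe, so the whole pass is
' roughly O(rows + ids) rather than O(rows * ids)
For i = LBound(dataRng, 1) To UBound(dataRng, 1)
    If ids.Exists(CLng(dataRng(i, 1))) Then
        'Add object to collection, as before
    End If
Next

Is a dictionary the standard approach for this, or is there something better suited (sorting plus a binary search, for instance)?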