The time complexity of the provided function becomes:

Worst case: O(k * n * m)
Best case: O(k * m)

where k is the number of lines, n is the number of compiled regexes in `self.mapping.compiled_regexes`, and m is the average line length. In my opinion, optimizing the data masking process is essential.
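To make the bounds concrete, here is a minimal sketch of how such a masking loop typically looks; the function name and loop shape are assumptions for illustration, not the actual sos code. Each of the k lines may be scanned by each of the n patterns, and each scan is proportional to the line length m, giving O(k * n * m) in the worst case; with an empty or constant-sized pattern set the cost collapses to O(k * m).

```python
import re

def mask_lines(lines, compiled_regexes):
    """Hypothetical shape of the masking loop behind the bounds above.

    k = len(lines), n = len(compiled_regexes), m = average line length.
    """
    masked = []
    for line in lines:                                 # k iterations
        for pattern, replacement in compiled_regexes:  # up to n per line
            line = pattern.sub(replacement, line)      # O(m) scan
        masked.append(line)
    return masked
```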
I've been mulling over a question: should we consider periodic cleanup of the mapping files, since the number of compiled regexes grows over time and can increase total processing time?
_Originally posted by @NikhilKakade-1 in #3097 (comment)_
> Yes, it is worth raising a separate issue for the periodic cleanup of mapping files.
_Originally posted by @pmoravec in #3097 (comment)_
My response: if the removal of obfuscation map regexes were automated, it seems we would need to track each entry's temperature (i.e., the most recent time a regex map entry was used for obfuscation). I also think that enabling the removal of past regex maps should be left to the user. However, the consistent hash and stored salt mentioned by @pmoravec might make automated cleanup less of a concern: if I followed everything properly, obfuscating the same sensitive data would produce the same obfuscation map entry, given the same hash method and stored salt.
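To illustrate how the two ideas could fit together, here is a minimal sketch, assuming a dict-backed map; the class name, the `prune` method, and the salt handling are hypothetical illustrations rather than the project's actual implementation:

```python
import hashlib
import time

class ObfuscationMap:
    """Hypothetical sketch: temperature tracking plus salted hashing."""

    def __init__(self, salt: bytes):
        self.salt = salt      # stored salt, persisted alongside the map
        self.entries = {}     # sensitive value -> obfuscated token
        self.last_used = {}   # sensitive value -> unix timestamp

    def obfuscate(self, value: str) -> str:
        # Consistent hash + stored salt: the same sensitive value always
        # derives the same entry, so an entry that is pruned and later
        # recreated is indistinguishable from the original.
        if value not in self.entries:
            digest = hashlib.sha256(self.salt + value.encode()).hexdigest()
            self.entries[value] = f"obfuscated_{digest[:12]}"
        self.last_used[value] = time.time()  # track the "temperature"
        return self.entries[value]

    def prune(self, max_age_seconds: float) -> None:
        # Left for the user to invoke explicitly, per the discussion
        # above: drops entries whose most recent use is too old.
        cutoff = time.time() - max_age_seconds
        for value in list(self.entries):
            if self.last_used.get(value, 0.0) < cutoff:
                del self.entries[value]
                self.last_used.pop(value, None)
```

Because the derived token depends only on the hash method, the stored salt, and the sensitive value itself, pruning stale entries would not break consistency across runs, which is what makes the automated cleanup less of a concern.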