The _encode_invalid_chars function in util/url.py in the urllib3 library 1.25.2 through 1.25.7 for Python allows a denial of service (CPU consumption) because of an inefficient algorithm. The percent_encodings array contains every match of a percent-encoded sequence and is not deduplicated, so for a URL of length N its size may be up to O(N). The next step (normalizing existing percent-encoded bytes) takes up to O(N) per entry, so the total time is O(N^2). If percent_encodings were deduplicated, _encode_invalid_chars would run in O(kN), where k is at most 484 ((10+6*2)^2, the number of distinct two-hex-digit encodings across both letter cases).
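A minimal sketch of the quadratic pattern described above, not the actual urllib3 source: the function names and regex here are illustrative assumptions. The first version performs one full-string replace() per (non-deduplicated) match, giving O(N) passes of O(N) each; the second deduplicates first, capping the loop at 484 distinct encodings.

```python
import re

# Matches one percent-encoded byte, e.g. "%2f" (hypothetical regex,
# mirroring the kind of pattern the advisory describes).
PERCENT_RE = re.compile(r"%[a-fA-F0-9]{2}")

def normalize_percent_quadratic(url):
    # Vulnerable shape: percent_encodings holds ALL matches, up to O(N)
    # entries, and each replace() scans the whole URL -> O(N^2) total.
    percent_encodings = PERCENT_RE.findall(url)  # not deduplicated
    for enc in percent_encodings:
        url = url.replace(enc, enc.upper())
    return url

def normalize_percent_dedup(url):
    # Deduplicated shape: at most 484 distinct encodings exist
    # ((10 digits + 2*6 hex letters)^2), so this is O(k*N), k <= 484.
    for enc in set(PERCENT_RE.findall(url)):
        url = url.replace(enc, enc.upper())
    return url
```

Both functions produce the same normalized output; only the amount of redundant work differs, which is why a long URL packed with repeated percent-encodings (e.g. "%2f" repeated thousands of times) drives the first version's CPU time up quadratically.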
{ "nvd_published_at": "2020-03-06T20:15:00Z", "cwe_ids": [ "CWE-400" ], "severity": "HIGH", "github_reviewed": true, "github_reviewed_at": "2021-04-22T22:04:31Z" }