jonathanbean
And here you can see the bad side of heuristics as well: false positives! Pretty much all false positives are the result of trying to extend detection criteria to cover future versions too. Since those future versions don't exist yet, the algorithms have to be broadened, and that is where things can go wrong.

Two fundamental goals in computer science are finding algorithms with provably good run times and with provably good or optimal solution quality. A heuristic is an algorithm that gives up one or both of these goals; for example, it usually finds pretty good solutions, but there is no proof that the solutions cannot get arbitrarily bad; or it usually runs reasonably quickly, but there is no argument that this will always be the case.
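To make that trade-off concrete, here is a small, hypothetical sketch (nothing from an actual scanner): a greedy heuristic for the 0/1 knapsack problem. It runs fast, but it gives up the optimality guarantee, so for some inputs its answer is far from the best possible one.

```python
# Hypothetical illustration: a greedy heuristic for the 0/1 knapsack problem.
# It is fast (one sort), but there is no guarantee the result is optimal --
# for some inputs the greedy answer can be far from the best possible one.

def greedy_knapsack(items, capacity):
    """items: list of (value, weight) tuples; returns (total_value, chosen)."""
    # Heuristic choice: prefer items with the highest value-per-weight ratio.
    ranked = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total_value, remaining, chosen = 0, capacity, []
    for value, weight in ranked:
        if weight <= remaining:
            chosen.append((value, weight))
            total_value += value
            remaining -= weight
    return total_value, chosen

# Example where the heuristic is clearly suboptimal:
# a "dense" small item worth 60 vs. one big item worth 100.
print(greedy_knapsack([(60, 10), (100, 50)], capacity=50))  # picks (60, 10) -> 60
# The optimal choice would be the single (100, 50) item -> 100.
```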
That's something we've been using nearly since the beginning. A simple example: when we're looking for a static file (let's say some minor threat that never gets updated), we know its properties: size, name, static checksum, and so on. Now when we're scanning and find a file with the right name, we check the size first; if the size doesn't match, we don't even have to compute the checksum, because it can't be a match anyway (a minimal sketch of this is shown below).

In any searching problem where there are b choices at each node and a depth of d at the goal node, a naive searching algorithm would potentially have to search around b^d nodes before finding a solution. Heuristics improve the efficiency of search algorithms by reducing the branching factor from b to a lower constant b', using a cutoff mechanism.
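Going back to the static-file example, here is a minimal sketch of the cheap-check-first idea. The file name, size, and checksum values are made-up placeholders, not real detection data; the point is only that the size comparison short-circuits, so the expensive hash is computed only when it could still matter.

```python
# Hypothetical sketch of "cheap checks first"; the known name, size and
# checksum below are placeholders, not real detection data.
import hashlib
import os

KNOWN_NAME = "sample.bin"   # assumed file name of the static threat
KNOWN_SIZE = 24_576         # assumed size in bytes
KNOWN_SHA256 = "0" * 64     # assumed checksum (placeholder value)

def matches_known_file(path):
    # Cheapest check first: the name.
    if os.path.basename(path) != KNOWN_NAME:
        return False
    # Next cheapest: the size. If it differs, the checksum can never match,
    # so we skip hashing the file entirely.
    if os.path.getsize(path) != KNOWN_SIZE:
        return False
    # Only now pay for the expensive check: hash the whole file.
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha256.update(chunk)
    return sha256.hexdigest() == KNOWN_SHA256
```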
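For the search side, one common cutoff mechanism is beam search. The sketch below is generic (expand, score and is_goal are placeholders the caller would supply): at each depth the heuristic score decides which b' children survive, so the search explores roughly b'^d nodes instead of b^d.

```python
# Hypothetical sketch of a cutoff mechanism (beam search): instead of
# expanding all b children of a node, a scoring heuristic keeps only the
# best few (b'), shrinking the search from roughly b^d to b'^d nodes.

def beam_search(start, expand, score, is_goal, beam_width=3, max_depth=20):
    """expand(node) -> child nodes; score(node) -> lower is more promising."""
    if is_goal(start):
        return start
    frontier = [start]
    for _ in range(max_depth):
        children = [child for node in frontier for child in expand(node)]
        goals = [c for c in children if is_goal(c)]
        if goals:
            return goals[0]
        # The cutoff: keep only the beam_width most promising children.
        frontier = sorted(children, key=score)[:beam_width]
        if not frontier:
            break
    return None

# Toy usage: build digit strings until one has digit sum 10, preferring
# strings whose digit sum is already closest to 10.
found = beam_search(
    start="",
    expand=lambda s: [s + d for d in "123456789"],
    score=lambda s: abs(10 - sum(int(c) for c in s)),
    is_goal=lambda s: sum(int(c) for c in s) == 10,
)
print(found)  # e.g. "91"
```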