Recommendations

At this stage, we would like to offer a few useful tips:

1. Do not attempt to cover the entire Internet with your search: it does not improve the results, it only generates completely aimless traffic.

2. Be sure to read the article about the Deep scan limitation. Diving to a depth of 5 links from the initial document is often enough to cover an entire site (a minimal depth-limited crawl sketch follows this list).

3. When using the authorization mechanisms, remember that sites of this kind are protected against excessive network activity in 99.9% of cases. If you do not want to lose your account on such a site, configure the anti-flood filter in advance, before the project starts, not after the site's administrator has already blocked your account (a simple rate-limiter sketch appears after this list).
Just imagine how it looks from the site owner's point of view: one user browsing 10 pages simultaneously at a rate of 5 pages per second... Of course it's a bot!

4. If you do not want to see the words "most likely a virus on your computer..." every time you try to find something on Google, turn off direct scanning of search engines: you will not find anything you are looking for there anyway, but you are one hundred percent guaranteed to get banned.

5. Using dynamic replacement filters can significantly increase the number of results found, but do not get carried away: they greatly increase the load on the processor and memory (see the expansion sketch at the end of this list).
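
To make tips 1 and 2 concrete, here is a minimal sketch of a depth-limited crawl confined to a single site. It is not the tool's actual implementation; the names START_URL and MAX_DEPTH, and the choice of the requests and BeautifulSoup libraries, are illustrative assumptions.

    # Sketch: breadth-first crawl, confined to the start host (tip 1)
    # and cut off at a fixed link depth (tip 2).
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests                      # assumed HTTP library
    from bs4 import BeautifulSoup        # assumed HTML parser

    START_URL = "https://example.com/"   # illustrative starting document
    MAX_DEPTH = 5                        # tip 2: 5 links usually covers a site

    def crawl(start_url, max_depth):
        start_host = urlparse(start_url).netloc
        seen = {start_url}
        queue = deque([(start_url, 0)])  # (url, distance from the start page)
        while queue:
            url, depth = queue.popleft()
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue
            # ... save or index `html` here ...
            if depth == max_depth:       # do not follow links any deeper
                continue
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                link = urljoin(url, a["href"])
                # tip 1: stay on the original site, not the entire Internet
                if urlparse(link).netloc == start_host and link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
        return seen

    crawl(START_URL, MAX_DEPTH)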
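For the anti-flood filter in tip 3, the sketch below shows the underlying idea, assuming a simple fixed-interval limiter; the tool's real anti-flood settings are not reproduced here.

    # Sketch: a fixed-rate anti-flood limiter (tip 3). One request per
    # second looks far more human than 10 parallel pages at 5 pages/s.
    import time

    class RateLimiter:
        """Allow at most max_per_second requests per second."""
        def __init__(self, max_per_second):
            self.min_interval = 1.0 / max_per_second
            self.last_request = 0.0

        def wait(self):
            now = time.monotonic()
            sleep_for = self.last_request + self.min_interval - now
            if sleep_for > 0:
                time.sleep(sleep_for)
            self.last_request = time.monotonic()

    limiter = RateLimiter(max_per_second=1)
    for url in ["https://example.com/a", "https://example.com/b"]:
        limiter.wait()
        # ... fetch url here ...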
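For tip 5, the exact syntax of the dynamic replacement filters is not described in this section, so the sketch below assumes a hypothetical numeric-range placeholder of the form {1-100}, purely to illustrate why expansion multiplies the CPU and memory load.

    # Sketch: expanding a hypothetical {a-b} range placeholder (tip 5).
    # The real filter syntax of the tool may differ.
    import itertools
    import re

    PATTERN = re.compile(r"\{(\d+)-(\d+)\}")

    def expand(template):
        """Yield every URL produced by substituting the {a-b} ranges."""
        ranges = [range(int(a), int(b) + 1) for a, b in PATTERN.findall(template)]
        for combo in itertools.product(*ranges):
            url = template
            for value in combo:          # replace placeholders left to right
                url = PATTERN.sub(str(value), url, count=1)
            yield url

    # Two ranges of 100 values each already produce 10,000 candidate
    # URLs; this multiplication is the source of the extra load.
    urls = list(expand("https://example.com/page{1-100}/item{1-100}"))
    print(len(urls))  # 10000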