Click tracking to exclude bogus clicks


My project refers users to external news sites via hyperlinks.

I want to track the most popular news item over a period of time and highlight this in different ways.

My initial idea was to send the member through external.php?ID=, where I would add a record to a database capturing the news ID, the time, and perhaps the IP address.

I can remove double clicks with a time check, ignoring anything within 30 minutes or within the same session, but how can I remove those damn bots?
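The 30-minute de-duplication described above could be sketched roughly like this (a hypothetical sketch only: the `clicks` table, its columns, and the function name are assumptions, not anything from the original project):

```php
<?php
// Hypothetical sketch: count a click only if the same IP has not clicked
// the same news item within the last $window seconds (default 30 minutes).
function should_count_click(PDO $db, int $newsId, string $ip, int $now, int $window = 1800): bool
{
    $stmt = $db->prepare(
        'SELECT COUNT(*) FROM clicks
         WHERE news_id = ? AND ip = ? AND clicked_at > ?'
    );
    $stmt->execute([$newsId, $ip, $now - $window]);
    if ((int) $stmt->fetchColumn() > 0) {
        return false; // recent click from this IP: treat it as a double click
    }
    // First click in the window: record it and count it.
    $db->prepare('INSERT INTO clicks (news_id, ip, clicked_at) VALUES (?, ?, ?)')
       ->execute([$newsId, $ip, $now]);
    return true;
}
```

The same-session variant would simply check a flag in `$_SESSION` instead of querying by IP.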

Or is there a better way of doing this?


Well, you can turn away legitimate robots using a “robots.txt” file.
This will not stop malicious bots, as they simply ignore robots.txt…

There is a lot of info on this file online; here is one page I found quickly: robots.txt
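For reference, a minimal robots.txt that asks well-behaved crawlers to stay away from the tracking script might look like this (the /external.php path is taken from the question; adjust it to wherever the counter actually lives):

```
User-agent: *
Disallow: /external.php
```

Again, only polite bots honor this; it does nothing against scrapers that ignore the file.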

But, to only count your own users, have all your links go to an intermediate page that counts them and then redirects to the live site, after checking the login info stored in the session data. That way, the only clicks counted are valid ones from logged-in members. Or, if your site does not include logging in, your script can check the calling page and make sure the click is coming from your pages and not others.

Just some ideas to start with…