Tracking PPC Campaigns via Clicky

This is a post meant for my clients, but if you use Google Analytics, or 'GA', you could benefit from it too. To save you a little reading: never use GA or Facebook tracking on your website, app or any digital platform. Moreover, never take any advertiser's own statistics as fact; a newspaper's circulation number is a good example. Digital ads claim to be 100% accurate, and this is very untrue.

I advise all my clients not to rely on Google or Facebook or any advertiser for metrics of any kind, and I am especially wary of the numbers ad platforms like Facebook Ads and AdWords generate. Google and Facebook both have less than perfect reputations, and they seem to massively round up data. I won't add anything libellous here; for that you can speak to me in person :) I have never seen accurate numbers from either platform. I have been working with PPC campaigns since long before Google was even a company. Much as, today in 2022, they have acquired all facets of this industry, there was an industry before Google and there will be one after them.

Quick Links

The post is broken up into 3 sections, and you can jump to them below:

  1. What are tracking Variables & Landing Pages
  2. Setting up to track AdWords or any Ad
  3. Setting up AdWords, or any Paid link exchange to be tracked

In this post we will look at adding tracking variables in AdWords and then listening for them in Clicky. Please have paper and a pencil at the ready ;)

1. Tracking Variables: What are they?

Below is the format I would advise; it is 'saying': the advert is AdWords, campaign #1.

You can test this idea here:

When you click this link, the regular home page will open, but I can now be 100% sure you clicked this link in this post. The logic here is to use a variable that is faux, or not real. Were you to replace 'ad' with 's' you would trigger WP's built-in search:

You can test this idea here:
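If you want to see how such a tagged link is put together, here is a minimal Python sketch. The domain and the `ad` parameter name are placeholders, assuming you tag ads as 'adwords-1' in the format described above:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def tag_url(base_url, ad_source, campaign_no):
    """Build a link with a faux tracking variable, e.g. ?ad=adwords-1.

    WordPress ignores query variables it does not recognise, so the
    page renders as normal -- but the click is still visible in stats.
    """
    return f"{base_url}/?{urlencode({'ad': f'{ad_source}-{campaign_no}'})}"

print(tag_url("https://example.com", "adwords", 1))
# https://example.com/?ad=adwords-1

# Swapping 'ad' for 's' would instead trigger WP's built-in search:
search_url = "https://example.com/?s=adwords-1"
print(parse_qs(urlparse(search_url).query))
# {'s': ['adwords-1']}
```

The point is the parameter name: anything WordPress does not reserve passes straight through, which is what makes the tag 'faux'.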

Landing Pages

This is a term that really gets misused. A landing page is a PPC 'thing': it's a web page, orphaned from a site, that is used to gate advertising content.

By 'gate' I mean this is a page you track via, and no other traffic is sent there; it is also hidden from search, and de-indexed if it does happen to be indexed.

You can test this idea here:

Stopping Pages Indexing

Google will use your 'robots.txt' file, kind of. If this is the first time you are reading about the robots directives and file, check out this file here:

Imagine this robot as your virtual bouncer: he tells folks not to come in. He's Marvin the android, guarding your car park 🤖

A ‘please don’t index me Google’ check list:

  • Set the meta tags on the page via the SEO plugin. If you are one of my clients you will most likely be using de-bloated Yoast, but any SEO setup will let you set the meta tags Google reads when it crawls.
    • Within all the sites I build, you can find this drop-down menu within the Yoast SEO plugin settings and set it to 'No'. All this does is add a 'noindex' meta tag. Please note this is to be done on the landing page only, and it does not mean Google will adhere to it:
    • If you manually manage that (fair play), you can add the tags in whatever tool you use to add markup to the header of your HTML.
  • Tell Marvin this page is barred from crawling via a disallow: /landing-page/ line in the robots.txt. This is very easily done via the CMS (note that the SiteGround Security plugin will block it). I can help any folks with longer-term URLs for PPC; it would not be very common to keep this page live for very long, but if you did, this would be part of your mix. Landing pages can also sit in a sub-folder which is blocked from Google. Whether a page or a folder, Marvin's list is easy to add to:

    • If you are a code warrior, I salute you, and this should be simple for you to keep on top of: the robots file is a simple .txt/'text' file you can update via FTP. In WordPress no such file exists per se, as it is CMS-generated, and also managed via plugins. Anyone on a CMS editing it over FTP has an issue.
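As a sketch of what Marvin's list might look like, here is a minimal robots.txt. The paths are hypothetical, standing in for a landing page and a blocked sub-folder; yours will differ:

```txt
User-agent: *
Disallow: /landing-page/
Disallow: /lp/
```

The equivalent of the Yoast 'No' setting is the tag `<meta name="robots" content="noindex">` in the page's `<head>`. Remember Google treats both of these as requests, not commands.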

Removing Indexed Pages

Google will index pages set to noindex in meta, and it will barge past Marvin/robots.txt directives all the time. It is unclear why, but it happens. There is a Removals tool in GSC (Google Search Console) that can then be used.

This tool is simple and you can see its interface below:

You can see here a request to remove /portfolio_page/. This is a custom post type which powers the work section of this site, and the feeds on other pages; Google was told via robots and meta not to index these pages, but did. This is a live example of why this tool is required generally, and it is also very important for PPC.


Ok, so all this might seem super technical for paid ads. "These things are easy", right? But this is to protect the gating idea. If a page indexes it can get traffic, and if it gets other traffic we can't effectively see if the ads are driving leads or if this page/post has leaked onto the internet. It's a private walled garden where only those lucky enough to see the adverts go. With the initial tagging idea, this cannot happen, as you don't have a landing page: the only place this tag could be added is via the ad. Or by a malicious person replicating it, but this is also true of all web stats.

2. Setting up Tracking Variables

Clicky Goals are super simple, and they require no additional code on conversion pages. For background, GA and many older tools use the idea of placing code on a page; Clicky simply listens for visits to that page, so it's one page, and below shows the setup:


  1. Login, choose the correct account
  2. Head to Goals, then Setup
    1. Name the Goal; I also name my campaigns in the advertiser's dashboard in the same way
    2. Add the URL. Note / is the root of your domain, so from / you can construct more complex versions of this using wildcard operators such as *
    3. Choose an Icon. This may seem optional but it is really advised: it helps you visually spot these leads in Spy and other advanced areas of Clicky in real time
    4. You can also get excellent, honest advice in Clicky via the '?' icons. Google's advice is much like a casino's; Clicky has no advertising product, it is a paid tracker, and they are focused on accuracy and are platform agnostic
    5. Update
  3. Test this via the tracking URL.
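If you want to sanity-check a wildcard goal pattern before saving it, shell-style globbing behaves much the same way. This is a local sketch only, using hypothetical paths; Clicky's own matching rules are documented behind those '?' icons:

```python
from fnmatch import fnmatch

# A hypothetical goal URL pattern, as you might enter it in Goals setup.
goal_pattern = "/lp/*"

visited_paths = ["/lp/offer-1/", "/lp/offer-2/", "/about/"]
matched = [p for p in visited_paths if fnmatch(p, goal_pattern)]
print(matched)  # ['/lp/offer-1/', '/lp/offer-2/']
```

A quick check like this catches a pattern that is too broad (matching pages you never meant to count) before any ad spend depends on it.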


  1. Do not reuse tracking tags while ads are running. If you always use one ad unit/campaign then this can be left as is and the Google side of things changed, or you can have multiple IDs
  2. Update. From this point on this data is available. If you set up ads first and forget the tracking you can find this data in other ways, but ideally you set this up first and then the ad goes live
  3. Test this. Remember all admin users will correctly be excluded from the site's traffic, so use a browser you don't normally use, like IE or Chrome. I advise Firefox to all working people online; it is the best browser for this kind of task. Chrome or IE/Edge are 100% not advised as working tools, but can be used just to test.

This is one part of the process, we will now make sure our Ad units are tagged in the same way.

UTM Codes and Clicky

Clicky supports them, but the idea is madness. Google has the UTM, or 'Urchin Tracking Module', approach, where this concept is very overdone: heaps of variables are added, all of which are unrelated and not defined as unit numbers. This allows Google to use this logic but dilute it under the guise of auto-tagging. Some sites use this on buttons, others on ads. It is a terrible setup. But if your agency partners use it, or you take over PPC done wrong, Clicky will automatically work: it will detect all UTM codes added and arrange them for you in reports. I would not rely on this or any Google tech; all of it ends up here: over a long enough timeline. Also, you are being trained by the advertiser, not learning the industry. UTMs mostly work with Google only. The unbranded method in this post works with HTML tracking in any medium.
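To make the contrast concrete, here is a sketch comparing a typical UTM-tagged link with the single faux variable used in this post. Both URLs are made up for illustration:

```python
from urllib.parse import urlparse, parse_qs

# Five coupled variables, named for you by the advertiser:
utm_link = ("https://example.com/?utm_source=google&utm_medium=cpc"
            "&utm_campaign=spring&utm_term=shoes&utm_content=ad1")
# One variable, one ID, named by you:
faux_link = "https://example.com/?ad=adwords-1"

utm_vars = parse_qs(urlparse(utm_link).query)
faux_vars = parse_qs(urlparse(faux_link).query)

print(len(utm_vars))   # 5 separate values to keep consistent
print(faux_vars)       # {'ad': ['adwords-1']}
```

Every extra variable is another thing to mistype or mismatch across ads, which is part of why the single-tag method is easier to keep honest.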

3. Setting up AdWords, or any Paid link exchange to be tracked

Before we discuss this, let's explore the concept of a Trojan horse. Whether the Greeks really built a massive wooden donkey or not, all tracking tags are a type of Trojan, some ethical, some not. By this I mean they give your data, and your clients' data, to someone else: an agent within the software is taking data offsite to another source. Google and Facebook insist that you need to 'tag' your site to run any sort of advertising, and you might read hundreds of articles on this. For context, I have built my own ad-serving network, a small one based on Flash and PHP, but most PPC folks have little to no idea of what they are discussing in this area. All while security experts, and now humanitarian groups, are screaming about the impact on society and the next generation of this idea, let alone its implementation by these huge tech giants.

Google, for example, offers what is termed sampled data. Facebook is now actively blocked by many iOS users, and still claims its data is rock solid. You can read past the first page of Google results on either brand in any given week and find some anti-trust or data-leak scandal. Recently some very brave folks have stood up to blow the whistle; this is an endemic issue in this industry.

The fish rots from the head down. Most SEM experts of the last decade parrot Google and Facebook's rhetoric on the accuracy of stats, all while adding obtrusive tags to help the advertiser track us all. Even if this stuff worked I would be against it, but from a purely data-science and marketing point of view, none of these companies are audited or allow anyone to validate the data they produce. It is black-box, sampled, and most certainly skewed towards positive/upward-rounded numbers.

Recommendations in Ad Serving Platforms are Up-Sells, or Asks to Trade Data for 'Results'

Your ad is set up when it is live and you can see traffic for it in Clicky. This will happen pretty slowly relative to the spend, but you will then be told you're not finished yet. This is the phase to ignore: anything asking you to link AdWords or add code to your site is stranger danger online. Any person asking to track your site is like a person installing CCTV in your lobby or staff room; it's really odd, and it won't have any impact on the ads.

You will see 'Conversion Tracking', or a variant of this buzzword, as the first 'recommendation' Google and Facebook push, and oh boy will they push it. Why? You have to ask: why is this advertiser so keen to help me track my 'conversions' when they sell clicks? Confusing, isn't it? But it is really simple: if every website is tagged this way, they have a network of sites, all tracked. As you move from site to site and then back to Google/Facebook, eureka! They have you at every turn. If you then use Google Chrome or stay logged into Facebook/Meta apps, you are tracked further still.

This is then used to tell you how to buy more ads, all while your data is mixed into a huge reservoir of servers and partners, sold/traded and warehoused forever.

Adding Your Tracking Tagged Links

This is the simplest part of adding your ad unit. Whether a smart ad, a text ad, a banner, or even a direct link exchange or a simple trafficked banner ad on a site, this tagging logic will work.

Below is a Google example:

Once you have your tagged link set up in Clicky, you can then add it here, or in any ad unit or ad exchange. This idea will work with any 'href'-based link, which is any link in HTML.
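For reference, here is the same idea in raw HTML. The domain and parameter are placeholders; use whatever goal you set up in Clicky:

```html
<!-- Any href-based link can carry the tag; this works in a text ad,
     a banner wrapper, or a plain link exchange alike. -->
<a href="https://example.com/?ad=adwords-1">Visit our site</a>
```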

Please note the example here is to illustrate; your link would be added here, and then you are done. Keep an eye on the numbers Google reports vs the real clicks, the ones that actually make it to your site. Also, you can review what is termed Click Fraud; I could write a book on that topic, but it's one to understand before you see the numbers Google or any advertiser 'reports'.

The rule of thumb with all ads is to be targeted and, where possible, to track all you can. But the way the ads are trafficked has now become a smoke screen for what they do. If you are my client you can talk to me any time about this; if you would like to review more of why I use Clicky, please see the link below:

or if you are thinking about buying Clicky:

That link has my affiliate ID and is another great example of tracking via links, as discussed here. Full disclosure: the link is affiliated, but I am not connected with this software in any way, other than being an avid user of it for almost a decade. All screenshots in this post are used under fair usage, as this is an educational piece for my clients.