“Nothing gold can stay,” said the poet almost a whole century ago. Go figure.
Even if this means that Frost would have been insulted by the longevity his poem has achieved, condemning itself as it does by the very standard it describes, he’d have to agree that merely writing or publishing a poem isn’t worth nearly as much as having it remain in people’s minds.
The same goes for links – as a lot of SEOs are too eager to forget – just creating a link won’t do you much good if that link is removed soon after, or if the site holding it goes to the dogs in terms of quality.
That’s why the links you’ve created (or hired someone to create) will only keep working for you if you: A) Make sure you are only reaching out to relevant and reputable websites; and B) Keep periodically coming back to those links and checking if they have been removed, or if the site linking to you has changed so much that you would prefer them removed.
If you are doing SEO for yourself or for just a couple of clients, you can do both manually. However, inbound marketing agencies with dozens or hundreds of clients couldn’t even dream of spending this much time on prospect assessment or backlink audits, and usually have to look for SEO tools that can help them do this – or, as in our case, develop those tools from scratch.
So, whether you are a one-man SEO agency, a huge comprehensive inbound services provider, or a business owner who either wants to take care of your own SEO or simply monitor the performance of someone you’ve hired to take care of it for you, here’s how to keep your backlink portfolio pristine, with or without help from specialized tools.
The number of conditions that a site needs to meet in order to qualify as an interesting backlink opportunity can seem daunting. Especially if you consider the fact that, while you might benefit enormously from a link on a particular site, not everyone would get the same value from it. That’s why manual prospecting is not always about determining which sites meet all the required conditions; instead, it often turns into scanning the site you’re analyzing for unforgivable faults, and once you find any, removing the site from the list of potential prospects. After all, it’s much easier to determine that a site has one negative feature than it is to verify that it has a bunch of the ones recommending it. This process of elimination begins the second you start your search.
Once you enter your query, you can whittle down the offered results to only the most pertinent ones. So, before you even begin looking at specific results, you can eliminate the ones you know you don’t want in a number of ways:
- See more results per page by adding the &num=100 search parameter to the end of your search URL string.
- Limit results to a specific period, either through a date range (2011..2013), through the more convoluted Julian-date-reliant daterange: operator, or through URL parameters (as_qdr=x, replacing “x” with “d” for Day, “w” for Week, “m” for Month and “y” for Yeah, we get it!), whichever way is more convenient for you.
- Exclude unwanted results by adding the -site: operator to your query, followed either by the specific site you don’t want to see in the SERP (-site:facebook.com) or an infamous TLD (-site:.info).
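If you’d rather assemble these search URLs programmatically than retype the parameters every time, here’s a minimal Python sketch of the idea – the keyword and excluded sites are just hypothetical examples:

```python
# A minimal sketch of assembling a Google search URL with the parameters
# described above. Keyword and exclusions are hypothetical examples.
from urllib.parse import urlencode

def build_search_url(query, exclude_sites=(), results=100, date_range=None):
    """Compose a Google search URL with exclusions and freshness filters."""
    # -site: operators are appended straight to the query string
    full_query = " ".join([query] + [f"-site:{s}" for s in exclude_sites])
    params = {"q": full_query, "num": results}  # &num=100 -> 100 results per page
    if date_range:
        params["as_qdr"] = date_range           # "d", "w", "m" or "y"
    return "https://www.google.com/search?" + urlencode(params)

print(build_search_url("guest post gardening",
                       exclude_sites=["facebook.com", ".info"],
                       date_range="y"))
# https://www.google.com/search?q=guest+post+gardening+-site%3Afacebook.com+-site%3A.info&num=100&as_qdr=y
```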
Once you have a final set of results, it’s time to take a look at each of them individually. You can do this from the SERP itself, or you can export them to a spreadsheet and do it from there (the Scraper Chrome extension does a wonderful job of this, allowing you to copy not only result URLs but also page title and description elements). For the sake of simplicity, let’s assume you’ll be looking directly at the SERP. So, what is it that you can tell about a website, without even clicking on a result?
If a domain name contains too many total characters, numbers or hyphens, you might consider skipping the site (a quick query with the site: operator can also tell you how well it’s indexed). It takes some experience to reliably decide how many is “too many”, and even then, you might miss out on a great prospect just because the webmaster decided to get creative with their domain name, but in the long run, the time you save will more than make up for the wasted opportunities. Likewise, if the root domain contains words like “article directory”, “submit”, “bookmarks”, “infographics”, etc., they might just be remnants from the Dark Ages of SEO, and the site should be either skipped or, at least, scrutinized with extra caution. Once you’re done with the browsing stage, it’s time to do some serious clicking. Now, we are aware that a lot of results will lead to deep pages, but let’s assume that your first direct contact with a site will be made through its homepage.
You’ve read the advert, and now you’ve bought the car, so how does it feel? Did you get what you expected when you clicked on the result in the SERP? If not, you’ve experienced the same kind of user frustration that others visiting the site may feel, and this might be a red flag on its own, but one that might be purely context-dependent, and that shouldn’t cause you too much worry. However, site homepages are exactly where you’ll usually find the most blatant and unforgivable SEO transgressions, common sense violations, and linking-ethics faux pas. Here’s how to identify them.
If you’ve gone through these points and the site still hasn’t disqualified itself as a prospect, you’re almost there; just one more of its critical segments left to examine.
Ideally, beautiful, informative, heartfelt, accurate and engaging content would be everything you need to reach people. But Google will ignore you or misinterpret your intentions if you fail to mention the relevant keywords often enough, and readers will pass you over if you don’t have plenty of images, minuscule paragraphs, and enough shiny things to keep their attention by slightly distracting them. So, sadly, just enjoying someone’s content is not nearly enough of a reason to try and get a link from them. Here are the other conditions they have to meet.
If you’ve ever read anything about content before, chances are you’ve seen people ascribe royal origins to it. And what is true royalty if not honest, direct, attention-grabbing and eloquent – but not to the point where the people can no longer understand its proclamations? If that’s the kind of content the site you’re considering is publishing, you just might want to go ahead and try to get in their good graces.

All of them. The ones leading from the site, and the ones pointing to it. Are they linking out to suspicious sites, which would place you in a bad neighborhood, or have they resorted to dubious link building tactics in the past, leaving their backlink portfolio as sketchy as a drawing of a stick figure? Have they been getting or handing out links with suspicious, unnatural anchors, or have they gone so far as to cloak outgoing links by hiding them from visitors? Is their ratio of total backlinks to unique referring domains skewed too heavily in favor of the former? Are you likely to get a dofollow or a nofollow link from them? If you really want to get into it, this part of the analysis is usually the most time-consuming one; and while you can allow yourself a slightly more casual review of this aspect of a site, you must never disregard it completely if you are hoping for any kind of success with your link building.
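To give you an idea of what that ratio check looks like in practice, here’s a rough Python sketch that runs over a generic backlink export – the file name and column names are assumptions, so adjust them to whatever your backlink tool actually exports:

```python
# A rough sketch of the ratio and nofollow checks described above, run
# against a generic backlink export. The CSV file name and its
# "referring_page" / "nofollow" columns are assumptions.
import csv
from urllib.parse import urlparse

total_links, nofollow_links, domains = 0, 0, set()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        total_links += 1
        nofollow_links += row["nofollow"].strip().lower() == "true"
        domains.add(urlparse(row["referring_page"]).netloc)

ratio = total_links / max(len(domains), 1)
print(f"{total_links} backlinks from {len(domains)} domains (ratio {ratio:.1f}:1)")
print(f"nofollow share: {nofollow_links / max(total_links, 1):.0%}")
if ratio > 10:  # the threshold is a judgment call, not a universal rule
    print("Warning: many links per domain -- possible sitewide/footer links")
```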
There you go. Now do this for each and every prospect you come across, and in just a couple of short hours, you may have yourself a respectable list of some 10 to 20 decent prospects. OK, that might be a bit of an exaggeration – you’ll probably do a bit better than that even with completely manual prospecting – but now have a look at how it all works when you have a specialized link prospecting tool at your disposal; in this case, our own Dibz.
People are often wary of automation because they feel like it precludes subtlety. And while some processes cannot be automated without making them clunkier or less sophisticated, there are those that can. Remember how in the manual prospecting section we mentioned that finding the right result is done by eliminating all the rest? This kind of approach is perfect for automation, as it allows you to only eliminate the sites that you are certain are undesirable. So, by raising or lowering your standards, you’ll have fewer or more websites to go through and directly evaluate, but if you are cautious, you’ll never have to worry about missing a potentially interesting prospect. So, here’s how the described process of elimination works with Dibz.
Remember all those vertical searches, parameters, operators and social media info that you had to think about? Most of this will be taken care of by the tool, you just need to set the guidelines. The first part of this is done through the spam filter page.

This page allows you to set your preferred standards for 17 separate ranking factors (apart from the ones you can see in the image, these also include things like domain name length; presence of cloaked links, social profiles or sponsored content; number of external homepage links; total number of the site’s indexed pages, etc.). You can attribute a spam value to each entry, depending on how important you believe it to be. Say, for instance, you don’t like websites with too many ad blocks. Dibz gives you a chance to decide how many is too many (in this case, 5) and to determine how many “spam points” (36, in our case) will be added to the site’s total if it exceeds that number. You can specify these values for each of the 17 factors, and then decide how many points a site can accumulate before it is removed from the list of filtered results in Dibz (you can still find these sites in the ‘all results’ tab). So, even before you begin your search, you’ve taken care of 17 separate considerations.
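The logic itself is no more mysterious than this simplified Python sketch – the factor names, thresholds and weights below are made up for illustration; in Dibz you set your own values for all 17 factors:

```python
# A simplified sketch of the spam-point logic described above: each factor
# gets a threshold and a weight, and a site is filtered out once its total
# crosses your cutoff. Factor names, thresholds and weights are invented.
SPAM_RULES = [
    # (factor, threshold, points added when the threshold is exceeded)
    ("ad_blocks",            5,  36),
    ("domain_name_length",  25,  20),
    ("external_home_links", 80,  25),
    ("cloaked_links",        0,  50),
]
SPAM_CUTOFF = 60  # sites scoring above this land in 'all results' only

def spam_score(site_metrics):
    """Sum the points for every factor that exceeds its threshold."""
    return sum(points for factor, threshold, points in SPAM_RULES
               if site_metrics.get(factor, 0) > threshold)

site = {"ad_blocks": 7, "domain_name_length": 12, "cloaked_links": 1}
score = spam_score(site)  # 36 (ads) + 50 (cloaking) = 86
print(score, "-> filtered out" if score > SPAM_CUTOFF else "-> kept")
```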
The same page also allows you to provide a list of preferred TLDs, so you can request them from the search page; to define research type templates; and to compose a list of websites that you never want to see returned as results (for instance, social networks, which otherwise thickly populate results pages and are not interesting as link building prospects).
Once you’ve done this, it’s time to move on to search proper. Here’s what this looks like in Dibz.

While the right-hand side provides great time-savers on its own – preferred language for the results, desired TLD and date range – the truly important part is on the left, where you enter the actual values: basic search terms, as well as specific modifiers that you want to combine with those terms.
What does this do that Google can’t? Nothing, but it does some things that Google won’t:
The convenience of different filters and clearly displayed essential metrics aside, what we are most proud of, and what is probably the main benefit of Dibz, is the ability to perform a string of operator-modified searches in succession – something that you couldn’t do in Google without having to complete a new captcha every couple of minutes. It’s difficult to believe how much time this saves you until you try it.
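To make that concrete, this is roughly the kind of operator-modified query list such a setup expands into and runs in one go – a sketch with hypothetical keywords and footprints:

```python
# The kind of operator-modified query list a prospecting tool runs in
# succession: every search term combined with every footprint. The
# keywords and footprints here are hypothetical examples.
from itertools import product

keywords = ['"organic gardening"', '"composting tips"']
footprints = ['intitle:"write for us"', 'inurl:guest-post',
              'intitle:"resources"']

for kw, fp in product(keywords, footprints):
    print(f"{kw} {fp}")
# "organic gardening" intitle:"write for us"
# "organic gardening" inurl:guest-post
# "organic gardening" intitle:"resources"
# "composting tips" intitle:"write for us"
# ...
```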
The sites in the list you are left with satisfy all the machine-measurable standards mentioned in the manual prospecting section, and since the spam filter only needs to be set once, all you had to do to spare yourself the trouble of going through countless sub-par sites was enter your search terms and custom operators.
You still have to visit each of the sites, decide if they are relevant enough, and generally, if they are someone you’d want to deal with. However, since a huge portion of unsuitable websites has been removed by our utility, you’ll get a much higher percentage of valid prospects than you would with even the most refined Google search.
So you’ve made a beautiful link; now you just have to make a record of it. You fire up your trusty spreadsheet of choice and copy the URL of the page linking to you; perhaps make a column for the root domain, purely for organizational purposes; note the anchor text and target; whether the link is dofollow or nofollow; the contact details of the person you’ve negotiated the link with; the date the link went live; if you’re part of an agency, the name of the link builder responsible for the link; the name of the client the link was made for; a brief note about the link; and anything else you might need.
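If you prefer your records structured rather than free-form, here’s one way to pin that list of columns down – a Python sketch using a dataclass; the field names are our suggestion, not a required schema:

```python
# One possible shape for the link record described above; the field
# names are illustrative suggestions, not a required schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class LinkRecord:
    url: str                # page holding the link
    root_domain: str        # purely for organizational purposes
    anchor_text: str
    target_url: str
    dofollow: bool
    contact: str            # person the link was negotiated with
    live_since: date
    link_builder: str       # who created it (agency use)
    client: str             # who it was created for
    notes: str = ""

link = LinkRecord("https://example.com/best-tools/", "example.com",
                  "link building tools", "https://yoursite.com/tools/",
                  True, "editor@example.com", date(2018, 3, 1),
                  "Jane", "Acme Inc.", "negotiated via guest post")
```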
Now, if you are only doing this for yourself, or for a couple of clients, creating this kind of list and retrieving info from it is fairly manageable; but for anyone with more than two or three clients, and employees creating links for them, this quickly turns into an organizational nightmare.
Even if you simply had to record all the links and never look at them again, doing it manually would be a hassle. However, you are not etching them in stone for future archaeologists to find; you will have to keep coming back to them, assessing their suitability for other clients, taking stock of used anchors and targets to prevent over-optimization and ensure appropriate diversity, and, most importantly, checking their health, i.e. whether they are still live, still dofollow, and whether the site holding them is still acceptable.
And this is not even going into monitoring the actual benefits you are getting from the link, like the traffic it is sending your way, which you’d have to turn to Google Analytics and Search Console for. So, just to check if your links are still there, you’d have to visit each of the sites from your sheet, check your link’s target and anchor, and at least glance at the site’s DR. Doesn’t sound too bad? Not even when you realize that you’d have to do this for every link you’ve ever made, and then repeat the entire process at least once a month? We thought so.
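For the simplest part of that routine – checking whether a link is still live and still dofollow – a script can do the legwork. A minimal sketch, assuming the requests and beautifulsoup4 packages are installed; the URLs are hypothetical:

```python
# A minimal link health check: fetch the page from your records, verify
# our link is still present and pointing at the target, and report
# whether it is dofollow. Assumes requests and beautifulsoup4; the URLs
# below are hypothetical.
import requests
from bs4 import BeautifulSoup

def check_link(page_url, target_url):
    """Return (status, dofollow) for our link on page_url."""
    try:
        resp = requests.get(page_url, timeout=10)
    except requests.RequestException:
        return "unreachable", None
    if resp.status_code != 200:
        return f"HTTP {resp.status_code}", None
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        if a["href"].rstrip("/") == target_url.rstrip("/"):
            # rel is a multi-valued attribute; missing rel means dofollow
            dofollow = "nofollow" not in (a.get("rel") or [])
            return "live", dofollow
    return "removed", None

status, dofollow = check_link("https://example.com/best-tools/",
                              "https://yoursite.com/tools/")
print(status, "(dofollow)" if dofollow else "")
```

Loop this over every record in your sheet once a month and you’ve automated the most tedious check; what it won’t tell you is whether the site itself is still worth being linked from.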
Naturally, there are agencies that believe their job is done the moment a link goes live, but they, and by extension their customers, are thus deprived of a complete overview of their backlink portfolio, and of the ability to draft appropriate future strategies. Basically, you can’t plan a trip if you don’t know your starting point.
So, how do you eliminate the monthly spreadsheet juggling, and a carpal tunnel’s worth of clicking?
As our agency grew, so did the number of our clients and employees. Synchronising the efforts of dozens of link builders, some of whom were sharing clients and link contacts, turned out to be an incredible, time-consuming chore, one that opened countless opportunities for human error and miscommunication. Do we already have a link for a client on a particular site? Have we reached out to a site for a client, and if so, how did it go? How many links were created with a specific anchor in a specific period? And finally, what’s the current status of those links, and of the sites they’re on?
After listing each and every issue we had encountered with the manual approach, we set out to create a tool that would eliminate, or at least alleviate, all of those problems. That’s when Base.me was born. With its main purpose being to facilitate link data entry and retrieval, it became a self-regulating system which organizes the workflow, performs scheduled or on-demand health checks of our links, and has basically solved most of our issues with link building management and monitoring. However, since this utility is still being jealously kept for internal use and for a handful of our friends and partners, the only way for you to see what it looks like in practice is to register your interest and perhaps qualify for beta access.
Until that happens, or we make the tool available to everyone interested, how do you replicate some of the features it offers, and cover other essential areas of link monitoring it was never meant to address?
It’s simple: you turn to our latest utility, a beautifully versatile digital marketing KPI dashboard and reporting tool, Reportz. We decided to develop this tool as soon as we took exact stock of how much time we were wasting on manual reporting. While we were working with a more modest number of clients, patching reports together – copying data from the different sources we used to track campaign performance and trying to organize it so that clients had no problem understanding it – wasn’t too unbearable, even though it could take up to several hours per client. However, when you consider the fact that reports had to be created, or at least checked, by people in middle to upper management, whose time is too precious to be spent on hours of copy/paste/format/repeat; and when you also account for the steady, fast-paced growth of our client base, the urgency of finding a way to automate as much of the process as possible couldn’t be ignored for long.
We solved this by creating a tool which can be connected to the data sources you usually rely on for your KPIs; extract the data you specify, through one of our offered templates or your own custom setup; organize, contrast and compare that data in the way a particular client finds optimal; and either send scheduled reports without you needing to move a muscle, or simply remain constantly available to clients, who can check up on the exact, current state of their KPIs.
While it was initially conceived as an SEO agency tool, the fact that it allows for convenient tracking of metrics from a single dashboard makes it just as suitable for business owners who want to keep a watchful eye over the performance of their digital marketing campaigns, whether they are run by someone in-house, or by an outside agency.
This is what the process looks like in practice, for agencies and for DIY SEOs alike – if you want to follow along, Reportz has a free trial, which we encourage you to make use of.

If you’re working in an agency, you’ll probably be able to figure out how to do all of this on your own, but if you can’t, or if you are not a professional SEO, but just want to monitor a campaign someone else is running for you, you can contact us for a live demo, where we’ll guide you through each step of the process.
So, let’s say that you know how to do all of this – which metrics should you take a look at? While you can connect your Base account (if you have been granted access) and monitor your links from a Reportz dashboard, stopping there would mean missing an opportunity to do so much more. Namely, Reportz gives you a way to extract and organize all the metrics you need to track the efficiency of individual links, and of your entire link-building strategy.
Ahrefs – New and lost domains
If you want to be alerted when a new domain links to you, or when you lose links from one, a widget with Ahrefs referring domains data is not just convenient but essential. It allows you to promptly intervene if you want removed links to be restored, or if you realize that domains you would rather not be associated with are linking to you.
Rank Ranger – Page Rankings
Not much to say here – by setting custom dates in Reportz, you can observe the rankings of your target pages in the period you were building links for them, and get another piece of the puzzle.
Google Console – Avg position (pages)
Similar to what you’re getting from Rank Ranger, but coming from your Search Console, and showing average page position in the selected period, along with the position change. The same logic applies – if you see that a page was dropping in rankings while, and soon after, you were building a bunch of links for it, you might want to go back and give those links another look. Likewise, if you see a page has skyrocketed in rankings, and you don’t have anything else to attribute it to, examining the links created in that period might reveal some sites you’ll want to contact again in the future.
Google Console – Top Clicks (queries and pages)
All the SEO talk can make people forget that links are there for the traffic – that is still their purpose and their greatest value. So it makes sense to want to know which of your pages people like coming to, and which anchors seem to be doing a good job of directing them there. Crucial for your overall SEO strategy, this data, when observed over appropriate periods, also shows which of the links you’ve created are actually doing their job instead of just showing up for it.
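If you ever want the same numbers outside of a dashboard, the Search Console API exposes them directly. Here’s a sketch assuming the google-api-python-client package and already-obtained OAuth credentials; the property URL is a placeholder:

```python
# A sketch of pulling top query/page combinations straight from the
# Search Console API. Assumes google-api-python-client; `creds` and the
# siteUrl are placeholders you'd supply yourself.
from googleapiclient.discovery import build

creds = ...  # your OAuth2 credentials, e.g. obtained via google-auth
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://yoursite.com/",
    body={
        "startDate": "2018-01-01",
        "endDate": "2018-01-31",
        "dimensions": ["query", "page"],
        "rowLimit": 25,  # top 25 query/page combinations by clicks
    },
).execute()

for row in response.get("rows", []):
    query, page = row["keys"]
    print(f"{row['clicks']:>5} clicks  {query}  ->  {page}")
```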
Google Analytics – Top keywords and landing pages from organic
Pretty much the same as above, but from a slightly different angle. Might show you data that the previous combination didn’t, so it’s definitely worth your time to add these two widgets as well.
Google Analytics – Organic visits rate
Shows fluctuations in the number of organic visits you were getting in the observed period. Again, correlating your rankings and your link building efforts at a certain time is never completely straightforward, but when you account for algorithm updates, the effects of your other inbound channels, etc., you can get genuinely valuable insights.
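If you want to eyeball that correlation yourself, a few lines of pandas will line up links created per week against organic sessions per week – both CSV files and their column names are assumptions, standing in for exports from your link records and Google Analytics:

```python
# A rough sketch of lining up link building activity against organic
# traffic. Both CSVs and their columns ("live_since", "date", "sessions")
# are assumed exports from your link records and Google Analytics.
import pandas as pd

links = pd.read_csv("links.csv", parse_dates=["live_since"])
visits = pd.read_csv("organic_visits.csv", parse_dates=["date"])

links_per_week = links.resample("W", on="live_since").size().rename("new_links")
visits_per_week = visits.resample("W", on="date")["sessions"].sum()

combined = pd.concat([links_per_week, visits_per_week], axis=1).fillna(0)
print(combined)
print("correlation:", combined["new_links"].corr(combined["sessions"]))
# Correlation is not causation: algorithm updates and other inbound
# channels can move organic traffic just as easily.
```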
Well, yes. Abstain from creating suspicious links, stay vigilant about removing those created without your intervention, and you should be fine. As long as the number of these links is relatively low, you can manage without specialized tools; but for larger volumes, even if you are not an SEO agency but a business owner who wants to keep an eye on the way their site is being promoted, do yourself a favour and give our utilities a go.
