This toolbar is based on the LRT Power*Trust metric that we’ve been using to identify spammy and great links in LinkResearchTools and Link Detox since 2012; the free browser extension itself launched only recently. It helps you quickly evaluate the power and trustworthiness of a website or page as you browse, far more precisely than Google PageRank ever did.
The most basic and straightforward way is to use Google Analytics. And I'm not just talking about installing Google Analytics. I'm talking about using UTM parameters (short for Urchin Tracking Module, a nod to the old Urchin analytics system Google acquired to build its tracking tool) in your campaign URLs so that you know exactly where your visitors are coming from.
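To make this concrete, here is a minimal sketch of how UTM tagging works in practice: you simply append `utm_source`, `utm_medium`, and `utm_campaign` query parameters to the link you share. The helper name and the example URL/values below are illustrative, not part of any official API.

```python
from urllib.parse import urlencode, urlparse

def add_utm(url, source, medium, campaign):
    """Append Google Analytics UTM parameters to a URL.

    Uses '?' if the URL has no query string yet, '&' otherwise.
    """
    params = urlencode({
        "utm_source": source,    # where the traffic comes from (e.g. a newsletter)
        "utm_medium": medium,    # the channel (e.g. email, social)
        "utm_campaign": campaign # the specific campaign name
    })
    sep = "&" if urlparse(url).query else "?"
    return f"{url}{sep}{params}"

tagged = add_utm("https://example.com/post", "newsletter", "email", "spring_launch")
print(tagged)
# https://example.com/post?utm_source=newsletter&utm_medium=email&utm_campaign=spring_launch
```

When someone clicks the tagged link, Google Analytics reads those parameters and attributes the visit to that source, medium, and campaign in its reports.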
Brian, great post as always! Question: Do you consider authority sites (industry portals) a form of “influencer marketing?” e.g. guest blogging, etc? In some niches there are not so many individuals who are influencers (outside of journalists) but there are sites that those in the industry respect. I am in the digital video space and for me one site is actually a magazine that is building a very strong digital presence. Thanks, keep up the good work!
Use the right anchor text. Using our previous example: if you wanted to link internally to the “how to make money” blog post, you could write a sentence in another post such as, “Once you have mastered [how to make money], you can enjoy as much luxury as you can dream of.” In this case, the reader has a compelling reason to click the link because of both the anchor text (“how to make money”) and the context of the sentence. There is a clear benefit to clicking the link.
#6 Go on podcasts! In 13 years of SEO and digital marketing, I’ve never had as much bang for the buck. You go on for 20 minutes, get access to a new audience, and earn great natural links on high-dwell-time sites (the hosts do all the work!). Thanks for including this tip, Brian. I still don’t think the SEO community has caught on to the benefits of podcast guesting campaigns for SEO and more. It’s changed my business for sure.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]