Hack #1: Hook readers in from the beginning. People have low attention spans. If you don’t have a compelling “hook” at the beginning of your blogs, people will click off in seconds. You can hook them in by teasing the benefits of the article (see the intro to this article for example!), telling a story, or stating a common problem that your audience faces.
Today, if you don't understand SEO, you're doing yourself a disservice. Learn the nuances of SEO so that you're pursuing the right traffic-building strategies without bending or breaking the rules. A real understanding of SEO can quite literally supercharge your results, so find a good course or audiobook on the subject and dig in.
Meta tags. Meta tags still play a vital role in SEO. If you type any keyword into a search engine, you’ll see how that keyword is reflected in the title for that page. Google looks at your page title as a signal of relevance for that keyword. The same holds true for the description of that page. (Don't worry about the meta keywords tag -- Google has publicly said that it doesn't pay attention to it, since the tag has long been abused by webmasters trying to rank for certain keywords.)
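As a quick sketch of how a crawler sees the title signal, the page title can be pulled out with Python's standard-library HTML parser. The sample page below is invented for illustration:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the page's <title> tag."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Invented sample document; a real crawler would fetch this over HTTP.
page = "<html><head><title>Blue Widgets | Acme Co</title></head><body></body></html>"
parser = TitleParser()
parser.feed(page)
print(parser.title)  # -> Blue Widgets | Acme Co
```

Checking that your target keyword actually appears in the extracted title is an easy automated audit to run across a whole site.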

Fortunately, Google puts more weight on the anchor text of external links anyway. So as long as some of your external links have your target anchors, you’re probably OK with a “Home” button. In fact, I’ve ranked homepages with a “Home” anchor text nav button for some seriously competitive terms. So it’s not a make-or-break ranking signal by any means.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[39] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
The goal of SEO is to earn a web page a high search engine ranking. The better a page's search engine optimization, the higher it will rank in search result listings. (Note that SEO is not the only factor that determines a page's rank.) This matters because most people who use search engines only look at the first page or two of results, so for a page to get meaningful traffic from a search engine, it has to appear on those first pages, and the closer it sits to the number-one listing, the better. And whatever your page's rank, if your business sells products or services over the internet, you want your website listed ahead of your competitors' websites.

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] Google holds a comparably dominant share in a number of other countries.
Ask a marketer or business owner what they’d like most in the world, and they’ll probably tell you “more customers.” What often comes after customers on a business’ wish list? More traffic to their site. There are many ways you can increase traffic on your website, and in today’s post, we’re going to look at 25 of them, including several ways to boost site traffic for FREE.

Thank you so much for these great SEO techniques you posted on your blog. I also follow you on YouTube and have listened to almost all of your videos; sometimes I re-listen just to refresh my mind. Because of your techniques, we managed to bring our website to the first pages within a month. Adding external links was something I never imagined would work, but it seems like it is. Anyway, please accept my personal thank-you for coming up with and sharing these techniques. I look forward to your new blog posts and YouTube videos!


Awesome tips Brian. Always enjoy your posts. My question is, how can I boost traffic significantly if my keyword has pretty low search volume (around 100 monthly searches based on keyword planner)? I’ve been trying to expand my keyword list to include broader terms like “customer experience” but as you know that is super competitive. Do you have any suggestions for me? Thanks in advance.
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different-sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
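Since snippet lengths vary, it can help to preview roughly where a description might get cut. The sketch below assumes a 160-character budget, which is an illustrative figure, not a documented limit; engines vary actual snippet length by device and query:

```python
def snippet_preview(description, limit=160):
    """Rough preview of how a meta description might be truncated in a
    result snippet. The 160-character limit is an assumption for
    illustration; search engines do not publish a fixed cutoff."""
    if len(description) <= limit:
        return description
    # Cut at the last full word that fits, then add an ellipsis.
    return description[:limit].rsplit(" ", 1)[0] + "…"

short = "Fresh, hand-made widgets shipped worldwide."
print(snippet_preview(short))  # short enough: shown in full

long_text = "widgets " * 40  # 320 characters, clearly over the budget
print(snippet_preview(long_text))  # cut at a word boundary, ends with an ellipsis
```

Running this over every page's description is a cheap way to spot descriptions that will almost certainly be truncated mid-thought.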
Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
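One way to audit that hierarchy is to collect heading levels in document order and flag any jump of more than one level (e.g. an `<h3>` directly after an `<h1>`). A stdlib sketch, with an invented sample document:

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Records the level of every <h1>-<h6> tag in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.levels.append(int(tag[1]))

def skips_levels(levels):
    """True if any heading jumps more than one level deeper than the
    heading before it, breaking the hierarchy."""
    return any(b > a + 1 for a, b in zip(levels, levels[1:]))

doc = "<h1>Guide</h1><h2>Basics</h2><h2>Advanced</h2><h3>Details</h3>"
collector = HeadingCollector()
collector.feed(doc)
print(collector.levels)            # [1, 2, 2, 3]
print(skips_levels(collector.levels))  # False: no level is skipped
```

A document that went straight from `<h1>` to `<h3>` would return `True`, signalling a structure worth revisiting.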
Of course, we are always weighing cost, value, and the likelihood that we can outdo the best content in the vertical. It is almost always the case that low-competition content, although lower in benefit, also doesn't need the same content quality that high-competition terms do, so we can sometimes capture more benefit at a faster velocity by hitting those terms earlier.
Do not be fooled by those traffic sellers promising thousands of hits an hour. What they really do is load up your URL in a program, along with a list of proxies. Then they run the program for a few hours. It looks like someone is on your site because your logs show visitors from thousands of different IPs. What happens in reality is your website is just pinged by the proxy, no one really sees your site. It is a waste of money.
In our research with what we have done for ourselves and our clients, there is a definite correlation between content greater than 1,000 words and better rankings. In fact, we are finding amazing ranking jumps when you have content over 3,000 words, about 12 original images (images not found anywhere else online), 1 H1 (not keyword stuffed), 12 sub-headlines (H2), 12 relevant internal links, 6 relevant external links and 1 bullet list. I know it sounds like a lot of work and a Big Mac recipe, but this does work.
Hey Brian. Even though our own website ranks constantly (last 3 years now) for SEO Companies at Number 1 of Google (obviously when searching from London UK or nearby that is), I still keep reading other people’s posts and sending my own out when I find a gold nugget. However, within your clearly written article I have noticed multiple golden nuggets, and was very impressed by your ‘thinking out the box’ approach, and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
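The well-behaved-crawler side of this can be demonstrated with Python's standard-library robots.txt parser. The rules below are a made-up example; note that, exactly as described above, nothing here stops a non-compliant client from simply fetching the "blocked" URL:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: compliant crawlers are asked to stay out of
# /private/, but the server will still happily serve those pages to
# any browser or rogue bot that requests them.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/public/page.html"))     # True
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
```

`can_fetch` only reports what a polite crawler *should* do; for genuinely sensitive content, use authentication or server-side access control instead.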

Why? Today, we're faced with a plethora of disinformation and misinformation, crafted and concocted by clever minds looking more to extract money from you than help you to earn it. That latest "proven traffic system" that you just plopped down $997 for isn't going to bring you the results you expected. That new video series by the latest raving internet marketer on how you can drive "unlimited" traffic to your website? Nope. That isn't going to work either.


Another example of when the “nofollow” attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site which are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the “nofollow” attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.
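As a rough sketch of that cleanup step, the regex below adds `rel="nofollow"` to anchor tags that don't already declare a `rel` attribute. This is a toy for simple, well-formed markup only; real HTML rewriting belongs in a proper parser, and the widget URL here is invented:

```python
import re

def nofollow_links(html):
    """Add rel="nofollow" to <a> tags that lack a rel attribute.
    A regex sketch for simple, well-formed markup; use a real HTML
    parser for production rewriting."""
    def _rewrite(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # already has a rel attribute; leave it alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\s[^>]*>", _rewrite, html)

widget = '<a href="https://widget-vendor.example/">Powered by Widget</a>'
print(nofollow_links(widget))
# <a href="https://widget-vendor.example/" rel="nofollow">Powered by Widget</a>
```

Applied to a widget's default snippet, this ensures outbound links you didn't editorially choose don't pass link equity.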
Utilize Social Media to build a relationship with your customer base - With the popularity of social media sites like Facebook and Twitter gaining momentum over the past few years, having a social media presence can be a positive extension to your web presence. Sharing content and company announcements via social media allows your customers to share your information within their own social circles through electronic word-of-mouth. Social media also allows your customers to interact with you on a social level through comments, reviews and posts, which makes your business both relatable and responsive to their needs.
We expect advertisements to be visible. However, you should not let the advertisements distract users or prevent them from consuming the site content. For example, avoid advertisements, supplemental content, or interstitial pages (pages displayed before or after the content you are expecting) that make it difficult to use the website.
Guest blogging is a great way to generate free traffic – all you have to invest is the time to write an article. Get in touch with the most popular blogs in your industry and ask if they’ll let you write a guest post. Most website owners will have no objections to having other people write free content for them. (ask Michael how I know…wink, wink) If you’re having trouble finding blogs to guest post on check out www.myblogguest.com – they have a full community of bloggers that are ready and waiting for your content.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up quicker on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
Vary your article length. You should have long, comprehensive articles as well as short and to-the-point articles. Let the content dictate the size; don’t spend too long belaboring a simple point, but don’t be too brief when detail is called for. Research suggests the average length should be around 1,600 words, though feel free to vary as you see fit.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
I’ve always been one to create great content, but now I see it may not necessarily be the right content. Can Share Triggers work for all niches, including things like plumbing companies, computer repair, maybe even handymen that have a website for their business? I would estimate I'm getting half the monthly views I should. Hopefully some of these strategies will help.
I’d add one thing to number 5: Writing good copy is crucial not just for your Title/snippet, but for your whole page, especially your landing page. You want people to stay on your page for a while and (hopefully) even navigate to other pages you have. Google looks at bounce rate and where they go after they hit your page. Learning to write good copy can not only increase conversion (if you’re selling something) but make your content more impactful and engaging. There are free books at most libraries or online to help.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
Studies have shown that top placement in search engines generally provides a more favorable return on investment compared to traditional forms of advertising such as snail mail, radio commercials and television. Search engine optimization is the primary method of earning top-10 search engine placement. Learn more about the search engine optimization process and discuss an SEO strategy for your site when you contact a search engine specialist today.
SEO is short for "search engine optimization." To have your site optimized for the search engines means to attempt to have top placement in the results pages whenever a specific keyword is typed into the query box. There are many search engine optimization services to choose from, so here are some things to keep in mind when seeking SEO services or developing an SEO strategy of your own.

Yep and sometimes it’s just being a little creative. I’ve started a little blog on seo/wordpress just for fun actually… no great content on it like here though… but because the competition is so tough in these niches I decided to take another approach. I created a few WordPress plugins that users can download for free from wordpress.org… and of course these link to my site so this gets me visitors each day.


When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
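That convention can be sketched with Python's `urllib.parse`. The comparison below treats an empty path and "/" as the same homepage, while keeping "/fish" and "/fish/" distinct; the normalization rule is modeled on the convention described above, not on any official canonicalization algorithm:

```python
from urllib.parse import urlsplit

def same_document(url_a, url_b):
    """Compare two URLs under the convention described above: a bare
    hostname and hostname + "/" are one document, but a trailing slash
    on a path makes it a different URL. A sketch, not a full
    canonicalizer (ignores query strings, case, ports, etc.)."""
    a, b = urlsplit(url_a), urlsplit(url_b)
    path_a = a.path or "/"  # empty path on the homepage counts as "/"
    path_b = b.path or "/"
    return (a.scheme, a.netloc, path_a) == (b.scheme, b.netloc, path_b)

print(same_document("https://example.com", "https://example.com/"))            # True
print(same_document("https://example.com/fish", "https://example.com/fish/"))  # False
```

Because "/fish" and "/fish/" are different URLs, serving identical content on both without a redirect or canonical tag can split ranking signals between them.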
Thanks Jure. That actually makes sense. Exactly: I’ve tested lowering the number of tips in a few posts and it’s helped CTR/organic traffic. One thing to keep in mind is that the number can also be: the year, time (like how long it will take to find what someone needs), % (like 25% off) etc. It doesn’t have to be the number of tips, classified ads, etc.

Great article, Brian. Like that you’re finally talking about Domain Authority (DA). It’s essential to make skyscraper technique work as well. Also, a great pointer on comments as I have personally seen articles perform well because of comments. Do you recommend closing the comments as well a few days after the article is published? Kinda like Copyblogger does now.
Backlinks. If content is king, then backlinks are queen. Remember, it's not about which site has the most links, but which has the most quality links pointing back to its website. Build backlinks by submitting monthly or bi-monthly press releases on any exciting company news, and contacting popular blogs in your niche to see how you can work together to get a backlink from their website. Create the best possible product site you can, so people talking about the products you sell will link back. Try creating graphics or newsworthy content that will influence bloggers and news websites to link to that content.

Spider-driven search engines such as Google®, Yahoo!® and MSN® use "robots" or "crawlers" to score websites across the Internet. Robots "spider/crawl" each site and "score" pages based on how relevant they are. A website's score or placement within a spider-driven search engine is derived from hundreds of variables, such as link popularity, density and frequency of keywords in page content, HTML code, site themes and more. You will want to address many of these criteria in your SEO strategy to position yourself well among the major search engines.
Hi! I really found this article to be valuable and helpful for improving our SEO techniques. But I am just wondering about the dead links: does that mean we can contact those who have dead links and offer to recreate the page? How does that improve the SEO of my website? Can they add a citation or a thank-you section that links to our website?
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
Structured data is code that you can add to your sites' pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
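One common form of structured data is a JSON-LD block using the schema.org vocabulary, embedded in the page's HTML. The sketch below builds one for a product page; the product details are invented for illustration:

```python
import json

# A minimal schema.org Product description. Every value here is a
# made-up example; a real page would use its actual product data.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue Widget",
    "description": "A hand-made blue widget.",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
    },
}

# JSON-LD is embedded in a <script type="application/ld+json"> tag.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(product, indent=2)
)
print(snippet)
```

The resulting `<script>` block goes in the page's `<head>` or `<body>`; validating it with a structured-data testing tool before deploying is good practice.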
I’ve just taken the SEO role at my agency full time and, whilst it can be difficult at times, I am liking the challenge. I wonder if you had any suggestions when it came to finding “opportunity keywords” for term/subjects that don’t necessarily have massive search volumes associated to them? I use a few tools and utilise Google’s related terms already, but wondered if there were any tricks for finding new markets?
Social media. The algorithms have truly changed since social media first emerged. Many content websites are community-oriented -- Digg began allowing users to vote on which stories make the front page, and YouTube factors views and user ratings into its front-page rankings. Therefore, e-commerce stores must establish a strong social media presence on sites like Facebook, Pinterest, Twitter, etc. These social media sites send search engines signals of influence and authority.
Also make sure that your blog posts are consistent with one another and that each post has the same-sized images, headings and font. Always ensure that your blog post titles don’t lead your visitors astray.  This may seem obvious, but it happens more often than you’d think. For example, if your blog post is titled “The Top 10 Places to Hike in Southern California” but the post itself talks about hiking spots all throughout the entire state of California, you’re probably going to lose visitors. After all, it’s not what they had signed on for!
Building high-quality infographics is simply one form of clickbait. There are loads of clickbait examples that you can utilize, but not all of them are created equal. However, if you can build a great infographic that many people will link to, you can essentially create an automated marketing machine for your site. Ensure you hire a great designer to do this.
Brian, great post as always! Question: Do you consider authority sites (industry portals) a form of “influencer marketing?” e.g. guest blogging, etc? In some niches there are not so many individuals who are influencers (outside of journalists) but there are sites that those in the industry respect. I am in the digital video space and for me one site is actually a magazine that is building a very strong digital presence. Thanks, keep up the good work!

In the early days of the web, site owners could rank high in search engines by adding lots of search terms to web pages, whether they were relevant to the website or not. Search engines caught on and, over time, have refined their algorithms to favor high-quality content and sites. This means that SEO is now more complex than just adding the right words to your copy.