If you're uploading an image to a blog post, for example, check its file size first. If it's anywhere in megabyte (MB) territory, even just 1 MB, it's a good idea to run it through an image compression tool before uploading it to your blog. Sites like TinyPNG make it easy to compress images in bulk, while Google's Squoosh can shrink image file sizes dramatically.
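As a minimal sketch of that first check, here is a small helper that flags files at or above a size threshold before upload (the `needs_compression` function name and the 1 MB default are illustrative, not from any tool mentioned above):

```python
import os

MB = 1024 * 1024  # bytes in one megabyte

def needs_compression(path, limit_bytes=MB):
    """Return True if the file at `path` is at or above the size limit
    and should be run through an image compressor before uploading."""
    return os.path.getsize(path) >= limit_bytes
```

You would call this on each image in your upload queue and only send the oversized ones to a compressor.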
Use clean backgrounds. The background texture and color you choose can drastically affect the overall appeal of the website. Heavy texture and graphics in the background are distracting. If you use a background color, make sure there is significant contrast between it and the text. Be careful with very bright or very dark colors such as red or yellow: they cause visual fatigue, and readers will lose their focus on the text.
Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
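To make the hierarchy idea concrete, this sketch uses Python's standard `html.parser` to pull the heading outline out of a page, so you can see the document structure that ordered heading tags create (the `HeadingOutline` class is a hypothetical example, not a real library):

```python
from html.parser import HTMLParser

HEADINGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

class HeadingOutline(HTMLParser):
    """Collects (level, text) pairs for each heading tag, in document order."""
    def __init__(self):
        super().__init__()
        self.outline = []
        self._level = None  # heading level we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self._level = int(tag[1])  # "h2" -> 2

    def handle_data(self, data):
        if self._level is not None and data.strip():
            self.outline.append((self._level, data.strip()))

    def handle_endtag(self, tag):
        if tag in HEADINGS:
            self._level = None

parser = HeadingOutline()
parser.feed("<h1>Guide</h1><h2>Setup</h2><h2>Usage</h2><h3>Options</h3>")
print(parser.outline)  # [(1, 'Guide'), (2, 'Setup'), (2, 'Usage'), (3, 'Options')]
```

A well-structured page produces an outline where levels step down one at a time, mirroring how users scan the content.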
In addition to optimizing these six areas of your site, analyze your competitors and see what they are doing in terms of on-page optimization, off-page optimization (competitive link analysis) and social media. While you may be doing a lot of the same things they are, it’s incredibly important to think outside the box to get a leg up over the competition.

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
Of course, we are always thinking about cost/value/likelihood we can upgrade the best content in the vertical—it is almost always the case that the low competition content, although lower benefit, also doesn’t need the same content quality the high competition terms do, so we can sometimes capture more benefit at a faster velocity by hitting those terms earlier.

“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”

Link text is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
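As a rough illustration of auditing anchor text, the sketch below collects each link's visible text and classifies it as internal or external by comparing hosts (the `LinkAudit` class and `example.com` domain are made up for the example; relative URLs are treated as internal):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAudit(HTMLParser):
    """Records (anchor_text, href, is_internal) for every <a> tag."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            host = urlparse(self._href).netloc
            internal = host in ("", self.site_host)  # no host means a relative (internal) link
            self.links.append(("".join(self._text).strip(), self._href, internal))
            self._href = None

audit = LinkAudit("example.com")
audit.feed('<a href="/pricing">Our pricing</a> <a href="https://other.org/x">guide</a>')
print(audit.links)
```

Running an audit like this makes it easy to spot links whose anchor text ("click here", bare URLs) tells users and Google nothing about the destination page.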


Search engines find and catalog web pages through spidering (also known as web crawling) software. Spidering software "crawls" through the internet and collects information from websites, which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
Building high-quality infographics is simply one form of clickbait. There are loads of clickbait examples that you can utilize, but not all of them are created equal. However, if you can build a great infographic that many people will link to, you can essentially create an automated marketing machine for your site. Ensure you hire a great designer to do this.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
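The "well-behaved crawlers only" point can be demonstrated with Python's standard `urllib.robotparser`: the rules are advisory, and it is entirely up to the client to consult them (the sample rules and URLs below are invented for the example):

```python
from urllib import robotparser

# A robots.txt that disallows a "private" directory for all crawlers.
rules = """
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A polite crawler checks before fetching; nothing stops a rogue client
# or a curious person from requesting the blocked URL directly.
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))     # True
```

Note that `can_fetch` only reports what the rules say; the server will still happily serve `/private/report.html` to any browser that asks, which is exactly why robots.txt is not an access-control mechanism.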

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
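The spider/indexer split described above can be sketched in miniature: given a downloaded page, a toy indexer extracts the words it contains and the links it points to (the `PageIndexer` class and sample HTML are illustrative only, and real indexers also record positions and weights):

```python
from html.parser import HTMLParser
from collections import Counter

class PageIndexer(HTMLParser):
    """Toy indexer: counts words in text content and collects outgoing
    links, roughly mirroring the spider/indexer split described above."""
    def __init__(self):
        super().__init__()
        self.words = Counter()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        # Keep only purely alphabetic tokens, lowercased.
        self.words.update(w.lower() for w in data.split() if w.isalpha())

page = '<h1>Search basics</h1><p>Search engines index pages.</p><a href="/crawl">crawl</a>'
idx = PageIndexer()
idx.feed(page)
print(idx.words.most_common(1))  # [('search', 2)]
print(idx.links)                 # ['/crawl']
```

The extracted links are what the scheduler would queue for crawling later; the word counts are the raw material for ranking.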
This was very interesting. I run a website that promotes sports entertainment amongst teenagers who are graphic designers or video editors. The foundation is in place (over 60 contributors), so my only focus is how to blog consistently about what goes on in the sports world with appeal to teenagers. I am confident I took a huge step today after learning these 4 steps!
Btw, I was always under the impression that Digg and Delicious were dying, but I'm really mistaken. Your (and Jason's) thinking is foolproof though. If these guys are already curating content, there's no reason they wouldn't want to do more of just that! SEO has become a lot of chasing and pestering… it's good of you to remind us that there are people out there just waiting to share stuff, too. :)
Hi! I really found this article valuable and helpful for improving our SEO techniques. But I am just wondering about the dead links: does that mean we can contact those who have dead links and ask them to recreate the page? How does that improve the SEO of my website? Could they add a citation or a thank-you section that links to our website?

Tip: Along with Delicious, I search on Scoop.it for similar opportunities. If they liked an article tied to a year, say 2013, and you update the resource to 2014, chances are they'll share it. Kind of a twist on your Delicious + Skyscraper technique. You don't even have to make the content much different or better, just updated! Got some fantastic links recently because of it.

Medium is one of my go-to platforms for marketing my content and provides another authority-site domain that gives you the versatility of link-dropping the way that you normally would through any blog post on a CMS like WordPress. Leverage Medium to create intuitive content marketing posts that also link back to your primary posts on your site or blog.


All the products are the property of MyThemeShop so you may not claim ownership (intellectual or exclusive) over any of our products, modified or unmodified. Our products come 'as is', without any kind of warranty, either expressed or implied. Under no circumstances can our juridical person be accountable for any damages including, but not limited to, direct, indirect, special, incidental or consequential damages or other losses originating from the employment of or incapacity to use our products.
Thanks Brian for your article. I am in the healthy living niche. I want to team up with bloggers in my own niche so we can share material; that makes sense to me. But I have my own unique message, and that is what I have been devoted to! Dah! I see now that my focus should be on what is popular among my peers and add to this. I think I'm finally getting the picture! I am specifically into FOOD MEDICINE, so perhaps I should start writing about the dangers of a gluten-free diet! Not for everyone!

You hereby indemnify Us and undertake to keep Us indemnified against any losses, damages, costs, liabilities and expenses (including, without limitation, legal expenses and any amounts paid by Us to a third party in settlement of a claim or dispute on the advice of Our legal advisers) incurred or suffered by Us arising out of any breach by You of any provision of these terms of use.

In our research with what we have done for ourselves and our clients, there is a definite correlation between content longer than 1,000 words and better rankings. In fact, we are finding amazing ranking jumps when you have content over 3,000 words, about 12 original images (images not found anywhere else online), 1 H1 (not keyword stuffed), 12 sub-headlines (H2), 12 relevant internal links, 6 relevant external links and 1 bullet list. I know it sounds like a lot of work and a Big Mac recipe, but this does work.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt with Disallow: /*.html, but it's been almost a year and I've found that Google's robot still often crawls these pages. How can I quickly get Google to completely remove them? I have also removed these URLs in Google Webmaster Tools via Google Index → Remove URLs, but Google still crawls these pages.
Google Analytics is free to use, and the insights gleaned from it can help you to drive further traffic to your website. Use tracked links for your marketing campaigns and regularly check your website analytics. This will enable you to identify which strategies and types of content work, which ones need improvement, and which ones you should not waste your time on.
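A "tracked link" here usually means appending standard UTM query parameters so Google Analytics can attribute the visit to a campaign. A minimal sketch using only the standard library (the `add_utm` helper and the example URL and campaign names are hypothetical):

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_utm(url, source, medium, campaign):
    """Append the standard utm_* query parameters to a URL so Google
    Analytics can attribute the resulting visit to a campaign."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # preserve any existing parameters
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_utm("https://example.com/post", "newsletter", "email", "spring_launch"))
```

Generating campaign links programmatically keeps the parameter names consistent, which is what makes the reports in Analytics comparable across campaigns.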
Smartphone - In this document, "mobile" or “mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.


The Services are created and controlled by MyThemeShop in the State of Illinois, U.S.A. You agree that these Terms of Use will be governed by and construed in accordance with the laws of the United States of America and the State of Illinois, without regard to its conflicts of law provisions. Use of the Services is unauthorized in any jurisdiction that does not give effect to all provisions of these Terms of Use. MyThemeShop, LLC makes no claims or assurances that the Services are appropriate or may be downloaded outside of the United States. You agree that all legal proceedings arising out of or in connection with these Terms of Use or the Services must be filed in a federal or state court located in Libertyville, Illinois, within one year of the time in which the events giving rise to such claim began, or your claim will be forever waived and barred. You expressly submit to the exclusive jurisdiction of said courts and consent to extraterritorial service of process.

