Hey Brian. Even though our own website has ranked consistently (for the last 3 years now) at Number 1 on Google for SEO Companies (obviously when searching from London UK or nearby, that is), I still keep reading other people’s posts and sending my own out when I find a gold nugget. However, within your clearly written article I have noticed multiple golden nuggets, and was very impressed by your ‘thinking outside the box’ approach and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉
Search engine optimisation, or SEO, has become a huge priority for marketers over the last few years. It’s easy to see why—higher search engine rankings result in more traffic, more leads, and higher sales and conversions. But how, exactly, does it work? How does adding keywords to various site elements improve your chances of ranking well in search engines?
Start browsing through articles in the same category as your content. Like the articles you genuinely like, and downvote the ones you’re not interested in. Do this for a few minutes every day. This step is very important – StumbleUpon uses the data to learn what kind of content you like. When you submit content, StumbleUpon will show it to other users who like the same kind of content. Act like your ideal reader, and that’s who StumbleUpon will share your content with.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
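As a rough sketch of that auto-generation step (illustrative only — the function name, the tag-stripping approach, and the 155-character limit are assumptions, not anything Google prescribes), a script might derive a fallback description from each page's opening text:

```python
import re

def meta_description(page_html, max_len=155):
    """Build a fallback meta description from a page's body text.

    Illustrative only: a real pipeline would use a proper HTML parser
    and skip pages that already have a hand-written description.
    """
    # Strip tags and collapse whitespace to get plain text.
    text = re.sub(r"<[^>]+>", " ", page_html)
    text = re.sub(r"\s+", " ", text).strip()
    if len(text) <= max_len:
        return text
    # Cut at the last word boundary before the limit, add an ellipsis.
    cut = text[:max_len].rsplit(" ", 1)[0]
    return cut + "…"

print(meta_description("<p>Our   guide to   baking sourdough at home.</p>"))
```

Each generated string would then be emitted into that page's `<meta name="description" content="…">` tag, with hand-written descriptions taking priority wherever they exist.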
Thanks Brian, these tips are useful. The key thing with most of the tips that you provided is that they take time, and most people want more traffic but don’t want to do the work and put in the time. However, if you put in the work and you do a quality job, then it will work out. I think that’s the overall approach a lot of SEOs have to take today: put in the time and figure out quality strategies.
So many businesses are focused on attracting new customers through content marketing that they forget about more traditional methods. Email marketing can be a powerful tool, and even a moderately successful email blast can result in a significant uptick in traffic. Just be careful not to bombard people with relentless emails about every single update in your business. Also, don’t overlook the power of word-of-mouth marketing, especially from people who are already enjoying your products or services. A friendly email reminder about a new service or product can help you boost your traffic, too.
Commenting on blog posts written by industry experts with lots of followers can bring your website a lot of traffic. When you post a comment, (most) blogs allow you to leave a link back to your site for other readers to check out – as long as you leave an insightful comment, you WILL get traffic from your blog comments. Make sure you comment as quickly as possible when new blog posts go up. The higher in the comments you are, the more clicks you’ll get. I have Google Reader set up to alert me when new blog posts are made on the industry blogs I follow, and I comment immediately to lock in my first-place spot.
In this excellent post, SEO and Digital Trends in 2017, Gianluca Fiorelli writes, "In a mobile-only world, the relevance of local search is even higher. This seems to be the strategic reason both for an update like Possum and all the tests we see in local, and also of the acquisition of a company like Urban Engines, whose purpose is to analyze the 'Internet of Moving Things.'"
The intent behind “SEO agency” is obvious… The searcher is looking for an SEO agency. Most of these searchers aren’t looking for life lessons from an SEO agency owner. Instead, they are just looking for the best SEO agency to get them more traffic and customers from Google. Plain and simple. I knew this when I created that page, but my SEO ego was too big.
Sending out regular newsletters and promoting offers through email is a great way to stay in touch with your customers and can also help to get traffic to your website. Provide useful information and links to pages on your website where they can learn more, such as through blog posts and landing pages for particular offers. Just make sure that you don't continually bombard your readers with emails or your customers will either disengage with, delete, or unsubscribe from your emails.
Hey Ted, thanks for the great questions! The peak times refer to your particular time zone, if you are targeting an audience that resides in the same zone as you. You can also use tools to find out when most of your audience is online. For example, Facebook has this built into their Page Insights. For Twitter, you can use https://followerwonk.com/. Many social posting tools also offer this functionality.
SEMrush has a relatively new feature that allows you to quickly see the highest-trafficked pages for a given domain. It’s a bit buried, so it can be easy to miss, but it’s a no-brainer shortcut to quickly unveil the topics with massive traffic. Unfortunately it doesn’t immediately give you traffic or traffic cost, but one extra step will solve that for you.
Relevancy is the first qualifier of a quality link opportunity. The next qualifying factor is the authority of the opportunity. Since Google doesn’t update PageRank (PR) anymore, you must rely on third-party metrics. I recommend you use Domain Authority (DA) from Moz’s Open Site Explorer, Domain Rating (DR) from Ahrefs, or Trust Flow from Majestic to determine the quality of your link opportunities. You should use all three tools if you can.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
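A minimal example of the crawl-blocking mechanism described above (the domain and paths are hypothetical, chosen to match the shopping-cart and internal-search cases mentioned):

```
# https://example.com/robots.txt
User-agent: *
Disallow: /cart/
Disallow: /search/
```

And for explicit index exclusion, the robots meta tag goes in the <head> of the individual page:

```
<meta name="robots" content="noindex">
```

Note the two mechanisms differ: robots.txt asks crawlers not to fetch the pages at all, while the noindex meta tag lets the page be crawled but asks that it be kept out of the index.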