Yesterday I was redoing our process for ideas, and Alltop was a part of it. I’ve known for a while that it was a bit spammy (some of my grey sites are featured), but now it seems way too bad. You have places like the New York Times next to random AdSense blog X. Guy Kawasaki really needs to start giving some sort of influence ranking, or at least culling the total crap ones.

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
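As a rough illustration of that last point, here is a minimal sketch of how auto-generated descriptions might work. The length cutoff and helper name are assumptions for the example, not anything Google prescribes:

```python
import re

MAX_LEN = 155  # common rule of thumb for description length; not an official limit

def generate_description(page_text: str) -> str:
    """Build a description meta tag value from a page's body text."""
    # Collapse whitespace and strip stray spacing from the source text.
    text = re.sub(r"\s+", " ", page_text).strip()
    # Cut at the last full word that fits within the length budget.
    if len(text) > MAX_LEN:
        text = text[:MAX_LEN].rsplit(" ", 1)[0] + "…"
    return text

# Hypothetical page body, purely for illustration.
page_body = ("Browse thousands of local classified ads for cars, jobs, and housing. "
             "Post your own ad for free and reach buyers in minutes.")
print(f'<meta name="description" content="{generate_description(page_body)}">')
```

A template like this at least guarantees every page gets a unique, readable description, even if a hand-written one would be better.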


Brian, I’ve drunk your Kool-Aid! Thank you for your honesty and transparency – it really gives me hope. Quick question: I am beyond passionate about a niche (UFOs, extraterrestrials, free energy) and know in my bones that an authority site is a long-term opportunity. The problem today is that not many products are attached to this niche, so it becomes a subscriber/info-product play. However, after 25+ years as an entrepreneur with a financial background and a marketing MBA, am I Internet-naive to believe that my passion and creativity will win profitability in the end? The target audience is highly passionate too. Feedback?

Well, the age of print media is coming to a close, but there’s no reason why some enterprising blogger couldn’t use the same tactic to get new subscribers. Let’s say you have a lifestyle blog targeting people in San Francisco. You could promote the giveaway through local media, posters, and many other tactics (we’ll get into these methods shortly).


Whatever industry you’re in, chances are there are at least one or two major conventions and conferences that are relevant to your business. Attending these events is a good idea – speaking at them is even better. Even a halfway decent speaking engagement is an excellent way to establish yourself as a thought leader in your industry and gain significant exposure for your site.
Great post – your knowledge and innovative approach never fail to amaze me! This is certainly the first time I’ve heard someone suggest the Wikipedia dead-link technique. It’s great that you’re getting people to think outside the box. Sites like Reddit are great for finding keywords and can also be used for link building, although this can be difficult to get right. Even if you don’t succeed at using it for link building, it’s still a really valuable platform for gathering useful information. Thanks!
“Link bait” refers to content that is so useful or entertaining that it compels people to link to it. Put yourself in the shoes of your target demographic and figure out what they would enjoy or what would help them the most. Is there a tool you could build to automate some tedious process? Can you find enough data to make an interesting infographic? Is there a checklist or cheat sheet that would prove handy to your audience? The possibilities are nearly endless – survey your visitors, see what is missing in your industry, and fill in the gaps.
There are many times when you include a short quote or phrase in your blog post that you believe people would love to tweet. ClickToTweet helps you do just that. Simply create a pre-made tweet on ClickToTweet.com, generate a unique link, and put it on your website so that people can click it to tweet it. Sounds simple? It is, and it is one of the most popular strategies for generating buzz on Twitter.
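ClickToTweet’s service handles all of this for you, but the underlying mechanism is just a pre-filled share URL. For the curious, here is a sketch using Twitter’s web intent endpoint; the quote and handle below are made up for illustration:

```python
from urllib.parse import quote

def click_to_tweet_link(quote_text, via=""):
    """Build a pre-filled tweet link using Twitter's web intent URL."""
    url = "https://twitter.com/intent/tweet?text=" + quote(quote_text)
    if via:
        url += "&via=" + quote(via)  # appends "via @handle" to the tweet
    return url

# Hypothetical quote and handle, purely for illustration.
print(click_to_tweet_link("Link bait is content too useful not to share.", via="example_blog"))
```

Clicking the resulting link opens Twitter’s compose window with the quote already filled in, which is exactly the friction-free behavior that makes the tactic work.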
Not sure exactly why – perhaps I used a number that was too big, and since my page is about classifieds, it probably seemed like too much to browse through 1,500 ads, I assume? Somewhat like posting 800 tips for better ranking. I don’t know; I’ll try changing things a bit and see how it goes, but you really gave me some new suggestions to go for with this article. Thanks again 🙂
Let's just say that out of the 200 clicks, you received 3 sales, which were tracked with a Facebook conversion pixel. Those 3 sales resulted in $800 in revenue, so your $100 investment just drove $800 in sales. Now, this is simply a generic example, but when you know how to track your ads or other marketing efforts, you'll know what's paying off and what's not.
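To make the arithmetic concrete, here is a small sketch that computes the usual ad metrics from the numbers in this example. The figures come straight from the text above; the metric names are industry conventions, not anything specific to Facebook's tooling:

```python
spend = 100.00      # ad spend in dollars
clicks = 200        # clicks the campaign generated
sales = 3           # conversions tracked by the pixel
revenue = 800.00    # revenue from those conversions

cpc = spend / clicks               # cost per click: $0.50
conversion_rate = sales / clicks   # 1.5% of clicks converted
cpa = spend / sales                # cost per acquisition: ~$33.33
roas = revenue / spend             # return on ad spend: 8x

print(f"CPC ${cpc:.2f} | conv. {conversion_rate:.1%} | CPA ${cpa:.2f} | ROAS {roas:.1f}x")
```

An 8x return on ad spend is the headline number, but CPA is the one to watch: if your cost per acquisition ever climbs above your margin per sale, the campaign is losing money no matter how cheap the clicks are.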
Relevancy is the first qualifier of a quality link opportunity. The next qualifying factor is the authority of the opportunity. Since Google no longer updates public PageRank (PR), you must rely on third-party metrics. I recommend you use Domain Authority (DA) from Open Site Explorer, Domain Rating (DR) from Ahrefs, or Trust Flow from Majestic to determine the quality of your link opportunities. Use all three tools if you can.
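If you do pull all three metrics, one simple way to compare prospects is to average them, since DA, DR, and Trust Flow all run on a 0–100 scale. This is a hypothetical scoring convention for illustration, not something any of these tools prescribe:

```python
def authority_score(da, dr, trust_flow):
    """Average three 0-100 authority metrics into one rough composite score."""
    for name, value in (("DA", da), ("DR", dr), ("Trust Flow", trust_flow)):
        if not 0 <= value <= 100:
            raise ValueError(f"{name} must be between 0 and 100, got {value}")
    return (da + dr + trust_flow) / 3

# Hypothetical metrics for one link prospect, pulled from the three tools.
print(f"Composite authority: {authority_score(da=45, dr=52, trust_flow=38):.1f}")
```

The point is not the exact formula but consistency: score every prospect the same way and the strongest opportunities sort themselves to the top.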

YouTube is a great resource for driving free organic traffic to your website. Google loves YouTube, and considering that it's the second most popular search engine in the world, gaining exposure there can be huge. Create useful tutorials and videos that add an immense amount of value, and be sure to link to your content in the description.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
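For illustration, a minimal robots.txt along these lines might look like the following; the paths are made-up examples, and real directives should match your site's actual URL structure:

```
User-agent: *
Disallow: /cart/       # shopping-cart pages (hypothetical path)
Disallow: /search/     # internal search results (hypothetical path)
Disallow: /account/    # user-specific content (hypothetical path)
```

Note that robots.txt only asks crawlers not to fetch these URLs; to keep an already-discovered page out of the index itself, the noindex robots meta tag mentioned above is the more reliable signal.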