In traditional marketing you define a target audience for your business or organisation. Internet marketing works the same way. Unfortunately, as the Internet has grown in popularity in recent years and ever more people have built sites, a certain part of the online audience has been overlooked.
Search engine optimization, or SEO, is very important for getting your website listed in search engines. Even if this is the first website you have ever built, a few basic and easy steps will help you optimize it without being a pro.
White hat SEOs legitimately optimise web pages so that a site appears appropriately in the search engine results pages (also called SERPs). Black hat SEOs use tricks to rank high in the results pages even when the site is not necessarily relevant, and the range of tricks is astonishing. Most of the techniques used by white hat SEOs, however, are similar if not identical to the guidelines given by accessibility experts.
This guide is designed to cover all areas of search engine optimization - from discovering the terms and phrases that will generate traffic, to making a site search engine friendly, to building links and marketing the unique value of the site or organization's offerings.
Desperation, ignorance, and a moral compass that doesn’t point due north often get perfectly logical, good people and companies in trouble with search engines. Because being listed high in search results is such a desirable goal to attain, many people search for shortcuts to the front of the line—which can land them in serious trouble.
The only effective way to promote a website is by hosting unique, quality content. Search engine optimization and paid inclusions are a waste of time and money if there isn't a compelling reason for your visitors to come back once they have found you.
Over the past year we have worked with a number of organizations that have chosen to relocate their sites from an existing domain to a new domain. One of the questions that always comes up early in the process is “how much traffic are we going to lose?” It is an excellent question and not an easy one to answer, but in today’s column I am going to explore that exact question.
I don’t “really” know anything about SEO. What I do know is the folks at Google and other big search engines are just human beings like us who have created and constantly tweak the search algorithms. Their goal is to give us what we want when searching, the best possible websites relevant to what we are searching for. So let’s set aside all the fancy technical stuff and just use some good ol’ common sense.
Updating Google's massive indexes is not a smooth affair by any means. Notably, during the updating process old indexes do not simply yield to new ones; the transition involves quite a haphazard movement. It takes Google a couple of days to complete an update, and during this period both old and new indexes appear on www.google.com, alternately or even unpredictably, before the new indexes stabilize for all to see. The fluctuations seen on Google during the transition from old indexes to new make it seem as if Google were dancing; hence, in SEO parlance, the term "Google Dance". The varying indexes influence the final rankings once the PageRank calculation kicks in, so the fluctuating rankings of your site should not be a cause for concern while Google is dancing. Wait for Google to come to a halt and you will see everything stabilize.
There are many reasons to use mod_rewrite to create informative, useful URLs for your website. Most dynamic websites use some form of PHP or ASP to pull data from a database, and oftentimes pass that data in the URL as a query string. This is not only a potential security flaw; it also gives users and search engines alike a very uninformative destination for your website.
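As a minimal sketch of the idea, assuming an Apache server with mod_rewrite enabled and a hypothetical `product.php` script (the file name, path pattern, and `slug` parameter are invented for illustration), a rule like the following maps a readable URL onto the real query string:

```apache
# .htaccess sketch - script name and parameter are assumptions, not a fixed recipe
RewriteEngine On
# Serve /products/blue-widget from the underlying dynamic script, so visitors
# and search engines see a descriptive URL instead of a raw ID string.
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]
```

The `[L,QSA]` flags stop further rule processing and append any existing query string, so tracking parameters added by referrers are not lost.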
On the desktop Web, ecommerce landing pages get a bum rap—sometimes well deserved. Laden with ads and gimmicks, pushing items with higher markups, and confusing customers with complicated information architectures, these marketing monstrosities typically strongly underperform the search results pages from a simple keyword search. However, passing a death sentence on all landing pages may be premature. On the small screens of mobile devices, well-designed landing pages can provide a much better experience than keyword search results. Currently, few mobile sites use landing pages, which makes them the next big mobile ecommerce opportunity.
Small websites get less traffic than big ones, but they can still dominate their niches. For each question users ask, the Web delivers a different set of sites to provide the answers.
We consider the problem of DUST: Different URLs with Similar Text. Such duplicate URLs are prevalent in web sites, as web server software often uses aliases and redirections, and dynamically generates the same page from various different URL requests. We present a novel algorithm, DustBuster, for uncovering DUST; that is, for discovering rules that transform a given URL to others that are likely to have similar content. DustBuster mines DUST effectively from previous crawl logs or web server logs, without examining page contents. Verifying these rules via sampling requires fetching only a few actual web pages. Search engines can benefit from information about DUST to increase the effectiveness of crawling, reduce indexing overhead, and improve the quality of popularity statistics such as PageRank.
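The notion of a dust rule can be illustrated with a small sketch. This is not the paper's mining algorithm, just a hedged Python illustration of what applying a discovered substring-substitution rule to URLs looks like; the rules and URLs below are invented examples:

```python
# Hedged illustration of "dust rules": substring substitutions that map a URL
# to another URL likely to serve similar content. The rules and URLs are
# invented examples, not output of the DustBuster algorithm itself.

def apply_rule(url: str, rule: tuple[str, str]) -> str:
    """Apply one dust rule (old substring -> new substring) to a URL."""
    old, new = rule
    return url.replace(old, new, 1)

# Example rules of the kind that might be mined from crawl or server logs:
rules = [
    ("/story_", "/story?id="),                                  # page alias
    ("http://news.example.com/", "http://example.com/news/"),   # host alias
]

def canonicalize(url: str, rules) -> str:
    """Apply every matching rule once to reach a canonical form."""
    for rule in rules:
        if rule[0] in url:
            url = apply_rule(url, rule)
    return url

print(canonicalize("http://news.example.com/story_1234", rules))
# -> http://example.com/news/story?id=1234
```

Collapsing duplicate URLs this way before fetching is what lets a crawler skip pages it has effectively already seen.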
Shorter topics do put more small targets in the field, so the user has a higher chance of hitting one of the targets, but it's unlikely that any of them will provide the exact answer the user is looking for.
A findability strategy cheat sheet that will guide you through everything you should be doing when creating new websites or even redesigning existing ones.
Website visitors do not arrive magically… they follow recommendations from others, such as links, display ads, or even offline word of mouth. As 2007 turns into 2008, here are 5 easy ways to substantially increase the amount of traffic coming to your site.
You can communicate information about your site to search engines and see your site from their perspective using some free services and utilities from Yahoo! and Google.
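One common way to communicate with these services (such as Google's sitemap submission tools and Yahoo! Site Explorer) is an XML sitemap. As a minimal sketch, with placeholder URLs, dates, and priorities:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; the URL, date, and priority are placeholder values -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-11-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Placing a file like this at the site root and submitting its URL tells the engines which pages exist and roughly how often they change.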
The Google Sandbox is a filter that was put in place around March 2004. New websites with new domain names can take 6 to 12 months to get decent rankings on Google, and some are reporting stays of up to 18 months. The Sandbox seems to affect nearly all new websites, placing them on probation; similarly, websites that have undergone comprehensive redesigns have been caught up in it. Does this Sandbox really exist, or is it just part of the Google algorithm? This has been a big controversy with many different opinions, though most now believe it is part of the algorithm. In either case, the Sandbox functions to keep new sites from shooting to the top of Google in just a few weeks and overtaking quality sites that have been around for many years. It appears to be an initiation period for new websites.
Google's increasing use of anti-spam features means that optimising websites for Google has become much harder. It is no longer just a case of opening your website's source files in Notepad, adding some keywords to your various HTML tags, uploading your files, and waiting for the results. In fact, in my opinion (and I'm sure others will agree), this type of optimisation, commonly referred to as on-page optimisation, will only ever be 20% effective at achieving rankings for any keywords that are even mildly competitive. Those of us who aced maths in school will know this leaves 80% unaccounted for.
Welcome to Google's Search Engine Optimization Starter Guide. This document first began as an effort to help teams within Google, but we thought it'd be just as useful to webmasters who are new to the topic of search engine optimization and wish to improve their sites' interaction with both users and search engines. Although this guide won't tell you any secrets that'll automatically rank your site first for queries in Google (sorry!), following the best practices outlined below will make it easier for search engines to both crawl and index your content.
While many niches depend on PPC search traffic, there is a wide group of sites that benefit from traffic bought from individual sites. Oftentimes you can get very high quality traffic that converts very well from niches that tend to deal in direct site-to-site traffic deals rather than third-party ad networks. This guide is mostly for buying traffic from forums, from individual websites, and from "plug"-type packages, yet many of its points carry over to more traditional PPC outlets.