Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags18 and better snippets for your users19. We also have a handy Help Center article on how to create good titles and snippets20.
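As a sketch, a description meta tag sits in the page's `<head>`; the site name and summary text below are placeholders, not a recommendation of specific wording:

```html
<html>
  <head>
    <title>Brandon's Baseball Cards - Buy Cards, Baseball News, Card Prices</title>
    <!-- A unique, human-readable summary of this specific page;
         Google may show this as the snippet in search results -->
    <meta name="description" content="Brandon's Baseball Cards provides a
      large selection of vintage and modern baseball cards for sale.
      We also offer daily baseball news and events.">
  </head>
</html>
```

Each page should get its own description rather than one boilerplate string reused site-wide.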
Look at your short- and long-term goals to choose whether to focus on organic or paid search (or both). It takes time to improve your organic search rankings, but you can launch a paid search campaign tomorrow. However, there are other considerations: the amount of traffic you need, your budget, and your marketing objectives. Once you’ve reviewed the pros and cons, you can select the search strategy that’s right for you.
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup28 when showing breadcrumbs.
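One common way to express breadcrumb structured data is a JSON-LD `BreadcrumbList` block in the page; the names and URLs here are hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Books",
      "item": "https://example.com/books"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Science Fiction",
      "item": "https://example.com/books/sciencefiction"
    }
  ]
}
</script>
```

The `position` values run from the most general page (leftmost link) to the most specific, mirroring the visible breadcrumb trail.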
Social networking websites are based on building virtual communities that allow consumers to express their needs, wants, and values online. Social media marketing then connects these consumers and audiences to businesses that share the same needs, wants, and values. Through social networking sites, companies can keep in touch with individual followers. This personal interaction can instill a feeling of loyalty in followers and potential customers. Also, because users choose whom to follow on these sites, products can reach a very narrow target audience. Social networking sites also contain a great deal of information about what products and services prospective clients might be interested in. Through the use of new semantic analysis technologies, marketers can detect buying signals, such as content shared by people and questions posted online. An understanding of buying signals can help salespeople target relevant prospects and help marketers run micro-targeted campaigns.

Disney/Pixar's Monsters University created a Tumblr account, MUGrumblr, presented as being maintained by a "Monstropolis transplant" and "self-diagnosed coffee addict" who is currently a sophomore at Monsters University. This "student" uploaded memes, animated GIFs, and Instagram-like photos related to the sequel movie.
Another case where the "nofollow" attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the "nofollow" attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.
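A minimal sketch of what this looks like in a widget's embed snippet; the widget markup and provider URL are hypothetical:

```html
<!-- Hypothetical third-party widget embed: the attribution link carries
     rel="nofollow" so the embedding site does not pass ranking credit
     to a link it did not editorially choose -->
<div class="example-weather-widget">
  <p>Today's forecast: sunny, 24°C</p>
  <p>Powered by
    <a href="https://widget-provider.example.com/" rel="nofollow">Example Widgets</a>
  </p>
</div>
```

If you distribute a widget yourself, shipping the default snippet with `rel="nofollow"` already applied saves embedding sites from having to add it.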
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of pages of the site that are indexed by search engines (saturation) and how many backlinks the site has (popularity). Achieving this presence requires that pages contain the keywords people are searching for and that they rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density toward a more holistic process for scoring semantic signals. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.