Sponsored radar – Radar picks up exceptional posts from the whole Tumblr community based on their originality and creativity. It is placed on the right side next to the Dashboard, and it typically earns 120 million daily impressions. Sponsored radar allows advertisers to place their posts there, giving them an opportunity to earn new followers, reblogs, and likes.
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
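One common approach to the automatic generation mentioned above is to take the first sentence or so of a page's visible text and emit it as the description meta tag. The sketch below is a minimal, illustrative example of this idea using only Python's standard library; the function names (`TextExtractor`, `make_description`) and the 155-character limit are assumptions, not part of any official guideline.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect a page's visible text, skipping script/style content."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside <script>/<style> elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def make_description(page_html, max_len=155):
    """Build a description meta tag from the page's visible text,
    truncating at a word boundary near max_len characters."""
    parser = TextExtractor()
    parser.feed(page_html)
    # Collapse whitespace runs left over from the markup.
    text = " ".join(" ".join(parser.chunks).split())
    if len(text) > max_len:
        text = text[:max_len].rsplit(" ", 1)[0] + "…"
    return '<meta name="description" content="%s">' % text.replace('"', "&quot;")
```

A template engine or CMS plugin would typically run something like this once per page at publish time, so each page still ends up with a distinct description even when hand-crafting is infeasible.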
While most of the links to your site will be added gradually, as people discover your content through search or other ways and link to it, Google understands that you'd like to let others know about the hard work you've put into your content. Effectively promoting your new content will lead to faster discovery by those who are interested in the same subject. As with most points covered in this document, taking these recommendations to an extreme could actually harm the reputation of your site.
In May 2014, Instagram had over 200 million users. The user engagement rate of Instagram was 15 times higher than that of Facebook and 25 times higher than that of Twitter. According to Scott Galloway, the founder of L2 and a professor of marketing at New York University's Stern School of Business, the latest studies estimate that 93% of prestige brands have an active presence on Instagram and include it in their marketing mix. When it comes to brands and businesses, Instagram's goal is to help companies reach their respective audiences through captivating imagery in a rich, visual environment. Moreover, Instagram provides a platform where users and companies can communicate publicly and directly, making it an ideal platform for companies to connect with their current and potential customers.
Facebook pages are far more detailed than Twitter accounts. They allow a product to provide videos, photos, longer descriptions, and testimonials, and followers can comment on the product pages for others to see. Facebook can link back to the product's Twitter page, as well as send out event reminders. As of May 2015, 93% of business marketers use Facebook to promote their brand. A study from 2011 attributed 84% of "engagement" (clicks and likes that link back) to Facebook advertising. By 2014, Facebook had restricted the content published from business and brand pages. Adjustments in Facebook algorithms have reduced the audience for non-paying business pages (that have at least 500,000 "Likes") from 16% in 2012 down to 2% in February 2014.
Websites such as Delicious, Digg, Slashdot, Diigo, Stumbleupon, and Reddit are popular social bookmarking sites used in social media promotion. Each of these sites is dedicated to the collection, curation, and organization of links to other websites that users deem to be of good quality. This process is "crowdsourced", allowing amateur social media network members to sort and prioritize links by relevance and general category. Due to the large user bases of these websites, any link from one of them to another, smaller website may result in a flash crowd: a sudden surge of interest in the target website. In addition to user-generated promotion, these sites also offer advertisements within individual user communities and categories. Because ads can be placed in designated communities with a very specific target audience and demographic, they have far greater potential for traffic generation than ads selected simply through cookie and browser history. Additionally, some of these websites have also implemented measures to make ads more relevant to users by allowing users to vote on which ones will be shown on pages they frequent. The ability to redirect large volumes of web traffic and target specific, relevant audiences makes social bookmarking sites a valuable asset for social media marketers.
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.
Another way search engine marketing is managed is through contextual advertising. Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads come into the field of vision of browsers who are seeking information from those sites. A successful SEM plan captures the relationships among information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but over recent years the use of search engines for accessing information has become vital to increasing business opportunities. The use of SEM strategic tools by businesses such as tourism operators can attract potential consumers to view their products, but it can also pose various challenges, such as competition within the industry and other sources of information that could draw the attention of online consumers. To combat these challenges, the main objective for businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they can gain visibility. Therefore, search engines continually adjust their algorithms and the criteria by which web pages are ranked, both to combat search engine misuse and spamming and to supply the most relevant information to searchers. Understanding these marketing strategies can, in turn, strengthen the relationship among information searchers, businesses, and search engines.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable examples are China, Japan, South Korea, Russia, and the Czech Republic, where, respectively, Baidu, Yahoo! Japan, Naver, Yandex, and Seznam are market leaders.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.