Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs), primarily through paid advertising.[1] SEM may incorporate search engine optimization (SEO), which adjusts or rewrites website content and site architecture to achieve a higher ranking in search engine results pages and to enhance pay-per-click (PPC) listings.[2]
One of the main purposes of employing social media in marketing is as a communications tool that makes companies accessible to those interested in their products and visible to those who have no knowledge of them.[26] Companies use social media to create buzz, and to learn from and target customers. It is the only form of marketing that can reach consumers at each and every stage of the consumer decision journey.[27] Marketing through social media has other benefits as well. Of the top 10 factors that correlate with a strong Google organic search, seven are social media dependent, meaning that brands that are less active or inactive on social media tend to show up less in Google searches.[28] While platforms such as Twitter, Facebook, and Google+ have larger numbers of monthly users, visual-media-sharing mobile platforms garner higher interaction rates, have registered the fastest growth, and have changed the ways in which consumers engage with brand content. Instagram has an interaction rate of 1.46% with an average of 130 million monthly users, as opposed to Twitter's 0.03% interaction rate with an average of 210 million monthly users.[28] Unlike traditional media, which are often cost-prohibitive for many companies, a social media strategy does not require an astronomical budget.[29]
The line between pay-per-click advertising and paid inclusion is often debated. Some have lobbied for any paid listings to be labeled as advertisements, while defenders insist they are not actually ads because the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In general, one has no control over when a page will be crawled or added to a search engine index. Paid inclusion is particularly useful for pages that are dynamically generated and frequently modified.

To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots-specific meta tag (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
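For illustration, a minimal robots.txt along these lines might read as follows; the Disallow paths are hypothetical and would be adapted to a site's own shopping-cart and internal-search URLs:

```
# Hypothetical robots.txt served from the domain root
User-agent: *          # applies to all crawlers
Disallow: /cart/       # keep user-specific shopping-cart pages out
Disallow: /search/     # keep internal search results out
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.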


Popular social media platforms such as Facebook, Twitter, LinkedIn, and other social networks can provide marketers with a hard number for how large their audience is; nevertheless, a large audience does not always translate into large sales volumes. Therefore, effective SMM cannot be measured by audience size alone but rather by vigorous audience activity such as social shares, re-tweets, etc.

Since heading tags typically make text contained in them larger than normal text on the page, they give users a visual cue that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes, used in order, create a hierarchical structure for your content, making it easier for users to navigate through your document.
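As a sketch of headings used in order to form such a hierarchy (the titles are invented for illustration):

```html
<!-- Hypothetical heading hierarchy: one h1, with h2/h3 subsections in order -->
<h1>Ice Cream Flavors</h1>
<h2>Chocolate Varieties</h2>
<h3>Dark Chocolate Fudge</h3>
<h2>Fruit Varieties</h2>
<h3>Strawberry Swirl</h3>
```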
In 2013, the Tenth Circuit Court of Appeals held in Lens.com, Inc. v. 1-800 Contacts, Inc. that online contact lens seller Lens.com did not commit trademark infringement when it purchased search advertisements using competitor 1-800 Contacts' federally registered 1800 CONTACTS trademark as a keyword. In August 2016, the Federal Trade Commission filed an administrative complaint against 1-800 Contacts alleging, among other things, that its trademark enforcement practices in the search engine marketing space have unreasonably restrained competition in violation of the FTC Act. 1-800 Contacts has denied all wrongdoing and is scheduled to appear before an FTC administrative law judge in April 2017.[29]
AdWords is recognized as a web-based advertising tool, since it uses keywords that can deliver adverts explicitly to web users looking for information about a particular product or service. It is flexible and provides customizable options such as Ad Extensions and access to non-search sites, leveraging the display network to help increase brand awareness. The service hinges on cost-per-click (CPC) pricing, in which the advertiser chooses a maximum cost per day for the campaign and pays only when the advert is clicked. SEM companies have embarked on AdWords projects as a way to publicize their SEM and SEO services. One of the most successful approaches was to focus on making sure that PPC advertising funds were prudently invested. Moreover, SEM companies have described AdWords as a practical tool for increasing a consumer's return on investment in Internet advertising. The use of conversion tracking and Google Analytics was deemed practical for presenting to clients the performance of their campaigns from click to conversion. AdWords has enabled SEM companies to train their clients on the tool and deliver better campaign performance. AdWords campaigns have been reported to grow web traffic for a number of client websites by as much as 250% in only nine months.[30]
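To make the CPC arithmetic concrete, here is a minimal sketch; the budget and bid figures are invented for illustration, not taken from any actual campaign:

```python
# Hypothetical CPC campaign figures (illustrative only)
daily_budget = 50.00  # maximum spend per day, chosen by the advertiser
max_cpc = 0.50        # maximum amount paid for a single click

# Payment applies only when the ad is clicked, so the daily budget
# caps the number of billable clicks per day.
max_clicks_per_day = daily_budget / max_cpc
print(f"At most {max_clicks_per_day:.0f} billable clicks per day")  # -> 100
```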
In addition to their own in-house promotional photos, the brand pushes a lot of user-generated content. Perhaps most notable is the fact that Ben & Jerry's does not shy away from activism and politically charged posts. Some might argue that politics and ice cream don't mix, but the brand has generated plenty of buzz by putting its values front and center.

Many customers are turning to social media to express their appreciation of, or frustration with, brands, products, or services. Therefore, marketers can measure the frequency with which customers discuss their brand and judge how effective their SMM strategies are. In recent studies, 72% of people surveyed said that they expected a response to their complaints on Twitter within an hour.[111]
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]
Social media often feeds into the discovery of new content such as news stories, and “discovery” is a search activity. Social media can also help build links that in turn support SEO efforts. Many people also perform searches at social media sites to find social media content. Social connections may also impact the relevancy of some search results, either within a social media network or at a ‘mainstream’ search engine.
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.

Keyword research and analysis involves three "steps": ensuring the site can be indexed in the search engines, finding the most relevant and popular keywords for the site and its products, and using those keywords on the site in a way that will generate and convert traffic. A follow-on effect of keyword analysis and research is the search perception impact.[13] Search perception impact describes the identified impact of a brand's search results on consumer perception, including title and meta tags, site indexing, and keyword focus. As online searching is often the first step for potential consumers/customers, the search perception impact shapes the brand impression for each individual.


You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
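For example (hypothetical hosts), each host serves its own robots.txt from its own root:

```
https://example.com/robots.txt        # covers example.com only
https://shop.example.com/robots.txt   # a separate file for the subdomain
```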

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[41] in addition to its URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
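A minimal XML Sitemap of the kind that can be submitted through Google Search Console might look like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical Sitemap listing pages to be found by crawlers -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/hard-to-find-page</loc>
  </url>
</urlset>
```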
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
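As a rough sketch of why term density was so easy to manipulate, consider a naive density score; this is an illustrative formula, not any engine's actual algorithm:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Naive keyword density: occurrences of the keyword
    divided by the total number of words in the text."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Stuffing a page with the keyword trivially inflates the score:
stuffed = "cheap flights cheap flights book cheap flights now"
print(f"{keyword_density(stuffed, 'cheap'):.0%}")  # -> 38%
```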