As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by the targeted audience. SEO is performed because a website receives more visitors from a search engine when it ranks higher in the search engine results page (SERP). These visitors can then be converted into customers.[4]
On April 24, 2012, Google began penalizing companies that buy links for the purpose of passing PageRank; the update was called Penguin. Since then, Google has rolled out several further Penguin and Panda updates. SEM, however, has nothing to do with link buying and focuses on organic SEO and PPC management. As of October 20, 2014, Google had released three official revisions of the Penguin update.
Another way search engine marketing is managed is by contextual advertising. Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads come into view of browsers who are seeking information from those sites. A successful SEM plan captures the relationships among information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but in recent years the use of search engines for accessing information has become vital to increasing business opportunities.[31] The use of SEM strategic tools for businesses such as tourism can attract potential consumers to view their products, but it can also pose various challenges.[32] These challenges include the competition companies face within their industry and other sources of information that could draw the attention of online consumers.[31] To meet these challenges, the main objective of businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they gain visibility. Search engines therefore adjust their algorithms and the shifting criteria by which web pages are ranked, both to combat search engine misuse and spamming and to supply the most relevant information to searchers.[31] This can strengthen the relationship among information searchers, businesses, and search engines by clarifying the marketing strategies used to attract business.
Social media marketing involves the use of social networks, consumers' online brand-related activities (COBRAs) and electronic word of mouth (eWOM)[75][76] to advertise online successfully. Social networks such as Facebook and Twitter provide advertisers with information about the likes and dislikes of their consumers.[61] This technique is crucial, as it provides businesses with a "target audience".[61] With social networks, information relevant to the user's likes is available to businesses, which then advertise accordingly. Uploading a picture of your "new Converse sneakers" to Facebook[75] is an example of a COBRA.[75][76] Electronic recommendations and appraisals are a convenient way to have a product promoted via "consumer-to-consumer interactions".[75] An example of eWOM would be an online hotel review;[77] the hotel company can see two possible outcomes based on its service. Good service results in a positive review, which gives the hotel free advertising via social media; poor service results in a negative consumer review, which can potentially harm the company's reputation.[78]
Facebook Ads and other social media ad platforms, for example, are pay-per-click platforms that do not fall under the SEM category. Instead of showing your ads to people who are searching for related content, as search ads do, social media sites introduce your product to people who happen to be browsing through their feeds. These are two very different types of online advertising.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
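As a rough illustration of why compliance is voluntary, the sketch below uses Python's standard-library urllib.robotparser with a hypothetical rule set and hypothetical URLs to show how a well-behaved crawler checks robots.txt before fetching a page; a rogue crawler could simply skip this check and request the page anyway.

```python
# A minimal sketch of how a *well-behaved* crawler consults robots.txt before
# fetching a URL, using Python's standard-library robot parser. The rules and
# URLs below are hypothetical. Nothing here stops a non-compliant crawler from
# skipping this check and requesting the disallowed pages directly.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for example.com
robots_txt = """
User-agent: *
Disallow: /private/
Disallow: /drafts/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in ("https://example.com/products.html",
            "https://example.com/private/payroll.html"):
    if parser.can_fetch("MyCrawler", url):
        print("allowed to fetch:", url)       # a polite crawler proceeds
    else:
        print("skipping (disallowed):", url)  # compliance is purely voluntary
```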
Paid search advertising has not been without controversy and the issue of how search engines present advertising on their search result pages has been the target of a series of studies and reports[23][24][25] by Consumer Reports WebWatch. The Federal Trade Commission (FTC) also issued a letter[26] in 2002 about the importance of disclosure of paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader.
Paid inclusion is a search engine marketing method in itself, but also a tool of search engine optimization, since experts and firms can test out different approaches to improving ranking and see the results often within a couple of days, instead of waiting weeks or months. Knowledge gained this way can be used to optimize other web pages, without paying the search engine company.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
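To make the subdomain point concrete, here is a minimal sketch (hypothetical hostnames and paths, written in plain Python rather than with the Search Console generator) of producing a separate robots.txt for each host, since crawlers request the file from the root of each subdomain individually.

```python
# A rough sketch (hypothetical subdomains and paths) of generating a separate
# robots.txt per subdomain, because crawlers fetch the file from the root of
# each host individually (e.g. https://blog.example.com/robots.txt).
from pathlib import Path

# Paths we don't want crawled, per subdomain -- illustrative values only.
disallow_rules = {
    "www.example.com": ["/search", "/cart/"],
    "blog.example.com": ["/drafts/"],
}

for host, paths in disallow_rules.items():
    lines = ["User-agent: *"] + [f"Disallow: {p}" for p in paths]
    out = Path(host) / "robots.txt"          # one file per host's document root
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text("\n".join(lines) + "\n")
    print(out, "->", lines)
```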

Planned content begins with the creative/marketing team generating ideas; once they have completed their ideas, they send them off for approval. There are two general ways of doing so. The first is where each sector approves the plan one after another: editor, brand, followed by the legal team (Brito, 2013). Sectors may differ depending on the size and philosophy of the business. The second is where each sector is given 24 hours (or some designated time) to sign off or disapprove; if no action is taken within the 24-hour period, the original plan is implemented. Planned content is often noticeable to customers and is unoriginal or lacks excitement, but it is also a safer option for avoiding unnecessary backlash from the public.[87] Both routes for planned content are time-consuming: as above, the first route can take 72 hours to be approved. Although the second route can be significantly shorter, it also carries more risk, particularly in the legal department.
Another ethical controversy associated with search marketing has been the issue of trademark infringement. The debate as to whether third parties should have the right to bid on their competitors' brand names has been underway for years. In 2009 Google changed its policy, which formerly prohibited these tactics, allowing third parties to bid on branded terms as long as their landing page in fact provides information on the trademarked term.[27] Though the policy has been changed, this continues to be a source of heated debate.[28]

Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
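As one possible approach, the sketch below (hypothetical page markup, standard-library Python only) auto-generates a description meta tag by taking the first paragraph of each page's content and trimming it to a typical snippet length; the 155-character limit is an assumption, not a documented rule.

```python
# A minimal sketch (assuming pages are available as HTML strings) of
# auto-generating a description meta tag from each page's own content when
# hand-writing thousands of descriptions isn't feasible. It takes the first
# text paragraph and trims it to a typical snippet length.
import re
import html

SNIPPET_LEN = 155  # rough length search snippets tend to show; an assumption

def auto_description(page_html: str) -> str:
    # Grab the first <p>...</p> block; fall back to stripping all tags.
    match = re.search(r"<p[^>]*>(.*?)</p>", page_html, re.S | re.I)
    text = match.group(1) if match else page_html
    text = html.unescape(re.sub(r"<[^>]+>", " ", text))  # drop inner tags
    text = " ".join(text.split())                         # collapse whitespace
    return text[:SNIPPET_LEN].rstrip()

page = ("<html><body><h1>Baseball cards</h1>"
        "<p>We buy, sell and trade vintage baseball cards from the 1950s "
        "onward, with weekly price guides.</p></body></html>")
print(f'<meta name="description" content="{auto_description(page)}">')
```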
Since heading tags typically make the text they contain larger than normal text on the page, they provide a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
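For instance, a short sketch like the following (hypothetical page fragment, standard-library html.parser) can pull out the heading tags in document order and flag places where the hierarchy skips a level, such as an h1 jumping straight to an h3.

```python
# A small sketch (hypothetical page, standard library only) that extracts
# heading tags in document order and flags jumps in the hierarchy, which can
# make a page's structure harder for users to follow.
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Record h1..h6 tags as they appear in the document.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

page = "<h1>Guide</h1><h2>Setup</h2><h3>Install</h3><h2>Usage</h2>"
outline = HeadingOutline()
outline.feed(page)

previous = 0
for level in outline.levels:
    indent = "  " * (level - 1)
    note = " (skips a level)" if level > previous + 1 else ""
    print(f"{indent}h{level}{note}")
    previous = level
```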

Disney/Pixar's Monsters University: Created a Tumblr account, MUGrumblr, saying that the account is maintained by a 'Monstropolis transplant' and 'self-diagnosed coffee addict' who is currently a sophomore at Monsters University.[73] A "student" from Monsters University uploaded memes, animated GIFs, and Instagram-like photos that are related to the sequel movie.
Facebook had an estimated 144.27 million views in 2016, approximately 12.9 million per month.[109] Despite this high volume of traffic, very little has been done to protect the millions of users who log on to Facebook and other social media platforms each month. President Barack Obama tried to work with the Federal Trade Commission (FTC) to regulate data mining. He proposed the Privacy Bill of Rights, which would protect the average user from having their private information downloaded and shared with third-party companies. The proposed laws would give consumers more control over what information companies can collect.[107] President Obama was unable to pass most of these laws through Congress, and it is unclear what President Trump will do with regard to social media marketing ethics.
We expect advertisements to be visible. However, you should not let advertisements distract users or prevent them from consuming the site content. Avoid, for example, advertisements, supplemental content, or interstitial pages (pages displayed before or after the content the user is expecting) that make it difficult to use the website. Learn more about this topic.
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
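As a rough illustration of why large-scale link schemes could influence PageRank, the following sketch (a toy four-page link graph and the commonly cited 0.85 damping factor, not Google's actual implementation) runs the basic power-iteration form of the algorithm, in which a page's score is driven by the scores of the pages linking to it.

```python
# An illustrative sketch of the PageRank idea on a toy, hypothetical link
# graph: each page's score depends on the scores of the pages linking to it,
# which is why manufactured links and link farms could inflate rankings.
links = {                      # page -> pages it links to (illustrative only)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
damping = 0.85                 # commonly cited damping factor
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):            # power iteration until scores settle
    new_rank = {}
    for p in pages:
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

for p, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(p, round(score, 3))
```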