Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]
Facebook and LinkedIn are leading social media platforms where users can hyper-target their ads. Hypertargeting not only uses public profile information but also information users submit but hide from others.[17] There are several examples of firms initiating some form of online dialog with the public to foster relations with customers. According to Constantinides, Lorenzo and Gómez Borja (2008) "Business executives like Jonathan Swartz, President and CEO of Sun Microsystems, Steve Jobs CEO of Apple Computers, and McDonalds Vice President Bob Langert post regularly in their CEO blogs, encouraging customers to interact and freely express their feelings, ideas, suggestions, or remarks about their postings, the company or its products".[15] Using customer influencers (for example, popular bloggers) can be a very efficient and cost-effective method to launch new products or services.[18] Among political leaders in office, Prime Minister Narendra Modi has the highest number of followers at 40 million, and President Donald Trump ranks second with 25 million followers.[19] Modi employed social media platforms to circumvent traditional media channels and reach the young and urban population of India, which is estimated to be 200 million.

SEM is the wider discipline that incorporates SEO. SEM includes both paid search results (using tools like Google AdWords or Bing Ads, formerly known as Microsoft adCenter) and organic search results (SEO). SEM methods include paid advertising through AdWords or Bing Ads, pay-per-click (particularly beneficial for local providers, as it enables potential consumers to contact a company directly with one click), article submissions, advertising, and verifying that SEO work has been done. A keyword analysis is performed for both SEO and SEM, but not necessarily at the same time. SEM and SEO both need to be monitored and updated frequently to reflect evolving best practices.

When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well Google's algorithms render and index your content. This can result in suboptimal rankings.
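As a sketch (the directory names here are hypothetical), a robots.txt that blocks a private area without cutting off the assets Googlebot needs for rendering might look like:

```text
# robots.txt at the domain root (hypothetical directory names).
# Block only genuinely private areas; do NOT add rules such as
#   Disallow: /css/
#   Disallow: /js/
# which would prevent Googlebot from rendering pages that depend
# on those assets.
User-agent: Googlebot
Disallow: /private/
```

Because everything not disallowed is crawlable by default, leaving JavaScript, CSS, and image paths unmentioned is enough to keep them accessible.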
In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.
Smartphone - In this document, "mobile" or "mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
Social media platforms like Yelp and FourSquare are great for brick and mortar businesses looking to implement marketing on social media. Register on these sites to claim your location spot, and then consider extra incentives such as check-in rewards or special discounts. Remember, these visitors will have their phones in hand, so they will be able to write and post reviews. A lot of good reviews can significantly help sway prospective visitors to come in and build your business!
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
To prevent some users from linking to one version of a URL and others linking to a different version (which could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution. You may also use the rel="canonical" link element if you cannot redirect.
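As a sketch of both approaches (assuming an Apache server and the example.com hostname used elsewhere in this document), a 301 redirect from the non-preferred hostname could be configured as:

```text
# Apache .htaccess (hypothetical): permanently redirect the bare
# domain to the preferred "www" version with a 301 status code.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Where a redirect is not possible, a duplicate page can instead declare the preferred URL in its head:

```html
<!-- Placed in the <head> of the duplicate page -->
<link rel="canonical" href="https://www.example.com/page.html">
```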
Instagram has proven itself a powerful platform for marketers to reach their customers and prospects through sharing pictures and brief messages. According to a study by Simply Measured, 71% of the world's largest brands are now using Instagram as a marketing channel.[58] For companies, Instagram can be used as a tool to connect and communicate with current and potential customers. The company can present a more personal picture of its brand, and by doing so convey a better and truer picture of itself. The idea behind Instagram pictures is being on-the-go, a sense that the event is happening right now, and that adds another layer to the personal and accurate picture of the company. In fact, Thomas Rankin, co-founder and CEO of the program Dash Hudson, stated that when he approves a blogger's Instagram post before it is posted on behalf of a brand his company represents, his only negative feedback is if it looks too posed. "It's not an editorial photo," he explained, "We're not trying to be a magazine. We're trying to create a moment."[57] Instagram also provides companies the opportunity to reflect a true picture of the brand from the perspective of customers, for instance by encouraging user-generated content through hashtags.[59] Beyond the filter and hashtag functions, Instagram's 15-second videos and the recently added ability to send private messages between users have opened new opportunities for brands to connect with customers to a new extent, further promoting effective marketing on Instagram.
Websites such as Delicious, Digg, Slashdot, Diigo, Stumbleupon, and Reddit are popular social bookmarking sites used in social media promotion. Each of these sites is dedicated to the collection, curation, and organization of links to other websites that users deem to be of good quality. This process is "crowdsourced", allowing amateur social media network members to sort and prioritize links by relevance and general category. Due to the large user bases of these websites, any link from one of them to another, smaller website may cause a flash crowd, a sudden surge of interest in the target website. In addition to user-generated promotion, these sites also offer advertisements within individual user communities and categories.[62] Because ads can be placed in designated communities with a very specific target audience and demographic, they have far greater potential for traffic generation than ads selected simply through cookie and browser history.[63] Additionally, some of these websites have also implemented measures to make ads more relevant to users by allowing users to vote on which ones will be shown on pages they frequent.[64] The ability to redirect large volumes of web traffic and target specific, relevant audiences makes social bookmarking sites a valuable asset for social media marketers.
This involves tracking the volume of visits, leads, and customers to a website from the individual social channel. Google Analytics[110] is a free tool that shows the behavior and other information, such as demographics and device type used, of website visitors from social networks. This and other commercial offers can aid marketers in choosing the most effective social networks and social media marketing activities.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, each bubble represents a website, and programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites receiving more inbound links, or stronger links, are presumed to be more important and more likely what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it has only one inbound link, has an inbound link from a highly popular site (B), while site E does not.
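The link analysis described above can be illustrated with a minimal PageRank-style iteration. This is an illustrative sketch only, not any search engine's actual algorithm: the five-site graph is hypothetical, and the 0.85 damping factor is the conventional value from the original PageRank paper.

```python
# Minimal PageRank-style sketch. Each site's score is spread evenly
# across its outbound links on every iteration; the damping factor
# models a surfer who occasionally jumps to a random site.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each site to the list of sites it links to."""
    sites = list(links)
    rank = {site: 1.0 / len(sites) for site in sites}
    for _ in range(iterations):
        new_rank = {site: (1.0 - damping) / len(sites) for site in sites}
        for site, outbound in links.items():
            for target in outbound:
                new_rank[target] += damping * rank[site] / len(outbound)
        rank = new_rank
    return rank

# Hypothetical graph mirroring the diagram: A, D, and E link to B;
# B links to C; C links back to A.
graph = {
    "A": ["B"],
    "B": ["C"],
    "C": ["A"],
    "D": ["B"],
    "E": ["B"],
}
ranks = pagerank(graph)
best = max(ranks, key=ranks.get)
```

Running this yields B as the top-ranked site, and C, with its single inbound link from the popular B, outscores E, which also has one outbound link but no inbound links; this matches the "carry through" behavior the diagram describes.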
Another example where the "nofollow" attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the "nofollow" attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.
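As a sketch (the vendor URL and anchor text are hypothetical), a widget's attribution link with nofollow applied looks like:

```html
<!-- Widget credit link whose anchor text the site owner did not choose;
     rel="nofollow" tells search engines not to pass ranking credit. -->
<a href="https://widget-vendor.example/" rel="nofollow">Powered by ExampleWidget</a>
```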
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
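A sketch of such a robots.txt (the paths are hypothetical) that keeps crawlers out of the page types mentioned above:

```text
# robots.txt at the domain root: exclude internal search results and
# user-specific cart pages from crawling, per Google's 2007 guidance.
User-agent: *
Disallow: /search/
Disallow: /cart/
```

Note that this only discourages crawling; to keep an already-crawlable page out of the index, the robots meta tag on the page itself is the more direct tool.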
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]

Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users recognize an article's expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.


Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of the site's pages that are indexed by search engines (saturation) and how many backlinks the site has (popularity). Achieving visibility requires pages to contain the keywords people are searching for and to rank highly enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools for measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
While most search engine companies try to keep their processes a secret, their criteria for high spots on SERPs isn't a complete mystery. Search engines are successful only if they provide a user links to the best Web sites related to the user's search terms. If your site is the best skydiving resource on the Web, it benefits search engines to list the site high up on their SERPs. You just have to find a way to show search engines that your site belongs at the top of the heap. That's where search engine optimization (SEO) comes in -- it's a collection of techniques a webmaster can use to improve his or her site's SERP position.
Great Social Content — Consistent with other areas of online marketing, content reigns supreme when it comes to social media marketing. Make sure you post regularly and offer truly valuable information that your ideal customers will find helpful and interesting. The content that you share on your social networks can include social media images, videos, infographics, how-to guides and more.
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."