A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, which helps them get good crawl coverage of your site's pages, it's mainly aimed at human visitors.
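As a minimal sketch of such a page, the function below renders a hierarchical listing from a set of URL paths; the paths and the indentation-based layout are illustrative, not a prescribed format:

```python
def sitemap_html(pages):
    """Render a simple, human-readable sitemap listing from URL paths.
    Indentation hints at hierarchy; the paths here are illustrative."""
    items = []
    for path in sorted(pages):
        depth = path.strip("/").count("/")            # nesting level = path depth
        name = path.rstrip("/").rsplit("/", 1)[-1] or "Home"
        items.append('%s<li><a href="%s">%s</a></li>'
                     % ("  " * depth, path, name.replace("-", " ").title()))
    return "<ul>\n%s\n</ul>" % "\n".join(items)

print(sitemap_html(["/", "/products/", "/products/widgets/", "/about/"]))
```

A real site would generate this list from its page database or CMS rather than a hard-coded list.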
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
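To illustrate what "automatically nofollowing" comments might look like, here is a minimal sketch that rewrites anchor tags in user-generated HTML; the regex-based approach and function name are assumptions for illustration, and a production site would use a proper HTML parser or its blogging platform's built-in setting:

```python
import re

def nofollow_links(comment_html: str) -> str:
    """Add rel="nofollow" to every <a> tag in user-generated HTML
    that does not already carry a rel attribute."""
    def add_rel(match: re.Match) -> str:
        tag = match.group(0)
        if 'rel=' in tag:               # leave existing rel attributes alone
            return tag
        return tag[:-1] + ' rel="nofollow">'
    # Match opening <a ...> tags only; a real site would use an HTML parser.
    return re.sub(r'<a\b[^>]*>', add_rel, comment_html)

comment = '<p>Nice post! Visit <a href="http://example.com/spam">my site</a></p>'
print(nofollow_links(comment))
```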
If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a Web agency, or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index, and understand your content.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
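The random-surfer model behind PageRank can be sketched as a short power iteration, assuming the commonly cited damping factor of 0.85; the link graph and function name below are illustrative, not Google's implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Estimate PageRank by power iteration over a dict mapping
    each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}                 # start uniform
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}  # random-jump share
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share               # pass rank along links
            else:                                      # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# C is linked from both A and B, so it accumulates the most rank.
```

This captures the "some links are stronger than others" effect: a link from a high-rank page contributes more than a link from a low-rank one.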
Provide full functionality on all devices. Mobile users expect the same functionality, such as commenting and check-out, and the same content on mobile as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata, such as titles, descriptions, link-elements, and other meta-tags, on all versions of the pages.
Social media can be used not only as public relations and direct marketing tools but also as communication channels targeting very specific audiences with social media influencers and social media personalities and as effective customer engagement tools.[15] Technologies predating social media, such as broadcast TV and newspapers, can also provide advertisers with a fairly targeted audience, given that an ad placed during a sports game broadcast or in the sports section of a newspaper is likely to be read by sports fans. However, social media websites can target niche markets even more precisely. Using digital tools such as Google Adsense, advertisers can target their ads to very specific demographics, such as people who are interested in social entrepreneurship, political activism associated with a particular political party, or video gaming. Google Adsense does this by looking for keywords in social media users' online posts and comments. It would be hard for a TV station or paper-based newspaper to provide ads that are this targeted (though not impossible, as can be seen with "special issue" sections on niche issues, which newspapers can use to sell targeted ads).
Sponsored radar – Radar picks up exceptional posts from the whole Tumblr community based on their originality and creativity. It is placed on the right side next to the Dashboard, and it typically earns 120 million daily impressions. Sponsored radar allows advertisers to place their posts there to have an opportunity to earn new followers, reblogs, and likes.
In 2007, U.S. advertisers spent US $24.6 billion on search engine marketing.[3] In Q2 2015, Google (73.7%) and the Yahoo/Bing (26.3%) partnership accounted for almost 100% of U.S. search engine spend.[4] As of 2006, SEM was growing much faster than traditional advertising and even other channels of online marketing.[5] Managing search campaigns is either done directly with the SEM vendor or through an SEM tool provider. It may also be self-serve or through an advertising agency. As of October 2016, Google leads the global search engine market with a market share of 89.3%. Bing comes second with a market share of 4.36%, Yahoo comes third with a market share of 3.3%, and Chinese search engine Baidu is fourth globally with a share of about 0.68%.[6]

Mobile devices have become increasingly popular, with 5.7 billion people using them worldwide.[13] This has played a role in the way consumers interact with media and has many further implications for TV ratings, advertising, mobile commerce, and more. Mobile media consumption such as mobile audio streaming or mobile video is on the rise; in the United States, more than 100 million users are projected to access online video content via mobile device. Mobile video revenue consists of pay-per-view downloads, advertising, and subscriptions. As of 2013, worldwide mobile phone Internet user penetration was 73.4%. In 2017, figures suggested that more than 90% of Internet users would access online content through their phones.[14]


Facebook pages are far more detailed than Twitter accounts. They allow a product to provide videos, photos, longer descriptions, and testimonials, where followers can comment on the product pages for others to see. Facebook can link back to the product's Twitter page, as well as send out event reminders. As of May 2015, 93% of business marketers use Facebook to promote their brand.[36] A study from 2011 attributed 84% of "engagement", or clicks and likes that link back, to Facebook advertising.[37] By 2014, Facebook had restricted the content published from business and brand pages. Adjustments in Facebook algorithms have reduced the audience for non-paying business pages (that have at least 500,000 "Likes") from 16% in 2012 down to 2% in February 2014.[38][39][40]

Another way search engine marketing is managed is by contextual advertising. Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads jump into the circle of vision of browsers who are seeking information from those sites. A successful SEM plan is the approach to capture the relationships amongst information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but over the past years the use of search engines for accessing information has become vital to increase business opportunities.[31] The use of SEM strategic tools for businesses such as tourism can attract potential consumers to view their products, but it could also pose various challenges.[32] These challenges could be the competition that companies face amongst their industry and other sources of information that could draw the attention of online consumers.[31] To combat these challenges, the main objective for businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they can gain visibility. Therefore, search engines keep adjusting and developing their algorithms and the criteria by which web pages are ranked, both to combat search engine misuse and spamming and to supply the most relevant information to searchers.[31] This could enhance the relationship amongst information searchers, businesses, and search engines by understanding the strategies of marketing to attract business.

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, which include spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]

Another ethical controversy associated with search marketing has been the issue of trademark infringement. The debate as to whether third parties should have the right to bid on their competitors' brand names has been underway for years. In 2009 Google changed their policy, which formerly prohibited these tactics, allowing third parties to bid on branded terms as long as their landing page in fact provides information on the trademarked term.[27] Though the policy has been changed, this continues to be a source of heated debate.[28]
The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!,[18] mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com[19][20]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
Search engines reward you when sites link to yours – they assume that your site must be valuable and you’ll rank higher in search results. And the higher the “rank” of the sites that link to you, the more they count in your own ranking. You want links from popular industry authorities, recognized directories, and reputable companies and organizations.
Planned content begins with the creative/marketing team generating their ideas; once they have completed their ideas, they send them off for approval. There are two general ways of doing so. The first is where each sector approves the plan one after another: editor, brand, followed by the legal team (Brito, 2013). Sectors may differ depending on the size and philosophy of the business. The second is where each sector is given 24 hours (or such designated time) to sign off or disapprove. If no action is given within the 24-hour period, the original plan is implemented. Planned content is often noticeable to customers and is unoriginal or lacks excitement, but is also a safer option to avoid unnecessary backlash from the public.[87] Both routes for planned content are time-consuming: as above, the first route can take 72 hours to be approved. Although the second route can be significantly shorter, it also holds more risk, particularly in the legal department.

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
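The spider-then-indexer pipeline described above can be sketched in a few lines; the in-memory inverted index and function names are illustrative assumptions, and a real crawler would add fetching over HTTP, politeness delays, deduplication, and a persistent scheduler:

```python
from html.parser import HTMLParser
from collections import defaultdict

class PageParser(HTMLParser):
    """Extract outgoing links and visible words from one downloaded page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":                         # spider step: collect links
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)
    def handle_data(self, data):               # indexer step: collect words
        self.words.extend(data.lower().split())

def index_page(url, html, inverted_index):
    """Record each word's page and position; return links for the scheduler."""
    parser = PageParser()
    parser.feed(html)
    for position, word in enumerate(parser.words):
        inverted_index[word].append((url, position))   # word -> (page, location)
    return parser.links        # handed to the scheduler for later crawling

index = defaultdict(list)
todo = index_page("http://example.com/",
                  '<p>Hello search</p><a href="/about">About</a>', index)
# "hello" and "search" are now in the index; todo holds the extracted links.
```

Storing positions alongside pages is what lets the indexer later weight words by where they appear, as the text describes.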