As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay-per-click programs offered by Open Text[7] in 1996 and then Goto.com[8] in 1998. Goto.com later changed its name[9] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary moneymakers[10] for search engines. In a market dominated by Google, Yahoo! and Microsoft announced in 2009 their intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[11]

The line between pay-per-click advertising and paid inclusion is often debatable. Some have lobbied for any paid listings to be labeled as advertisements, while defenders insist they are not actually ads, since the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In the general case, a site owner has no control over when a page will be crawled or added to a search engine index. Paid inclusion is particularly useful for pages that are dynamically generated and frequently modified.


Paid search advertising has not been without controversy and the issue of how search engines present advertising on their search result pages has been the target of a series of studies and reports[23][24][25] by Consumer Reports WebWatch. The Federal Trade Commission (FTC) also issued a letter[26] in 2002 about the importance of disclosure of paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader.
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
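Such a tag might look like the following (the site, business name, and wording are invented for illustration):

```html
<!-- Hypothetical page: the business and description text are made up. -->
<head>
  <title>Example Baseball Cards - Buy, Sell, Trade</title>
  <meta name="description" content="Example Baseball Cards offers a large
    selection of vintage and modern baseball cards for sale, plus daily
    baseball news and card-collecting tips.">
</head>
```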
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and the same content on mobile as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link elements, and other meta tags - on all versions of the pages.
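When mobile pages live at separate URLs, one common way to keep metadata consistent across versions is to cross-link them; a minimal sketch, with hypothetical URLs:

```html
<!-- On the desktop page, e.g. https://www.example.com/page: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the corresponding mobile page, https://m.example.com/page: -->
<link rel="canonical" href="https://www.example.com/page">
```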
As this system developed, prices rose under the high level of competition. Many advertisers chose to expand their activities, buying placement on more search engines and adding more keywords. The more advertisers are willing to pay for clicks, the higher their ads rank, which leads to higher traffic.[15] PPC comes at a cost: the top position for a given keyword might cost $5 per click, and the third position $4.50. The third advertiser pays about 10% less per click than the top advertiser but receives roughly 50% less traffic.[15] Advertisers must therefore consider their return on investment and determine whether the increase in traffic is worth the increase in cost.
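The trade-off can be made concrete with a small calculation. The click prices and the 50% traffic difference come from the paragraph above; the click volume, conversion rate, and revenue per sale are invented for illustration:

```python
def ppc_profit(clicks, cost_per_click, conversion_rate, revenue_per_sale):
    """Simple PPC return model: profit = revenue from conversions minus click spend.
    Conversion rate and revenue per sale are illustrative assumptions."""
    spend = clicks * cost_per_click
    revenue = clicks * conversion_rate * revenue_per_sale
    return revenue - spend

# Top position: $5.00/click; third position: $4.50/click with ~half the traffic.
top = ppc_profit(clicks=1000, cost_per_click=5.00,
                 conversion_rate=0.04, revenue_per_sale=150)
third = ppc_profit(clicks=500, cost_per_click=4.50,
                   conversion_rate=0.04, revenue_per_sale=150)
print(top, third)  # 1000.0 750.0
```

Under these assumed figures the top position earns more profit in absolute terms, but the third position returns more per dollar spent ($750 on $2,250 versus $1,000 on $5,000), which is exactly the return-on-investment question the paragraph describes.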
Engagement in social media for the purpose of a social media strategy is divided into two parts. The first is proactive, regular posting of new online content: digital photos, digital videos, text, and conversations, as well as sharing content and information from others via weblinks. The second part is reactive: conversing with social media users who reach out to your social media profiles through commenting or messaging.[22] Traditional media such as TV news shows are limited to one-way interaction with customers, or 'push and tell', where only specific information is given to the customer with few or limited mechanisms to obtain customer feedback. Traditional media such as physical newspapers do give readers the option of sending a letter to the editor, but this is a relatively slow process, as the editorial board has to review the letter and decide whether it is appropriate for publication. Social media, on the other hand, is participative and open; participants can instantly share their views on brands, products, and services. Traditional media gave control of the message to the marketer, whereas social media shifts the balance to the consumer or citizen.
Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged. The term "search engine marketing" was popularized by Danny Sullivan in 2001[12] to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals.
A navigational page is a simple page on your site that displays the structure of your website, usually as a hierarchical listing of its pages. Visitors may visit this page if they are having problems finding pages on your site. Search engines will also visit this page, which helps them get good crawl coverage of your site, but it is mainly aimed at human visitors.
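The body of such a page can be as simple as a nested list of links; the site structure below is invented for illustration:

```html
<!-- A minimal navigational page: a hierarchical listing of site pages. -->
<ul>
  <li><a href="/cards/">Cards</a>
    <ul>
      <li><a href="/cards/vintage/">Vintage cards</a></li>
      <li><a href="/cards/modern/">Modern cards</a></li>
    </ul>
  </li>
  <li><a href="/news/">News</a></li>
  <li><a href="/about/">About</a></li>
</ul>
```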

Sharing Curated Links — While using social media for marketing is a great way to leverage your own unique, original content to gain followers, fans, and devotees, it’s also an opportunity to link to outside articles as well. If other sources provide great, valuable information you think your target audience will enjoy, don’t be shy about linking to them. Curating and linking to outside sources improves trust and reliability, and you may even get some links in return.
Facebook pages are far more detailed than Twitter accounts. They allow a product to provide videos, photos, longer descriptions, and testimonials, and followers can comment on the product pages for others to see. Facebook can link back to the product's Twitter page, as well as send out event reminders. As of May 2015, 93% of business marketers use Facebook to promote their brand.[36] A study from 2011 attributed 84% of "engagement" (clicks and likes) to links back to Facebook advertising.[37] By 2014, Facebook had restricted the content published from business and brand pages. Adjustments in Facebook algorithms have reduced the audience for non-paying business pages (that have at least 500,000 "Likes") from 16% in 2012 down to 2% in February 2014.[38][39][40]
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
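In practice this means adding rel="nofollow" to links you don't vouch for; the comment text and URL below are invented for illustration:

```html
<!-- A hypothetical user-submitted comment with its link marked nofollow,
     so the link passes no reputation from your page. -->
<p>Great post! Check out <a href="https://example.com/unvetted-site"
   rel="nofollow">my site</a>.</p>
```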

Tablet - We consider tablets as devices in their own class, so when we speak of mobile devices, we generally do not include tablets in the definition. Tablets tend to have larger screens, which means that, unless you offer tablet-optimized content, you can assume that users expect to see your site as it would look on a desktop browser rather than on a smartphone browser.
Back-end tools, including Web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) a log file analyzing tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
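A page tag can be as small as a single tracking image; the analytics endpoint and query parameters below are invented for illustration:

```html
<!-- Hypothetical page tag: a 1x1 image beacon that reports a page view
     to an analytics server when the page loads. -->
<img src="https://analytics.example.com/hit?page=/pricing&amp;event=view"
     width="1" height="1" alt="" style="display:none">
```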
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]

Blogging website Tumblr first launched ad products on May 29, 2012.[69] Rather than relying on simple banner ads, Tumblr requires advertisers to create a Tumblr blog so the content of those blogs can be featured on the site.[70] Within one year, Tumblr had created four native ad formats on web and mobile and had more than 100 brands advertising on the site, with 500 cumulative sponsored posts.


Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
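One common way to consolidate these variants is a server-side redirect to a single canonical host. A minimal sketch, assuming an nginx server and the hypothetical canonical host https://www.example.com:

```nginx
# Redirect plain-http requests for either hostname to the canonical
# https://www host, so all variants resolve to one version of the site.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}
```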
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?

In the social sphere, things change fast. New networks emerge, while others go through significant demographic shifts. Your business will go through periods of change as well. All of this means that your social media strategy should be a living document that you look at regularly and adjust as needed. Refer to it often to keep you on track, but don’t be afraid to make changes so that it better reflects new goals, tools, or plans.

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
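The spider/indexer split described above can be sketched in a few lines of Python. This is a toy illustration of the idea, not how any real engine is implemented, and the sample page is invented:

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Toy spider/indexer: collects outgoing links (to hand to a crawl
    scheduler) and word positions from one downloaded page."""
    def __init__(self):
        super().__init__()
        self.links = []           # hrefs found on the page
        self.word_positions = {}  # word -> list of positions in the text
        self._pos = 0

    def handle_starttag(self, tag, attrs):
        # The "spider" step: record links to crawl later.
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        # The "indexer" step: record each word and where it occurs.
        for raw in data.lower().split():
            word = raw.strip(".,;:!?")
            self.word_positions.setdefault(word, []).append(self._pos)
            self._pos += 1

page = ('<html><body><h1>Widgets</h1>'
        '<p>Cheap widgets here.</p>'
        '<a href="/about">About</a></body></html>')
indexer = PageIndexer()
indexer.feed(page)
print(indexer.links)                      # ['/about']
print(indexer.word_positions["widgets"])  # [0, 2]
```

A real engine would also weight words (for example, by their appearance in headings) and feed the collected links back into the crawl scheduler, as the paragraph describes.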