Since heading tags typically make the text they contain larger than normal text on the page, they give users a visual cue that this text is important and can help them understand something about the type of content underneath the heading. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
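As a sketch, the hierarchy described above maps directly onto descending HTML heading levels (the page topic and section names here are invented examples):

```html
<h1>Coffee Brewing Guide</h1>      <!-- page topic: one h1 per page -->
  <h2>Equipment</h2>               <!-- major section -->
    <h3>Grinders</h3>              <!-- subsection under Equipment -->
  <h2>Brewing Methods</h2>         <!-- next major section -->
    <h3>Pour-over</h3>
    <h3>French press</h3>
```

Skipping levels (for instance, jumping from `h1` straight to `h4`) breaks the hierarchy that users and assistive tools rely on.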
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters, through tools that work with log files, to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) a log file analyzing tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
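Page tagging of the kind described above is typically implemented by embedding a small script or tracking image in each page. A minimal, hypothetical sketch follows; the endpoint URL and parameter names are invented for illustration, not those of any vendor named above:

```html
<!-- Hypothetical page tag: reports a page view to an analytics endpoint. -->
<script>
  // Build a 1x1 tracking image whose query string carries the page data;
  // requesting the image is what records the visit on the analytics server.
  var img = new Image(1, 1);
  img.src = "https://analytics.example.com/collect" +
            "?page=" + encodeURIComponent(location.pathname) +
            "&ref="  + encodeURIComponent(document.referrer);
</script>
<!-- Fallback for visitors with JavaScript disabled: a plain tracking pixel. -->
<noscript>
  <img src="https://analytics.example.com/collect?page=unknown"
       width="1" height="1" alt="">
</noscript>
```

Because the tag runs in the visitor's browser rather than reading server logs after the fact, it can capture client-side actions (and misses visitors who block scripts and images), which is why tag-based and log-file tools report slightly different numbers.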
Marketers target influential people on social media who are recognised as opinion leaders (OLs) and opinion formers (OFs) to send messages to their target audiences and amplify the impact of their message. A social media post by an opinion leader can have a much greater impact (via forwarding or "liking" of the post) than a post by a regular user. Marketers have come to understand that "consumers are more prone to believe in other individuals" whom they trust (Sepp, Liljander, & Gummerus, 2011). OLs and OFs can also send their own messages about products and services of their choosing (Fill, Hughes, & De Francesco, 2013, p. 216). Opinion leaders and formers attract such strong followings because their opinions are valued or trusted (Clement, Proppe, & Rott, 2007). They can review products and services for their followers, and those reviews can be positive or negative towards the brand. OLs and OFs are people who hold social status and who, because of their personality, beliefs, values, etc., have the potential to influence other people (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189). They usually have a large number of followers, otherwise known as their reference, membership, or aspirational group (Kotler et al., 2013, p. 189). When an OL or OF supports a brand's product by posting a photo, video, or written recommendation on a blog, their followers may be influenced; because they trust the OL/OF, there is a high chance of the brand selling more products or building a following. Having an OL/OF also helps spread word of mouth among reference groups and/or membership groups, e.g. family, friends, and colleagues (Kotler et al., 2013, p. 189).[81][82][83][84] The adjusted communication model shows the use of opinion leaders and opinion formers.
The sender/source gives the message to many OLs/OFs, who pass it on along with their personal opinion; the receivers (followers/groups) form their own opinion and send their own message on to their group (friends, family, etc.) (Dahlen, Lange, & Smith, 2010, p. 39).[85]
Since social media marketing first emerged, strategists and marketers have been getting smarter and more careful about the way they collect information and distribute advertisements. With the presence of data-collecting companies, there is no longer a need to target specific audiences manually. This can be seen as a large ethical gray area: for many users it is a breach of privacy, but there are no laws that prevent these companies from using the information provided on their websites. Companies like Equifax, Inc., TransUnion Corp, and LexisNexis Group thrive on collecting and sharing the personal information of social media users.[107] In 2012, Facebook purchased information on 70 million households from a third-party company called Datalogix. Facebook later revealed that it purchased the information in order to create a more efficient advertising service.[108]
The line between pay-per-click advertising and paid inclusion is often debated. Some have lobbied for any paid listings to be labeled as advertisements, while defenders insist they are not actually ads, since the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In the general case, one has no control over when a page will be crawled or added to a search engine index. Paid inclusion proves particularly useful where pages are dynamically generated and frequently modified.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
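The keyword meta tag described above is ordinary HTML metadata, invisible to visitors but readable by crawlers; the stuffing problem arose precisely because nothing forced its contents to match the visible page. An illustrative sketch (keyword values invented):

```html
<head>
  <!-- Honest use: keywords describe the page's actual content,
       so early engines could index it usefully. -->
  <meta name="keywords" content="coffee, pour-over, brewing guide">

  <!-- Abusive "keyword stuffing": repeated and irrelevant popular terms
       added solely to rank for unrelated searches, with no visible trace
       on the rendered page. -->
  <meta name="keywords" content="free, free, free, cheap flights,
        celebrity news, coffee">
</head>
```

Because the tag's claims could not be verified against the page, engines eventually stopped trusting it, which is the shift toward content- and link-based ranking the paragraph describes.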