The code of ethics that applies to traditional marketing can also be applied to social media. However, because social media is so personal and international, it brings an additional set of complications and challenges for behaving ethically online. With the advent of social media, marketers no longer have to rely solely on the basic demographics and psychographics gathered from television and magazines; they can now see what consumers like to hear from advertisers, how they engage online, and what their needs and wants are.[101] The general principles of ethical marketing on social network sites are to be honest about the intentions of the campaign, avoid false advertising, be aware of user privacy conditions (which means not exploiting consumers' private information for gain), respect the dignity of persons in the shared online community, and take responsibility for any mistakes or mishaps that result from the marketing campaign.[102] Most social network marketers use websites like Facebook and MySpace to try to drive traffic to another website.[103] While it is ethical to use social networking websites to spread a message to people who are genuinely interested, many people game the system with auto-friend-adding programs and spam messages and bulletins. Social networking websites are becoming wise to these practices, however, and are effectively weeding out and banning offenders.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable examples are China, Japan, South Korea, Russia, and the Czech Republic, where Baidu, Yahoo! Japan, Naver, Yandex, and Seznam, respectively, are the market leaders.
However, while bidding $1,000 on every keyword and ranking #1 for every relevant search sounds nice in theory, most businesses have to strike a balance between ranking higher and overpaying for clicks. After all, if it costs $17.56 to rank in position #1 but you can only afford to pay $5.00 per click, bidding $1,000 on a keyword to guarantee yourself the #1 position is a great way to bid yourself out of business.
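To make that arithmetic concrete, here is a minimal sketch of the break-even calculation. The 2% conversion rate and $250 order value are hypothetical figures chosen to reproduce the $5.00 ceiling above, not data from any real campaign:

```python
# A sketch of the break-even bid math described above.
# All figures (conversion rate, order value) are hypothetical.

def max_profitable_cpc(conversion_rate: float, value_per_conversion: float) -> float:
    """Highest cost-per-click that still breaks even."""
    return conversion_rate * value_per_conversion

# Suppose 2% of clicks convert and each conversion is worth $250:
ceiling = max_profitable_cpc(0.02, 250.00)  # $5.00
position_1_cpc = 17.56                      # the #1 bid quoted above

print(f"Break-even CPC: ${ceiling:.2f}")
if position_1_cpc > ceiling:
    print("Ranking #1 would lose money on every click.")
```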

All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
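As an illustration, the following sketch (with invented category names, not taken from the text) models that root -> related topic listing -> specific topic structure as a nested mapping and flattens it into the URL paths a visitor would follow:

```python
# A minimal sketch of the hierarchy described above, flattened
# into navigable URL paths. The categories are assumed examples.

site = {
    "": {                      # root page
        "audio": {             # related-topic listing
            "headphones": {},  # specific-topic pages
            "speakers": {},
        },
        "video": {
            "cameras": {},
        },
    },
}

def urls(tree, prefix=""):
    """Yield every navigable path in the hierarchy."""
    for segment, children in tree.items():
        path = f"{prefix}/{segment}".rstrip("/") or "/"
        yield path
        yield from urls(children, "" if path == "/" else path)

for u in urls(site):
    print(u)
# /
# /audio
# /audio/headphones
# /audio/speakers
# /video
# /video/cameras
```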
Facebook pages are far more detailed than Twitter accounts. They allow a product to provide videos, photos, longer descriptions, and testimonials, and followers can comment on the product pages for others to see. Facebook can link back to the product's Twitter page, as well as send out event reminders. As of May 2015, 93% of business marketers use Facebook to promote their brand.[36] A study from 2011 attributed 84% of "engagement" (clicks and likes) to Facebook advertising.[37] By 2014, Facebook had restricted the content published from business and brand pages. Adjustments in Facebook's algorithms reduced the audience reached by non-paying business pages (those with at least 500,000 "Likes") from 16% in 2012 to 2% in February 2014.[38][39][40]
There are many reasons why advertisers choose the SEM strategy. First, creating an SEM account is easy and can build traffic quickly, depending on the degree of competition. Shoppers who use a search engine to find information tend to trust and focus on the links shown on the results pages. However, a large number of online sellers do not invest in search engine optimization to obtain higher rankings in organic search results, but prefer paid links. A growing number of online publishers are allowing search engines such as Google to crawl content on their pages and place relevant ads on it.[16] From an online seller's point of view, this is an extension of the payment settlement and an additional incentive to invest in paid advertising projects. As a result, it is virtually impossible for advertisers with limited budgets to maintain the highest rankings in the increasingly competitive search market.
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow causes the PageRank the link would have passed to evaporate rather than flow elsewhere. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additional suggested solutions include the use of iframes, Flash, and JavaScript.[32]
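The following sketch illustrates the distinction between these link styles. The classifier is a deliberately naive stand-in, not Google's actual logic: a plain link passes PageRank, a nofollowed link lets it evaporate, and a JavaScript-obfuscated "link" hides the target URL from a simple crawler entirely:

```python
# Illustrative only: three link styles and how a naive crawler
# might classify them. Not any real search engine's code.

links = {
    # Ordinary link: passes PageRank to the target.
    "plain": '<a href="https://example.com/page">page</a>',
    # Nofollow link: after the 2009 change, the PageRank it would
    # have carried simply evaporates rather than flowing elsewhere.
    "nofollow": '<a href="https://example.com/page" rel="nofollow">page</a>',
    # Obfuscated JavaScript: no <a href> for the crawler to see,
    # so no outbound link is counted in the first place.
    "obfuscated": '<span onclick="window.location=atob(\'aHR0cHM6Ly9leGFtcGxlLmNvbS9wYWdl\')">page</span>',
}

def passes_pagerank(html: str) -> bool:
    """Naive check: only a plain <a href> without rel="nofollow" passes."""
    return "<a href" in html and 'rel="nofollow"' not in html

for name, html in links.items():
    print(name, passes_pagerank(html))
# plain True / nofollow False / obfuscated False
```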
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
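To see why this was so easy to game, here is a minimal sketch of a metadata-trusting indexer that accepts the keywords meta tag at face value, as those early engines did (illustrative only; not the code of any real engine):

```python
# A sketch of an indexer that trusts webmaster-declared keywords,
# the weakness described in the paragraph above.

from html.parser import HTMLParser

class MetaKeywordExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            # Accept whatever the webmaster declared, whether or not
            # it reflects the page's actual content.
            self.keywords += [k.strip() for k in attrs.get("content", "").split(",")]

page = '<html><head><meta name="keywords" content="cheap flights, hotels, cars"></head></html>'
parser = MetaKeywordExtractor()
parser.feed(page)
print(parser.keywords)  # ['cheap flights', 'hotels', 'cars']
```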