These posts can be one or more of the following: images, photo sets, animated GIFs, video, audio, and text posts. To help users differentiate promoted posts from regular users' posts, promoted posts are marked with a dollar symbol in the corner. On May 6, 2014, Tumblr announced customization and theming on mobile apps for brands to advertise.[72]
Reddit, and similar social media platforms such as StumbleUpon or Digg, are ideal for sharing compelling content. With over 2 billion page views a month, Reddit has incredible social media marketing potential, but marketers should be warned that only truly unique, interesting content will be welcomed. Posting on Reddit is playing with fire: submit spammy or overtly sales-focused content and your business could get berated by this extremely tech-savvy community.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
Social media marketing involves the use of social networks, consumers' online brand-related activities (COBRA) and electronic word of mouth (eWOM)[75][76] to successfully advertise online. Social networks such as Facebook and Twitter provide advertisers with information about the likes and dislikes of their consumers.[61] This technique is crucial, as it provides businesses with a "target audience".[61] With social networks, information relevant to the user's likes is available to businesses, which then advertise accordingly. Activities such as uploading a picture of your "new Converse sneakers" to Facebook[75] are an example of a COBRA.[75][76] Electronic recommendations and appraisals are a convenient way to have a product promoted via "consumer-to-consumer interactions".[75] An example of eWOM would be an online hotel review;[77] the hotel company can have two possible outcomes based on its service. Good service would result in a positive review, which earns the hotel free advertising via social media; poor service, however, would result in a negative consumer review, which can potentially harm the company's reputation.[78]
There's no denying that much of social media marketing is a matter of trial and error. Monitoring the metrics behind your campaigns in real time allows you to make small tweaks to your social media marketing strategy rather than sweeping, time-consuming changes. This dynamic approach to marketing makes perfect sense in a day and age when social media is constantly evolving.
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Google considered both off-page factors (such as PageRank and hyperlink analysis) and on-page factors (such as keyword frequency, meta tags, headings, links, and site structure), enabling it to avoid the kind of manipulation seen in search engines that considered only on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
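The intuition behind PageRank can be shown with a short power-iteration sketch. This is not Google's actual implementation; the link graph, damping factor, and iteration count below are illustrative assumptions:

```python
# Minimal PageRank sketch via power iteration (illustrative only;
# the production algorithm combines many more signals).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A tiny hypothetical link graph: a page gains rank when
# other well-ranked pages link to it.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(graph))
```

Because rank flows along hyperlinks, fabricating large numbers of inbound links directly inflates a page's score, which is exactly the weakness that link farms exploited.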
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.[29]
Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure. Make sure all of the pages on your site are reachable through links, and that they don't require an internal "search" functionality to be found. Link to related pages, where appropriate, to allow users to discover similar content.
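One simple way to act on this advice is to walk your internal link graph from the root page and flag pages that no link path reaches. The sketch below assumes a hypothetical site whose internal links have already been extracted from its HTML:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
site_links = {
    "/": ["/products", "/about"],
    "/products": ["/products/widgets", "/products/gadgets"],
    "/products/widgets": [],
    "/products/gadgets": [],
    "/about": [],
    "/old-landing-page": [],  # no inbound links anywhere
}

def find_orphans(links, root="/"):
    """Breadth-first search from the root; pages never visited are orphans."""
    seen, queue = {root}, deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return [page for page in links if page not in seen]

print(find_orphans(site_links))  # ['/old-landing-page']
```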
Blogging website Tumblr first launched ad products on May 29, 2012.[69] Rather than relying on simple banner ads, Tumblr requires advertisers to create a Tumblr blog so the content of those blogs can be featured on the site.[70] Within a year, four native ad formats had been created on web and mobile, and more than 100 brands were advertising on Tumblr with 500 cumulative sponsored posts.

Often the line between pay-per-click advertising and paid inclusion is debatable. Some have lobbied for any paid listings to be labeled as advertisements, while defenders insist they are not actually ads, since webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify schedules for crawling pages. In the general case, one has no control over when a page will be crawled or added to a search engine index. Paid inclusion is particularly useful where pages are dynamically generated and frequently modified.


You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
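As an illustration, a minimal robots.txt might look like the hypothetical rules below; Python's standard-library urllib.robotparser can confirm how a compliant crawler would interpret them:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block all crawlers from /private/,
# while leaving the rest of the site crawlable.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/products.html"))        # True
```

Remember that these rules apply per host, which is why each subdomain needs its own robots.txt file.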
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
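To make concrete what these early engines relied on, here is a minimal sketch, using Python's standard-library html.parser, of extracting a page's keyword meta tag; the HTML snippet is hypothetical and deliberately shows keywords that misrepresent the page's content:

```python
from html.parser import HTMLParser

# Hypothetical page whose meta keywords need not match its content;
# this unreliability pushed engines away from webmaster-supplied metadata.
page = """
<html><head>
<meta name="keywords" content="cheap flights, hotels, sneakers">
<title>A page actually about gardening</title>
</head><body>...</body></html>
"""

class MetaKeywordExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "keywords":
            self.keywords = [k.strip() for k in attrs.get("content", "").split(",")]

extractor = MetaKeywordExtractor()
extractor.feed(page)
print(extractor.keywords)  # ['cheap flights', 'hotels', 'sneakers']
```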