Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
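As an illustrative sketch (the helper function and the link list below are invented for this example, not part of any particular framework), a custom 404 page could be rendered so that it always links back to the root page and optionally to popular content:

```python
# Hypothetical sketch: render a custom 404 page that guides the user back
# to working pages (a link to the root page, plus optional popular pages).
def render_404(popular_links=None):
    links = popular_links or []
    items = "".join(f'<li><a href="{url}">{title}</a></li>' for url, title in links)
    return (
        "<h1>Page not found</h1>"
        '<p>Try the <a href="/">home page</a>'
        + (f" or one of these pages:</p><ul>{items}</ul>" if items else ".</p>")
    )

print(render_404([("/blog", "Blog"), ("/contact", "Contact")]))
```

Whatever generates the page, the key property is the same: the 404 response offers the user a working way forward rather than a dead end.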

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
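Structured data is commonly written as JSON-LD using schema.org vocabulary. A minimal sketch (the product and its values are invented for illustration) can be built and serialized like this:

```python
import json

# Minimal schema.org JSON-LD sketch for a product page (values are invented).
structured_data = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Hand-painted coffee mug",
    "description": "A 350 ml ceramic mug, painted by hand.",
    "offers": {"@type": "Offer", "price": "14.99", "priceCurrency": "EUR"},
}

# This JSON is typically embedded in the page inside a
# <script type="application/ld+json"> element.
print(json.dumps(structured_data, indent=2))
```

The `@context` and `@type` keys tell the search engine which vocabulary and entity type the rest of the properties describe.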


Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
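As a small illustrative sketch (the helper and its values are hypothetical), an image used as a link carries its description in the `alt` attribute, which search engines can treat much like the anchor text of a text link:

```python
# Hypothetical helper: emit an image link with descriptive alt text,
# so crawlers can treat the alt text like anchor text.
def image_link(href: str, src: str, alt: str) -> str:
    return f'<a href="{href}"><img src="{src}" alt="{alt}"></a>'

print(image_link("/puppies", "/img/dalmatian.jpg", "Dalmatian puppy playing fetch"))
```

A descriptive filename (`dalmatian.jpg` rather than `IMG_0042.jpg`) and specific alt text together give image search more to work with.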
Traditional advertising techniques include print and television advertising. The Internet has already overtaken television as the largest advertising market.[90] Websites often include banner or pop-up ads. Social networking sites don't always have ads; in exchange, products have entire pages and are able to interact with users. Television commercials often end with a spokesperson asking viewers to check out the product website for more information. Print ads briefly popularized QR codes, which can be scanned by cell phones and computers to send viewers to the product website. Advertising is beginning to move viewers from the traditional outlets to the electronic ones.[citation needed]
The world is mobile today. Most people are searching on Google using a mobile device, and the desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
Look at your short- and long-term goals to choose whether to focus on organic or paid search (or both). It takes time to improve your organic search rankings, but you can launch a paid search campaign tomorrow. However, there are other considerations: the amount of traffic you need, your budget, and your marketing objectives. Once you’ve reviewed the pros and cons, you can select the search strategy that’s right for you.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
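As a sketch of why robots.txt is only advisory, Python's standard-library parser models what a well-behaved crawler does with the file before fetching a URL; a non-compliant crawler simply never performs this check (the `Disallow` rule and URLs below are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# An invented robots.txt rule for illustration.
robots_txt = """User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler consults the parsed rules before fetching a URL.
# Nothing here stops a rogue crawler (or any browser) from requesting the page,
# and the rule itself advertises which directory you consider sensitive.
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # prints False
print(rp.can_fetch("*", "https://example.com/public/index.html"))    # prints True
```

The check happens entirely on the crawler's side, which is exactly why sensitive content needs server-side protection (authentication or noindex directives) rather than a robots.txt entry.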
Disney/Pixar's Monsters University: Created a Tumblr account, MUGrumblr, saying that the account is maintained by a 'Monstropolis transplant' and 'self-diagnosed coffee addict' who is currently a sophomore at Monsters University.[73] A "student" from Monsters University uploaded memes, animated GIFs, and Instagram-like photos that are related to the sequel movie.
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of pages of the site that are indexed by search engines (saturation) and how many backlinks the site has (popularity). Improving them requires pages to contain the keywords people are searching for and to rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a web agency, or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index, and understand your content.
The line between pay-per-click advertising and paid inclusion is often debatable. Some have lobbied for any paid listings to be labeled as advertisements, while defenders insist they are not actually ads, since webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In the general case, one has no control over when a page will be crawled or added to a search engine index. Paid inclusion proves particularly useful for cases where pages are dynamically generated and frequently modified.
In 2012, during Hurricane Sandy, Gap sent out a tweet telling its followers to stay safe but encouraging them to shop online, offering free shipping. The tweet was deemed insensitive, and Gap eventually took it down and apologized.[96] Numerous other online marketing mishaps exist: a YouTube video of a Domino's Pizza employee violating health code standards went viral on the Internet and later resulted in felony charges against two employees;[93][97] a Twitter hashtag posted by McDonald's in 2012 attracted attention due to numerous complaints and negative experiences customers had at the chain's stores; and a 2011 tweet posted by a Chrysler Group employee claimed that no one in Detroit knows how to drive.[98] When the Link REIT opened a Facebook page to recommend old-style restaurants, the page was flooded by furious comments criticizing the REIT for having forced a lot of restaurants and stores to shut down; it had to terminate its campaign early amid further deterioration of its corporate image.[99]

Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users recognize an article's expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.
In May 2014, Instagram had over 200 million users. The user engagement rate of Instagram was 15 times higher than that of Facebook and 25 times higher than that of Twitter.[50] According to Scott Galloway, the founder of L2 and a professor of marketing at New York University's Stern School of Business, the latest studies estimate that 93% of prestige brands have an active presence on Instagram and include it in their marketing mix.[51] When it comes to brands and businesses, Instagram's goal is to help companies reach their respective audiences through captivating imagery in a rich, visual environment.[52] Moreover, Instagram provides a platform where users and companies can communicate publicly and directly, making it an ideal platform for companies to connect with their current and potential customers.[53]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings from this practice. However, Google implemented a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] With regard to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content from 'trusted' authors.