Yelp consists of a comprehensive online index of business profiles. Businesses are searchable by location, similar to Yellow Pages. The website is operational in seven different countries, including the United States and Canada. Business account holders are allowed to create, share, and edit business profiles. They may post information such as the business location, contact information, pictures, and service information. The website also allows individuals to write and post reviews about businesses and to rate them on a five-point scale. Messaging and talk features are additionally available to general members of the website, serving to guide thoughts and opinions.[49]
In 2013, the Tenth Circuit Court of Appeals held in Lens.com, Inc. v. 1-800 Contacts, Inc. that online contact lens seller Lens.com did not commit trademark infringement when it purchased search advertisements using competitor 1-800 Contacts' federally registered 1800 CONTACTS trademark as a keyword. In August 2016, the Federal Trade Commission filed an administrative complaint against 1-800 Contacts alleging, among other things, that its trademark enforcement practices in the search engine marketing space have unreasonably restrained competition in violation of the FTC Act. 1-800 Contacts has denied all wrongdoing and is scheduled to appear before an FTC administrative law judge in April 2017.[29]
Blogging website Tumblr first launched ad products on May 29, 2012.[69] Rather than relying on simple banner ads, Tumblr requires advertisers to create a Tumblr blog so the content of those blogs can be featured on the site.[70] Within a year, Tumblr had introduced four native ad formats on web and mobile and had more than 100 brands advertising on the site, with 500 cumulative sponsored posts.
The code of ethics that is affiliated with traditional marketing can also be applied to social media. However, because social media is so personal and international, there is an additional set of complications and challenges that come with being ethical online. With the invention of social media, the marketer no longer has to focus solely on the basic demographics and psychographics provided by television and magazines, but can now see what consumers like to hear from advertisers, how they engage online, and what their needs and wants are.[101] The general concept of being ethical while marketing on social network sites is to be honest with the intentions of the campaign, avoid false advertising, be aware of user privacy conditions (which means not using consumers' private information for gain), respect the dignity of persons in the shared online community, and take responsibility for any mistakes or mishaps that result from your marketing campaign.[102] Most social network marketers use websites like Facebook and MySpace to try to drive traffic to another website.[103] While it is ethical to use social networking websites to spread a message to people who are genuinely interested, many people game the system with auto-friend-adding programs and spam messages and bulletins. Social networking websites are becoming wise to these practices, however, and are effectively weeding out and banning offenders.

Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to understand your images.
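As a rough illustration (the file names, paths, and alt text below are invented for this example), an image used as a link might look like the following in HTML, with the alt attribute playing a role similar to a text link's anchor text:

    <!-- Image used as a link: the alt text is treated much like anchor text -->
    <a href="/puppies/">
      <img src="/images/golden-retriever-puppy.jpg"
           alt="Golden retriever puppy playing in the yard">
    </a>

    <!-- Less helpful: a generic filename and empty alt text give a search
         engine little information about the image or the link target -->
    <a href="/puppies/">
      <img src="/images/IMG_0001.jpg" alt="">
    </a>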
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable examples are China, Japan, South Korea, Russia, and the Czech Republic, where Baidu, Yahoo! Japan, Naver, Yandex, and Seznam, respectively, are the market leaders.

You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
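As a minimal sketch (the hostnames and paths below are purely illustrative), a robots.txt file placed at the root of the relevant host tells compliant crawlers which paths not to crawl:

    # https://example.com/robots.txt
    User-agent: *
    Disallow: /search-results/
    Disallow: /tmp/

    # A subdomain such as news.example.com is a separate host and needs its
    # own file at https://news.example.com/robots.txt

Keep in mind that robots.txt only asks crawlers not to fetch the listed paths; it is not an access-control mechanism.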
WhatsApp was founded by Jan Koum and Brian Acton. WhatsApp joined Facebook in 2014 but continues to operate as a separate app with a laser focus on building a messaging service that works fast and reliably anywhere in the world. WhatsApp started as an alternative to SMS and now supports sending and receiving a variety of media, including text, photos, videos, documents, and location, as well as voice calls. WhatsApp messages and calls are secured with end-to-end encryption, meaning that no third party, including WhatsApp, can read or listen to them. WhatsApp has a customer base of 1 billion people in over 180 countries.[46][47] It is used to send personalised promotional messages to individual customers. It has several advantages over SMS, including the ability to track how a message broadcast performs using the blue tick option in WhatsApp, and it allows sending messages to Do Not Disturb (DND) customers. WhatsApp is also used to send a series of bulk messages to targeted customers using the broadcast option. Companies have adopted this to a large extent because it is a cost-effective promotional option and quick to spread a message. Still, WhatsApp does not allow businesses to place ads in its app.[48]

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
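As an illustration (the page title and terms are invented for this example), the keyword meta tag described above sits in a page's HTML head:

    <head>
      <title>Acme Contact Lenses</title>
      <!-- Keywords meta tag that early engines consulted when indexing -->
      <meta name="keywords" content="contact lenses, daily lenses, discount lenses">
      <!-- "Keyword stuffing" meant padding tags like this one (or the visible
           page text) with excessive or irrelevant terms to manipulate rankings -->
    </head>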