In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random web surfer.
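The random-surfer model described above can be sketched as a power iteration over a link graph. This is a minimal illustrative implementation, not Google's actual algorithm or parameters; the tiny example graph and the damping factor of 0.85 are assumptions chosen for demonstration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank by power iteration.

    links: dict mapping each page to the list of pages it links to.
    damping: probability the surfer follows a link rather than
             jumping to a random page (0.85 is a common choice).
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # A dangling page spreads its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outbound link passes an equal share of this page's rank.
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Illustrative three-page web: C is linked to by both A and B,
# so it ends up with the highest rank.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

Because the ranks form a probability distribution over pages, they sum to 1, and a page with more (and stronger) inbound links, such as C here, receives a larger share.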
Another excellent guide is Google’s “Search Engine Optimization Starter Guide,” a free PDF download that covers the basic tips Google provides to its own employees on how to get listed. Also well worth checking out are Moz’s “Beginner’s Guide to SEO” and the SEO Success Pyramid from Small Business Search Marketing.
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, which includes spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.
More than three billion people in the world are active on the Internet. Over the years, the Internet has continually gained more and more users, jumping from 738 million in 2000 all the way to 3.2 billion in 2015. Roughly 81% of the current population in the United States has some type of social media profile that they engage with frequently. Mobile phone usage is beneficial for social media marketing because phones' web browsing capabilities allow individuals immediate access to social networking sites. Mobile phones have altered the path-to-purchase process by allowing consumers to easily obtain pricing and product information in real time. They have also allowed companies to constantly remind and update their followers. Many companies are now putting QR (Quick Response) codes along with products for individuals to access the company website or online services with their smartphones. Retailers use QR codes to facilitate consumer interaction with brands by linking the code to brand websites, promotions, product information, and other mobile-enabled content. In addition, the use of real-time bidding in the mobile advertising industry is high and rising because of its value for on-the-go web browsing. In 2012, Nexage, a provider of real-time bidding in mobile advertising, reported a 37% month-over-month increase in revenue. Adfonic, another mobile advertisement publishing platform, reported an increase of 22 billion ad requests that same year.
LinkedIn, a professional business-related networking site, allows companies to create professional profiles for themselves as well as their business to network and meet others. Through the use of widgets, members can promote their various social networking activities, such as a Twitter stream or blog entries about their product pages, on their LinkedIn profile page. LinkedIn provides its members the opportunity to generate sales leads and business partners. Members can use "Company Pages", similar to Facebook pages, to create an area that allows business owners to promote their products or services and interact with their customers. Due to the spread of spam mail sent to job seekers, leading companies prefer to use LinkedIn for recruitment rather than a separate job portal. Additionally, companies have voiced a preference for the amount of information that can be gleaned from a LinkedIn profile, versus a limited email.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.