What is a Search Engine? A search engine is a web-based tool that enables users to locate information on the World Wide Web. A search engine is a coordinated set of programs that includes:
- A "spider" (also called a "crawler" or a "bot") that visits every page, or representative pages, on every website that wants to be searchable and reads it, using the hypertext links on each page to discover and read the site's other pages
- A program that creates a huge index (sometimes called a "catalog") from the pages that have been read
- A program that receives your search request, compares it to the entries in the index, and returns results to you
Some well-known and popular search engines are Google, Yahoo, MSN, and Bing.
Crawler-based search engines
Create their listings automatically by using a piece of software to "crawl" or "spider" the web and then index what it finds to build the search base. Examples: Google, AllTheWeb and AltaVista.
Human-powered directories
Depend on human editors to create their listings. Typically, webmasters submit a short description of their websites to the directory, or editors write one for the sites they review, and these manually edited descriptions form the search base. Examples: LookSmart, Open Directory.
Meta-search engines
Transmit user-supplied keywords simultaneously to several individual search engines to actually carry out the search. Search results returned from all the search engines can be integrated, duplicates can be eliminated, and additional features such as clustering by subject within the search results can be implemented by meta-search engines. Examples: Dogpile, Mamma, and Metacrawler.
Please refer to the following link for more information: http://pwebs.net/2011/04/search-engines-list/
There are three basic stages for a search engine:
Crawling - where content is discovered;
Indexing - where it is analysed and stored in huge databases
Retrieval - where a user query fetches a list of relevant pages
A spider (also called a robot or a crawler) is a program that follows, or "crawls", links throughout the Internet, grabbing content from sites and adding it to search engine indexes. Spiders can only follow links from one page to another and from one site to another. Links to your website from other websites give the search engine spiders more "food" to chew on. Spiders find web pages by following links from other web pages, but users can also submit web pages directly to a search engine or directory and request a visit by their spiders.
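To make the link-following idea concrete, here is a minimal sketch in Python of the link-extraction step a spider performs on each page it reads. It uses only the standard library; the page content and URLs are invented for the example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag -- the links a spider would queue next."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A made-up page fragment with two links, one internal and one external.
page = '<html><body><a href="/about.html">About</a> <a href="http://example.com/">Home</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about.html', 'http://example.com/']
```

A real crawler would fetch each of these URLs in turn, extract their links the same way, and keep going, which is how one seed page leads a spider to an entire site.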
Each search engine has developed a formula for ranking web sites. Every search engine uses a different formula, which is why the results will be different between Google, Yahoo, Bing, Ask.com and other search engines. These formulas, known as algorithms, are constantly changing as the search engines strive to achieve improved results.
A search engine looks for signals to rank a page, so it is necessary to tweak the content of your pages to give all the right signals to the search engines, and to understand keyword research to drive traffic to your website. The amount of information on the World Wide Web is immense, and you are at any time competing with thousands of websites for attention and for the ranking of your website. The process of ranking websites is complicated, algorithm-driven, and getting more and more difficult for SEO professionals.
Search engines do this by analyzing words and other content on web pages, placing special emphasis on words that appear on specific locations on the web page: the title, headlines, image attributes, overall content emphasis, outbound and inbound links, etc.
Every search engine has a different algorithm and method to rank the websites that their spiders visit. One of the parameters could be your geo-location: if you search from anywhere in India, you could end up getting results similar to those of other users in India.
Search engines have begun to give a lot of weightage to your presence on Social Media. The Social Media signal could be your posts on Facebook or LinkedIn, or there could be someone who has added your website/blog as their favourite on their FB page, or blog, or added your website's link in their Tweet. These signals have found a lot of credibility with search engines today.
Relevancy and search results
When a searcher types some 'keywords' in the search engine for what s/he is looking for, the search engine's algorithm matches the words in its vast database based on the relevance of the keywords. It may be a word, or a combination of words or phrases, that are relevant. Based on these algorithms, the search engine then gives the results of the search. However, it is the algorithm that does the work, and one may not always get the exact page one is looking for. Studies also show that most people who search for something via a search engine don't go past the first page of search results.
More clicks mean more page views, more page exposure, more revenue, and a greater recognition of authority in whatever field the site might be placed in. Obviously, getting a front page search result is an optimal target for anyone who is looking to get their product, application, or website in front of the people who are interested in it.
Google doesn't let people know what those factors are; however, through a combination of research, testing and experience, a good Google SEO consultant knows what the most important factors are. For example, most SEOs would agree that the following are all important ranking factors.
Keyword usage: The placement of your keywords in the website, not only in your meta tags, but also in the website content.
Site structure: Is your website well structured, does it have seamless navigation, does it have broken links leading to missing pages? These are factors that could also affect your website ranking.
Site speed: You need to have optimized graphics, images and videos if you have any on your website. Heavy graphics could lead to very poor download times, negatively affecting your ranking.
Time spent on site: Is your website/webpage attractive enough? Research shows that a user / visitor may decide within half a second whether to remain on the website or click away. Second to attractiveness is the content. If the user finds that the content is not what s/he expected, the user will simply go to the next site. When users spend more time on your website, your ranking could definitely go up.
Number of inbound links
How many 'other' websites/pages/blogs have links that direct their users to your website? These are called backlinks, and this is one of the important factors affecting your website ranking. Quality of inbound links: Is the quality of the backlink good? Meaning, is the website that is linking back to your website relevant to the subject of your website?
What is Keyword research? Keyword research is the process of identifying keywords that get a good amount of search volume per month. The idea behind performing this research is to find the keywords that can give maximum traffic and come with less competition.
SEO is the process by which an SEO specialist, a webmaster, or a web designer takes certain steps to improve the 'page rank' of a website so that it appears on the first page, or in the first 10 ranks, of the search engine results. Search engines keep constantly evolving, so the SEO specialist has to continuously stay updated on the changes that the search engines make in their algorithms. However, most of the time it is more a matter of guesswork, as the search engines do not reveal what exactly they change in their algorithms.

What is the importance of search for websites? The best way for people to find any content or a website is by using search engines. People do this by typing a few words or phrases in the search bar and getting the results on the Search Engine Results Page, or SERP. The difficult part for the SEO specialist is getting into the top ranking of the search results. A study shows that the 1st rank on the SERP gets around 31% of clicks, which goes down to 3% by the 10th position. On the second page, the results get hardly 1-2% of clicks.
The perfectly optimized website will have the right blend of keyword usage with content that flows naturally. The website will also have a lot of natural backlinks. These are links to your website from other websites. The search engine uses this information to determine which sites get ranked the highest.
In the past, the following processes were popular with SEO professionals:
1. Do a lot of keyword research to explore phrases that are high in demand and low in competition.
2. Make pages with highly optimized keyword phrases.
3. Get / ask people to create a lot of backlinks, use anchors on pages, etc.
These steps and processes did work in the past. However, Google constantly changes its algorithm through what it calls Panda and Penguin updates; hence these methods, though not completely redundant, hold very little value.
Three very important terms to understand in SEO especially in terms of Google rankings are:
White Hat SEO - approved strategies for getting your page to rank well. Google offers guidelines to webmasters which spell out approved SEO strategies.
Black Hat SEO - these are the "loopholes" that Google is actively seeking out and penalizing. They include a whole range of strategies, from on-page keyword stuffing to backlink blasts using software to generate tens of thousands of backlinks to a webpage.
Grey Hat SEO - Strategies that lie between the two extremes. These are strategies that Google does not approve of, but are less likely to get your site penalized than black hat. Grey hat tactics are certainly riskier than white hat SEO, but not as risky as black hat.
1. Quality of Content: Search engines, especially Google, look for high-quality content that is relevant to a search query or to phrases that are searched often. Longer pages with quality content, pictures and videos help retain visitors' interest and make them stay on the page for longer periods of time.
2. Page Loading Time: Long download times due to heavy graphics are a big negative. Visitors will NOT wait on your page, especially those who have come in through a random search. This is where the terms 'bounce' and 'exit' come into play. People will simply click away and not come back, leading to higher bounce rates.
3. Internal Links from Other Pages on the Site: All links in a website leading to pages within the website are called internal links, as explained before. These help the visitor navigate better within the website and find the information they are looking for quickly.
4. Bounce Rates: A "bounce" happens when a visitor clicks a link on the SERPs and then returns to Google. The quicker the click-away, the worse it is for your website, as it informs Google that the visitor was not satisfied. If more such instances occur, your 'Google' reputation suffers. It tells Google that the visitor did not find relevant information. Higher bounce rates correlate inversely with relevance.
5. Time a Visitor Stays on Your Page / Site: The time that visitors stay on web pages is monitored by Google, mainly through the Google Analytics platform. Google Analytics is a free web analytics service for site owners, which tracks and reports your website traffic, allowing Google to accurately track the traffic on your website. Google Analytics tracks how the visitor arrived at your website, the time spent, the number of pages visited, the operating system, and also the device the visitors use to visit your page.
1. Click-through Rates (CTR): Webmasters/SEO specialists should pay a lot of attention to this factor. It is necessary that the content of the page matches the ranking it has obtained. Suppose, by using a loophole in Google's algorithm, a developer is able to get to the 1st rank and yet does not get the desired number of clicks; Google then starts moving the page down the order. Conversely, if a lower-ranked page has content that is more desirable than the one at an upper rank, Google moves it upwards.
2. Social Signals: Google has started recognizing your presence on social media. If your website link is shared, or you have a good presence and content on social media with a good number of visitors and followers, your social media page could be ranked higher than even your website. However, social signals are recognized only to a limited extent. One note: many developers ignore Google+. Remember, after all, it belongs to Google and will tend to be recognized by Google better and sooner than other social media.
3. Backlinks: Just because a lot of other websites and pages have a backlink to your page/website does not necessarily mean that your page will be ranked higher. The Google algorithm looks at the quality of the pages that have a backlink to your page. So if a CNN, NDTV, or ZEE page has a link to your site, it will carry more authority, whereas even if hundreds of low-quality websites provide a link to your page, it may actually be considered spamming or manipulation, and you might not be ranked at all.
A seasoned SEO professional says in his book SEO 2016 and Beyond: "I have been doing SEO for over 10 years now and have always concentrated on long term strategies. That's not to say I haven't dabbled in black hat SEO, because I have, a little. Over the years, I have done a lot of experiments on all kinds of ranking factors. However, and without exception, all of the sites I promoted with black hat SEO have been penalized; every single one of them. In this book, I don't want to talk about the murkier SEO strategies that will eventually cause you problems, so I'll concentrate on the safer techniques of white hat SEO."
He divides SEO into four main pillars. These are:
1. Quality content
2. Site organization
4. What's in it for the visitor?
These are the four areas where you need to concentrate your efforts, so let's now have a look at each of them in turn.

The algorithm is designed and set up by humans; however, the rankings given to websites are wholly determined by the outcome of the algorithm.
There is no manual intervention by humans to adjust the rankings specific websites are given by the algorithm. The website ranked in 1st place is the website that the algorithm has given the best score to when taking into account the 200+ factors.
Google is constantly reviewing, adjusting and updating its search results, so a website that is ranked 1st today could potentially not even be on the 1st page next week.
There is no magic button that an SEO can press that guarantees a no. 1 ranking; however, by paying attention to the factors that the algorithm places value on, and actively working to improve them, they can reliably improve your website's rankings in Google.
Keyword Planner is a free AdWords tool for new or experienced advertisers that's like a workshop for building new Search Network campaigns or expanding existing ones. You can search for keyword and ad group ideas, see how a list of keywords might perform, and even create a new keyword list by multiplying several lists of keywords together. Keyword Planner can also help you choose competitive bids and budgets to use with your campaigns.

Benefits: You can use Keyword Planner to accomplish the following tasks:

Research keywords: Need help finding keywords to add to a new campaign, or additional keywords to add to an existing one? You can search for keyword and ad group ideas based on terms that are relevant to your product or service, your landing page, or different product categories.

Get historical statistics and traffic forecasts: Use statistics like search volume to help you decide which keywords to use for a new or existing campaign. Get forecasts, like predicted clicks and estimated conversions, to get an idea of how a list of keywords might perform for a given bid and budget. These forecasts can also help guide your decision on which bids and budgets to set.

It's important to keep in mind that while Keyword Planner can provide some great keyword ideas and traffic forecasts, campaign performance depends on a variety of factors. For example, your bid, budget, product, and customer behavior in your industry can all influence the success of your campaigns.
From an SEO point of view, a site's URL structure should be:
Straightforward: URLs with duplicate content should have canonical (preferred) URLs specified for them; there should be no confusing redirects on the site, etc.
Meaningful: URL names should have keywords in them, not gibberish numbers and punctuation marks.
With an emphasis on the right URLs: SEO-wise, not all URLs on a site are of equal importance as a rule. Some should even be concealed from the search engines. At the same time, it is important to check that the pages that ought to be accessible to the search engines are actually open for crawling and indexing.
So, here is what one can do to achieve an SEO-friendly site URL structure:
1. Consolidate www & non-www Domain Versions: As a rule, there are two major versions of your domain indexed in the search engines, the www and the non-www version of it. These can be consolidated in more than one way, but the most widely accepted practice is this: most SEOs use a 301 redirect to point one version of their site to the other.
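As one common way to set this up, on an Apache server the www-to-non-www 301 redirect is often placed in the site's .htaccess file. A typical sketch, assuming mod_rewrite is enabled and example.com stands in for your domain:

```apache
RewriteEngine On
# Permanently (301) redirect www.example.com/* to example.com/*
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

The same consolidation can equally be done the other way round (non-www to www); what matters is that one version 301-redirects to the other.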
2. Avoid Dynamic & Relative URLs: Depending on your content management system, the URLs it generates may be "pretty", like this one: www.example.com/topic-name, or "ugly", like this one: www.example.com/?p=578544. Google recommends using hyphens (-) instead of underscores (_) in URL names, since a phrase in which the words are connected using underscores is treated by Google as one single word; e.g., one_single_word is onesingleword to Google.
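A simple way to generate "pretty", hyphen-separated URL slugs from page titles can be sketched in Python; the title below is invented for the example:

```python
import re

def slugify(title):
    """Turn a page title into an SEO-friendly URL slug:
    lowercase, hyphen-separated, with punctuation stripped."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")                   # drop any leading/trailing hyphen

print(slugify("10 Best Buy-to-Let Mortgage Deals!"))  # 10-best-buy-to-let-mortgage-deals
```

Note that this also converts underscores to hyphens, which matches the Google recommendation above.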
3. Create an XML Sitemap: An XML sitemap is not to be confused with the HTML sitemap. The former is for the search engines, while the latter is mostly designed for human users. What is an XML sitemap? In plain words, it's a list of your site's URLs that you submit to the search engines. This helps search engines find your site's pages more easily.
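A minimal XML sitemap looks something like the following; the URLs and dates are placeholders, and only the <loc> element is required for each entry:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/topic-name</loc>
  </url>
</urlset>
```

The file is usually saved as sitemap.xml at the site root and submitted to the search engines through their webmaster tools.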
4. Close Off Irrelevant Pages with robots.txt: There may be pages on your site that should be concealed from the search engines. These could be your "Terms and Conditions" page, pages with sensitive information, etc. It is better not to let these get indexed, since they usually don't contain your target keywords and only dilute the semantic whole of your site. The robots.txt file contains instructions for the search engines as to which pages of your site should be ignored during the crawl. Pages can additionally be given a noindex attribute so that they do not show up in the search results.
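A minimal robots.txt might look like this; the paths are placeholders, and the file lives at the root of the site (e.g. www.example.com/robots.txt):

```
User-agent: *
Disallow: /terms-and-conditions/
Disallow: /private/

Sitemap: http://www.example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all crawlers, and the optional `Sitemap:` line points them at the XML sitemap described above.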
5. Specify Canonical URLs Using a Special Tag: Another way to highlight canonical URLs on your site is by using the so-called canonical tag. In geek speak, it's not the tag itself that is canonical, but the tag's parameter, but we'll just call it the canonical tag by metonymy.
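The canonical tag goes in the <head> section of each duplicate page and points to the preferred URL; for example (the URL is a placeholder):

```html
<link rel="canonical" href="http://www.example.com/topic-name" />
```

With this tag in place, the search engine is told to treat www.example.com/topic-name as the preferred version, consolidating ranking signals from any duplicates onto it.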
Final Thoughts on URL Structure
Having an SEO-friendly URL structure on a site means having a URL structure that helps your site rank higher in the search results. From the point of view of web development, a particular site's architecture may seem crystal-clear and error-free, but for an SEO manager this could mean missing out on certain ranking opportunities.
Folders should be named to broadly reflect the top level categories (for example: mortgages). Sub pages should be named with keywords specific to each page (for example: 'interest-only' 'buy-to-let' or 'best-deals').
Keep URLs short by saving all relevant files within one top-level folder. Google has already been through the 'mortgages' folder and knows all the subsequent pages are mortgage-related. You don't need to keep repeating the main keyword.
Separate keywords using a hyphen rather than an underscore; Google doesn't recognise underscores as separators.
How Google Meta Tags Impact SEO
What Are Meta Tags?
Meta tags are coded into the HTML of the web page and are not directly visible when you visit a web page. To view the meta tags of a web page, right-click on the page and click on 'View Source'. Meta tags describe the content of the website using 'Keywords' and 'Description'. They tell the search engines what the website is about.
Do Meta Tags Help SEO?
Meta tags do help in page ranking, but only to some extent. Google's ranking algorithms, for example, have started ignoring some meta tags because they can be manipulated to earn a higher rank in a search engine results page.
Meta tags could look like these:
<meta name="description" content="PGDM programs at Deviprasad Goenka Management College offers various specializations designed to cater to the ever changing needs of Global Media & Advertising industry.">
<meta name="keywords" content="mba in media, media management, pgdm in media, dgmc">
There are four major types of meta tags. Some are still useful for SEO, while some have become redundant.
Meta Keywords Attribute - A number of keywords that best suit your website topic.
Title Tag - This is the title that is seen at the top of the browser window. This is a useful tag, as Google does index the title; however, Google also picks up keywords from your textual content.
Meta Description Attribute - A brief description of the page.
Meta Robots Attribute - An indication to search engine crawlers (robots or "bots") as to what they should do with the page.
Meta Robots Attribute: This tag informs the search engine whether the page should be indexed or not.
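A meta robots tag sits in the page's <head> alongside the other meta tags; for example, to keep a page out of the index while still letting the crawler follow its links:

```html
<meta name="robots" content="noindex, follow">
```

Other common values include "index, follow" (the default behaviour) and "noindex, nofollow" for pages that should be ignored entirely.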
Redirection is the process of forwarding one URL to a different URL. There are three main kinds of redirects: 301, 302, and meta refresh.
Types of Redirects
301, "Moved Permanently"—recommended for SEO
302, "Found" or "Moved Temporarily"
What is a Redirect?
A redirect is a way to send both users and search engines to a different URL from the one they originally requested. Below are descriptions of some of the commonly used types of redirects.
301 - Moved Permanently
A status code of 301 tells a client that the resource they asked for has permanently moved to a new location. The response should also include this location. It tells the client to use the new URL the next time it wants to fetch the same resource.
302 - Found
A status code of 302 tells a client that the resource they asked for has temporarily moved to a new location. The response should also include this location. It tells the client that it should carry on using the same URL to access this resource.
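The two redirect codes above are part of the HTTP standard, and their numeric values and standard reason phrases can be inspected, for instance, via Python's standard library:

```python
from http import HTTPStatus

# The two main redirect status codes and their standard reason phrases
print(HTTPStatus.MOVED_PERMANENTLY.value, HTTPStatus.MOVED_PERMANENTLY.phrase)  # 301 Moved Permanently
print(HTTPStatus.FOUND.value, HTTPStatus.FOUND.phrase)                          # 302 Found
```

For SEO, the distinction matters: a 301 tells the search engine to transfer the old URL's ranking signals to the new URL, while a 302 tells it the move is temporary and the original URL should keep its place in the index.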
Meta refreshes are a type of redirect executed on the page level rather than the server level. They are usually slower, and not a recommended SEO technique. They are most commonly associated with a five-second countdown with the text "If you are not redirected in five seconds, click here."
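A meta refresh is just an HTML tag in the page's <head>, which is why it runs at the page level rather than the server level; for example, a five-second countdown redirect (the URL is a placeholder):

```html
<meta http-equiv="refresh" content="5; url=http://www.example.com/new-page">
<p>If you are not redirected in five seconds,
   <a href="http://www.example.com/new-page">click here</a>.</p>
```

Because the browser must first download and parse the page before the countdown even starts, this is slower than a server-side 301, which is one reason it is not recommended for SEO.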
Google is continually tweaking and revising the way it indexes content.
While it does publish clues about its algorithm updates, it rarely comes clean about all of its reasons for changes. Fixing things can be tough.
Difference between Google Panda and Google Penguin
For those who deal with SEO and blogging, Google Panda and Google Penguin should not be something new. Ever since these updates were introduced, there has been a strong buzz about how to develop the perfect search engine optimization plan so that websites get their due in rankings. Moving forward, here is a close look at Panda and Penguin, the two most discussed Google updates, and the difference between the two.
What is Google Panda
Google launched Panda in February 2011, mainly as a change in its search results ranking algorithm. The main purpose of the update was to keep low-quality and low-content sites away from the top ranking results and give the actual quality sites their due.
As an obvious result, many websites with huge amounts of advertising, or those with low-quality content, saw a huge decline in rankings. Ever since Panda was launched, there have been many updates to it, over 22 in total.
What is Google Penguin
Another algorithm update from Google that gave SEO experts another blow was Penguin, launched in April 2012. The idea underlying the update was simple enough: penalize and decrease the rankings of sites that breach Google's Webmaster Guidelines.
This included lowering the search engine rankings of all those sites that practice black-hat SEO techniques like duplicate content, keyword stuffing and cloaking to name a few.