
How to make JavaScript sites SEO friendly?


With JavaScript becoming ever more integrated with HTML on the web, it has become important for SEOs to make JavaScript sites search-friendly. Here are some of the ways in which you can do that.

Whenever you run into a client-side JavaScript SEO issue, you can take the following steps:

  • Detect search-engine bots, either by looking at the URL (in case you used a #! hashbang) or by checking the user agent of the request.
  • Once you have detected a bot, route the request to your rendering engine of choice, such as PhantomJS. This engine should wait until all the AJAX content has loaded.
  • Once the content has loaded, take the source of the rendered page and return it to the bot (see the sketch after this list).
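
As a rough illustration, here is a minimal sketch of this flow as Express (Node.js) middleware. The user-agent list and the renderWithHeadlessBrowser helper are assumptions for illustration only; in practice you would plug in PhantomJS, a prerender service, or another headless browser.

```javascript
const express = require('express');
const app = express();

// Rough user-agent check for common crawlers (illustrative, not exhaustive).
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

// Hypothetical helper: load the URL in a headless browser (e.g. PhantomJS),
// wait until the AJAX content has finished loading, and resolve with the HTML.
async function renderWithHeadlessBrowser(url) {
  // Implementation depends on your rendering engine of choice.
}

app.use(async (req, res, next) => {
  const userAgent = req.headers['user-agent'] || '';
  const escapedFragment = req.query._escaped_fragment_;

  // Detect bots either by the _escaped_fragment_ parameter (#! URLs)
  // or by the user agent of the request.
  if (escapedFragment !== undefined || BOT_PATTERN.test(userAgent)) {
    const html = await renderWithHeadlessBrowser(req.originalUrl);
    return res.send(html); // serve the pre-rendered snapshot to the bot
  }

  next(); // regular users get the normal client-side JavaScript application
});
```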

This whole procedure is a form of cloaking, which is generally frowned upon in SEO; yet because the content is almost the same as what users would see, this kind of cloaking is considered “ethical”, and the search engines accept it. Google has a whole AJAX crawling specification which covers all the important points one needs to take care of. In essence, one either adds the #! identifier to the URL or places a <meta name="fragment" content="!"> tag in the page head. This whole approach is known as pre-rendering snapshot pages, or bot-specific rendering. Once you do this, Google recognizes that the page uses URL fragments and fetches the corresponding snapshots for indexing.
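
To make the mapping concrete, the sketch below shows how a #! URL is translated into an _escaped_fragment_ request under Google's AJAX crawling scheme, plus a hedged example handler that serves a snapshot for such requests (example.com is a placeholder, and the handler reuses the hypothetical renderWithHeadlessBrowser helper from the earlier sketch).

```javascript
// URL translation under the AJAX crawling scheme:
//   What users see:             https://www.example.com/#!/products/42
//   What the crawler requests:  https://www.example.com/?_escaped_fragment_=/products/42
//
// Pages without a #! can opt in with a tag in <head>:
//   <meta name="fragment" content="!">
// which tells the crawler to request the page with ?_escaped_fragment_= appended.

// Minimal handler: map an _escaped_fragment_ request back to the client-side
// #! URL and serve its pre-rendered snapshot.
app.get('*', async (req, res, next) => {
  const fragment = req.query._escaped_fragment_;
  if (fragment === undefined) return next(); // not a crawler snapshot request

  const hashBangUrl = `${req.path}#!${fragment}`; // reconstruct the #! URL
  const snapshot = await renderWithHeadlessBrowser(hashBangUrl);
  res.send(snapshot);
});
```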

Things to take care of while pre-rendering

There are some potential issues you need to be careful about:

  • Snapshot timing
  • Page load time
  • Bot detection
  • Batch processing

Check the snapshot pages

As you are using bot detection, it can be hard to verify whether the process is actually working. One way to check is the “Fetch as Google” feature in Google Webmaster Tools. This requires a live page, so it is advisable to plan accordingly. At present, “Fetch as Google” supports only #! URLs, not pushState URLs; however, if your URLs are static looking, you can fetch them like any normal page and will not face a problem.

Landing pages and paid search

JavaScript sites usually face challenges with paid search as well as with SEO. For instance, AdWords determines the Quality Score based on the content the AdsBot sees on the page. One way to address this issue is to serve the snapshot page to the Google AdsBot as well (see the note below).
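
If you take the middleware approach sketched earlier, covering the ads crawler can be as small as widening the user-agent pattern; the exact names below are assumptions to verify against the crawlers relevant to your campaigns.

```javascript
// Extend the earlier detection pattern so AdWords' landing-page crawler
// (AdsBot-Google, plus its mobile variant) also gets the pre-rendered snapshot.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|adsbot-google/i;
```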

Another point to note is that if your products or product content sit inside a single page application, it becomes difficult to point a paid search destination URL at them. Therefore it is imperative that you create #! or static-looking URLs to address this issue. Also, paid search landing pages need to be highly tailored so that they convert well. Thus it is advisable that you create dedicated pages for your PPC campaigns, leaving your core JavaScript web experience to users and SEO.

One thing to keep in mind is that JavaScript-heavy sites are here to stay. Though it is a challenge to build a JavaScript site that works well for SEO, it is even more challenging to fix an existing site that was built without SEO in mind. Until frameworks and tools improve to make it easy to incorporate SEO requirements, SEOs will have to work closely with developers so that SEO is factored in. Getting JavaScript and SEO to work together is not an easy task, but bringing them together in harmony is sure to bring great benefits.
