Generating Website Content with Scripting: A Comparison between Client-Side and Server-Side Automation

Overview

This article compares client-side and server-side scripts for generating website content.

Prerequisites

Mandatory: Familiarity with website design using HTML, CSS, JavaScript, and PHP.

Optional: Basic knowledge of search engines and crawlers.

Search Engines and Crawlers

A search engine indexes web pages and ranks them. Examples include DuckDuckGo, Google, Bing, Yahoo, and Yandex. Search engines use crawlers to discover new web pages and content. Crawling web pages on the internet is analogous to traveling in the real world: it costs time and resources. Crawlers are therefore resource-constrained, and most crawlers only discover static content, i.e., content that is available immediately after downloading the files from the server.

Scripts for Generating Website Content

Scripts are used to make a website user-friendly while reducing coding overhead. Two common use cases for scripting, generating content on the client side and generating content on the server side, are described in the following sections.

Generating Website Content on the Client Side

There are several ways to generate website content on the client side, the most popular being JavaScript. Website content generated by client-side scripting should be limited to content for which crawling by search engines is not important.

Consider a static website using HTML, CSS, and JavaScript. Client-side scripts are executed after the files are delivered from the server to the end user, so the content they generate is not discoverable by resource-constrained crawlers.
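
As an illustration, the sketch below assumes a hypothetical page with an empty "greeting" element; the script runs in the visitor's browser after the HTML has been delivered, so the inserted text is not present in the files a resource-constrained crawler downloads.

    <!-- index.html: the visible greeting is generated on the client side -->
    <div id="greeting"></div>
    <script>
      // Runs in the browser after the files have been downloaded from the server.
      // The text below is never part of the HTML sent by the server, so a crawler
      // that does not execute JavaScript will not see it.
      const hours = new Date().getHours();
      const message = hours < 12 ? "Good morning!" : "Good evening!";
      document.getElementById("greeting").textContent = message;
    </script>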

Generating Website Content on the Server Side

There are several ways to generate website content on the server side, the most popular being PHP. Website content generated by server-side scripts is discoverable by crawlers. Server-side scripting is also useful for personalizing content for individual users.

Consider a dynamic website using HTML, CSS, JavaScript, and PHP. Server-side scripts are executed on the server before the website is delivered to the end user, so the content they generate is discoverable by resource-constrained crawlers.
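
For comparison, here is a minimal server-side sketch in PHP; the file name greeting.php and the visitor query parameter are assumptions made for illustration. Because the script runs on the server, the generated text is already part of the HTML that a crawler downloads, and it can also be personalized per request.

    <?php
    // greeting.php: executed on the server before any HTML is sent to the client.
    // The generated text is part of the delivered page, so crawlers can index it.
    $visitor = isset($_GET['visitor']) ? htmlspecialchars($_GET['visitor']) : 'guest';
    ?>
    <!DOCTYPE html>
    <html>
      <body>
        <!-- Personalized content rendered on the server side -->
        <p>Welcome, <?php echo $visitor; ?>!</p>
        <p>Page generated on <?php echo date('Y-m-d'); ?>.</p>
      </body>
    </html>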

Considerations for Search Engine Discovery: Crawler-Friendly Websites

Depending on the scripting mode used to generate website content, additional steps may be needed to make the content discoverable by search engines. For client-side scripting, pre-rendering is required for discovery by crawlers. Scripting is also used for efficient delivery of content, e.g., lazy loading of images and DASH video streaming. Although this can prevent media from being discovered by crawlers, such media can still be made discoverable, for example by listing image and video URLs in an XML sitemap, as sketched below.
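
As a sketch, the sitemap entry below assumes Google's image sitemap extension; the URLs are placeholders. Listing the image alongside the page that lazy-loads it lets crawlers discover the image without executing any client-side script.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- sitemap.xml: pairs a page with an image that the page lazy-loads -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.example.com/gallery.html</loc>
        <image:image>
          <image:loc>https://www.example.com/photos/sunset.jpg</image:loc>
        </image:image>
      </url>
    </urlset>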

Crawlers Overhead on Bandwidth

Crawlers increase the load on servers. Therefore, depending on how resource-intensive a website is, one may need to trade off discoverability by crawlers against bandwidth conservation. There are several ways to reduce the bandwidth overhead caused by crawlers, for example by using a robots.txt file to restrict which pages they may fetch and how often, as sketched below.
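
A minimal robots.txt sketch follows; the disallowed paths are placeholders, and the Crawl-delay directive is non-standard, so not every crawler honors it.

    # robots.txt, served from the root of the website
    User-agent: *
    # Keep crawlers away from resource-intensive or low-value sections
    Disallow: /search/
    Disallow: /media/raw/
    # Ask well-behaved crawlers to pause between requests (ignored by some crawlers)
    Crawl-delay: 10
    # Point crawlers to the sitemap so discovery stays efficient
    Sitemap: https://www.example.com/sitemap.xml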

Author

Anurag Gupta is an M.S. graduate in Electrical and Computer Engineering from Cornell University. He also holds an M.Tech degree in Systems and Control Engineering and a B.Tech degree in Electrical Engineering from the Indian Institute of Technology, Bombay.