Saturday, March 12, 2011

Dynamic Web Page Interface Languages

This is the first entry in the blog version of lecture II in "web site languages". It is aimed at the non-IT manager as well as at alma mater Strathclyde University MSc Marketing students with no prior technical IT background. It is broken down into smaller blog posts for ease of reading and repurposing.


PHP is probably the leading dynamic language today

PHP has probably become the world's leading language in terms of the volume of web sites and transactions handled. It is an open source language and engine, so many developers cooperate in its development, and it is available free as a server engine. The browser never sees the PHP itself, only the output the server sends back. As a code language it is interpreted on the server by a "mother" programme, very much like the other languages discussed below (ASP, CFM).

Code is embedded in the HTML of the files on the server side, while input reaches it through URL constructs and posted form data. PHP has evolved a diversity of really useful "routines", which you call up with a fairly simple mnemonic or English-sounding programming command and which are actioned by the server through its interpreter.
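As a tiny sketch of this (the page itself is invented, but date() is one of PHP's real English-sounding commands), here is how such a command sits embedded in an ordinary HTML page:

    <html>
      <body>
        <h1>Welcome</h1>
        <!-- the server executes the PHP tags below and sends
             only the resulting text on to the browser -->
        <p>Today is <?php echo date('l, j F Y'); ?>.</p>
      </body>
    </html>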

By and large, information from the client (you!) is requested in forms, buttons etc. or referred to via cookies (or, if you are midway through a transaction, the session file), and is then fed into programmed actions which give planned results in the information, web pages etc. you get back. Thus what you actually see on the web page is often the result of quite a lot of dynamic computation, based on what the server had to work on in the second you visited or interacted. This applies to everything from building up a shopping cart full of goods and moving to check-out, to just the simple dot-com request for the home page.
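A rough sketch of that round trip, with the field and cookie names invented for illustration ($_POST, $_COOKIE and $_SESSION are PHP's real mechanisms for form data, cookies and the session file):

    <?php
    session_start();                      // reopen this visitor's session file

    // data typed into a form arrives in $_POST (or $_GET)
    $item = isset($_POST['item']) ? $_POST['item'] : '';

    // a returning visitor can be recognised from a cookie
    $customer = isset($_COOKIE['customer_id']) ? $_COOKIE['customer_id'] : 'guest';

    // build the shopping cart up inside the session, ready for check-out
    $_SESSION['cart'][] = $item;

    echo 'Added ' . htmlspecialchars($item) . ' to the cart for ' . htmlspecialchars($customer);
    ?>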

Historical note on PHP
While Cold Fusion was ruling the cutting-edge roost, PHP was the poor man's cousin: "Personal Home Pages" was a free shareware language used mainly in small web start-ups and colleges. Microsoft then took the Cold Fusion crown with ASP, probably because Active Server Pages integrated better with SQL Server, which a lot of nerds were learning at university.

At that time, Cold Fusion programmers could charge a premium of even several thousand dollars a day! ASP was pretty quick to pick up if you had both HTML and SQL Server, and Microsoft had the infrastructure to support widespread training. Thus CF and .cfm lost the stranglehold they had on the market, and Macromedia faded into a buy-out by Adobe. I actually moved a whole web agency over to ASP from Cold Fusion in 1999, because of both cost and the obstreperous nature of "cfm" programmers.

Meanwhile, the community of PHP developers was sneaking up, and it wasn't long before the cross-platform, open and cheap nature of PHP with the free MySQL database eroded Microsoft's heavier-weight and, arguably, restrictive ASP/SQL Server structures. Being free and open source meant programmers could get mutual benefits in building upon the language itself, having peer review on the web for their prototype commands/routines, so the whole language could evolve by keeping the best contributions while rejecting the weaker ones.
I remember turning up at a fledgling web agency in 2000, having worked at a CF/ASP hot shop as a project manager, to see this home-brew code in PHP and taking the whole thing as amateur. How wrong we all were to be proven.

What Dynamic Languages Do and How

All the languages mentioned above (though not Perl scripts and some others very little used today) have a fairly common modus operandi: when the server is configured to accept and run PHP server side, input strings, forms and requests from the client go to the appropriate area of the server set-up, the programme executes a response, and most often information is sent back to the user over the internet. Most often this will execute a database query based upon an input string. Otherwise it could fetch content from a source which varies, such as a file location or a third-party web site: by changing just one parameter, like the contents of a folder or a web address, you serve up new information without needing to rewrite the page itself.
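A sketch of the commonest case, a database query driven by a single input parameter; the database, table and column names here are invented, and a mysqli prepared statement is one safe way to feed the input string into the query:

    <?php
    // hypothetical product look-up: one 'type' parameter drives the whole query
    $db = new mysqli('localhost', 'shop_user', 'secret', 'shop');

    $stmt = $db->prepare('SELECT name, price FROM products WHERE type = ?');
    $stmt->bind_param('s', $_GET['type']);    // the one parameter that varies
    $stmt->execute();
    $stmt->bind_result($name, $price);

    while ($stmt->fetch()) {
        echo htmlspecialchars($name) . ' costs ' . $price . "<br>\n";
    }
    ?>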

Information sent to and from the server can be text based, mediated in various string forms or XML; it can be graphics or other "sub-files" to be used on the web pages at the client end; or it can be array based, where data is hidden on the client end and presented upon request.
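As a small sketch of the outbound side, PHP's real header() function can declare that what follows is XML rather than an HTML page (the order data here is invented):

    <?php
    // declare XML output before any content is sent
    header('Content-Type: text/xml; charset=utf-8');

    // invented order data; in practice this would come from a database
    $fields = array('ring' => 3, 'type' => 30);

    echo "<?xml version=\"1.0\"?>\n<order>\n";
    foreach ($fields as $name => $value) {
        echo "  <$name>$value</$name>\n";
    }
    echo "</order>\n";
    ?>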

Languages like PHP allow several useful operations to be combined in one command, which often has some "English" comprehensibility: e.g. $_GET is the way of handling the strings sent by a GET request.

In PHP you can choose to use GET or POST when forms are submitted: POST goes behind the scenes, so to speak, while GET places the search string in the URL (a sketch of both follows below).
You can look into the "under the bonnet" workings of many web sites when there is a GET query in the URL as you interact with them. Usually this approach submits a query behind the file name,

eg. www.xyz.com/page.asp?ring=3&type=30&date=01-01-2011:01.03.2011.....

The question mark denotes the start of the GET query string: the first or primary element is defined there, and thereafter the ampersand '&' separates each further parameter, be it a condition from your order processing or a Boolean AND search term you have typed in.
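The sketch promised above, with invented page and field names, shows the two submission methods side by side:

    <!-- GET: the terms end up visibly in the URL, e.g. search.php?q=monkeys -->
    <form action="search.php" method="get">
      <input type="text" name="q">
      <input type="submit" value="Search">
    </form>

    <!-- POST: the same data travels in the request body, out of sight -->
    <form action="checkout.php" method="post">
      <input type="text" name="card_number">
      <input type="submit" value="Pay">
    </form>

On the server the first arrives in $_GET['q'] and the second in $_POST['card_number'].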

If you come across such a GET line in the URL then you can play around with building your own queries and submitting them, to see what works and what comes back. As a web developer you can of course then use that web site to deliver queried information via your own server (but not directly in web pages, owing to the 'same origin policy' enforced in browsers; see 'security'). The main current alternative to scraping content or 'proxy' serving queries is to use RSS feeds from third-party web sites and then parse their content into the web page, or create an on-the-fly array to query.
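A sketch of that RSS route, using PHP's real simplexml_load_file(); the feed address is invented:

    <?php
    // fetch and parse a third-party RSS feed on the server side,
    // where the browser's same origin policy does not apply
    $feed = simplexml_load_file('http://www.example.com/news/rss.xml');

    foreach ($feed->channel->item as $item) {
        echo '<a href="' . htmlspecialchars($item->link) . '">'
           . htmlspecialchars($item->title) . "</a><br>\n";
    }
    ?>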

Another way of analysing dynamic web sites is to use Firebug: right-click on any dynamic element and choose "Inspect Element", which takes you to the code directly. This is invaluable, because web code can take ages to read through and it is easy to miss a table element, for example. This is one use of the term "screen scraping", in this case for dynamic code rather than XML content.

> ie www.searchmonkey.com/search.php?q=monkeys : Apache gets the URL request and knows both to go to the search.php location and to send the string "monkeys" to be processed by the PHP there.
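The receiving end of that request might look something like this sketch (searchmonkey.com and the page body are illustrative, but $_GET['q'] is exactly where the string lands):

    <?php
    // search.php - Apache has routed the request here;
    // the string after ?q= is now waiting in $_GET
    $query = isset($_GET['q']) ? $_GET['q'] : '';

    echo '<h1>Results for ' . htmlspecialchars($query) . '</h1>';
    // ... a database look-up on $query would follow here
    ?>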

The PHP interpreter must be installed on the server network: CFM used to be just a disc or an online download for ISP web hosts. The front end of the server network, usually Apache, then routes requests for PHP pages from the browser to the appropriate file location.
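In a typical Apache set-up that routing is configured with mod_php, along these lines (a sketch; paths and details vary by host):

    # hand any file ending in .php to the PHP interpreter (mod_php)
    AddHandler application/x-httpd-php .php

    # serve index.php as a directory's default page
    DirectoryIndex index.php index.html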

PHP has some clumsy language and syntax according to many programmers, but it is relatively light on the client-server connection and easy to learn. PHP 5.1 and 5.2 were stable and largely debugged by 2010 on most server types.

There is a master configuration file, usually called "php.ini", read when the interpreter starts, which sets up what elements and syntax will be allowed.
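A few real directives from a typical php.ini, with illustrative values, give the flavour:

    ; php.ini - master configuration read when the interpreter starts
    display_errors = Off          ; never show raw errors to visitors
    error_reporting = E_ALL       ; but do log everything
    memory_limit = 128M           ; ceiling per request
    post_max_size = 8M            ; largest form submission accepted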

Despite the criticisms, PHP above all delivers reliable and scalable web sites. Programming resources are now widely available, and its ease of learning and shareware nature put dynamic web sites within the reach of many amateurs. It is very much a language that is here to stay and will continue to evolve with browser and server technology. So for the foreseeable future we will see many web sites, or at least pages, based around PHP or wholly dependent upon this dynamic language.
