
ZumGuy Publications and Network

Client and Server in Web 2.0

Posted by Andrew on Saturday, 16th February 2013 17:25


 

What happens when we go onto the Internet? How does our browser communicate with the server to ensure we receive the pages we requested?

 

The process from browser request, to server, and back to the user's screen is not complex once the general principle is understood. First, some terminology, so we know we are on the same page:

 

Client: that's you. As far as the broader Internet is concerned, it is your computer, or the

 

Browser you are running on it. The browser we like best is Mozilla Firefox, because of its superb developer tools and plug-ins, but Safari and Chrome are excellent choices as well.

 

Server: that's the bit on the other end. These are computers not too far away, operated 24/7 by a small team of geeks with coffee stains on their shirts. You are probably connected to the Internet by a broadband connection through a high-speed modem or router. You may even be part of a local area network, an intranet, or another non-public network, but ultimately the principles are the same. The server is the host for your website. Every time you access a website anywhere in the world, it is one of the more than 10 million servers out there (Google alone operates over 900,000) that supplies the data you see on your screen. This happens through the:

 

Client-side request: whether you send a search query to Google, access your Facebook page, or go directly to a website, you are sending a 'request' to a server. Its actual route is controlled by devices such as routers, proxies and gateways, which need not concern us right now. The request your browser sends is a precise set of instructions and information. To make sure any server or intermediary in the world can interpret this data correctly, it is formatted according to a precise protocol: HTTP (hypertext transfer protocol); there is a sketch of such a request after the next entry. This was the brainchild of Tim Berners-Lee. When he created it in 1991, it kickstarted the phenomenon which evolved rapidly into the:

 

World Wide Web – the structure which utilised the protocol created by Berners-Lee quickly replaced the previous system used on the Internet. This was, with the arrogance of hindsight, a clumsy procedure whereby individual computers would use a modem to timeshare an ordinary telephone line to 'call up' a specific computer. Applications were typically bulletin boards hosted by centres of information such as libraries and universities, which kept dedicated lists of authorised users, such as students and teachers. The World Wide Web created a system hosting webpages coded in a standard known as HTML (hypertext markup language), which still dominates the Internet today.
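
To make that 'precise set of instructions' concrete, here is a minimal sketch of the kind of request a browser sends, written in Python using only its standard socket library. The host example.com and the path /index.html are illustrative placeholders, not anything specific:

    # Hand-write an HTTP request - the format Berners-Lee devised -
    # and send it to a web server over a plain network socket.
    import socket

    request = (
        "GET /index.html HTTP/1.1\r\n"   # method, resource, protocol version
        "Host: example.com\r\n"          # which website we want on that server
        "Connection: close\r\n"          # ask the server to hang up afterwards
        "\r\n"                           # a blank line ends the request headers
    )

    with socket.create_connection(("example.com", 80)) as s:
        s.sendall(request.encode("ascii"))   # the request goes out...
        reply = b""
        while chunk := s.recv(4096):         # ...and we read until the server closes
            reply += chunk

    print(reply.decode("ascii", errors="replace").split("\r\n")[0])
    # e.g. "HTTP/1.1 200 OK" - the server's status line

Whatever you type into a search box or address bar ultimately boils down to a few lines of structured text like these.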
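
And the other side of the conversation: a sketch of a server handing back an HTML page, again using only Python's standard library. The port number and page content are made up for illustration:

    # A toy web server that answers every GET request with a tiny HTML page.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    PAGE = b"<html><body><h1>Hello, Web</h1></body></html>"

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):                      # runs once per incoming GET request
            self.send_response(200)            # the "200 OK" status line
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(PAGE)))
            self.end_headers()
            self.wfile.write(PAGE)             # the HTML the browser will render

    # Serve on port 8000 until interrupted.
    HTTPServer(("localhost", 8000), Handler).serve_forever()

Run it, point a browser at http://localhost:8000/, and the whole request-response cycle described above happens on your own machine.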

 

IP address – the key to the World Wide Web was giving websites a special 'telephone number'. This is the 'Internet Protocol' address: a unique number identifying a machine on the network, and hence the website it hosts. Due to unforeseen demand, the IP address system is undergoing an expansion from the v4 standard to v6, providing a vastly larger pool of possible addresses to cope with the more than 200 million active website addresses out there.

 

Although IPv6 (Internet Protocol version 6) has been standardised since 1998 (http://tools.ietf.org/html/rfc2460), it is still being implemented.
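
As a rough illustration of the difference between the two standards, Python's standard ipaddress module handles both formats; the addresses below are reserved documentation examples, not real websites:

    # IPv4 is a 32-bit number, IPv6 a 128-bit one; both are normally
    # written in a human-friendly notation.
    import ipaddress

    v4 = ipaddress.ip_address("192.0.2.1")      # dotted decimal, 4 bytes
    v6 = ipaddress.ip_address("2001:db8::1")    # hexadecimal groups, 16 bytes

    print(int(v4))             # 3221225985 - the number behind the dots
    print(v4.max_prefixlen)    # 32 bits
    print(v6.max_prefixlen)    # 128 bits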

 

Website name – people find it easier to remember and use proper words or acronyms than strings of numerals. As a result, the web developed a system for converting a registered name into its associated IP number: the Domain Name System (DNS). What you see in the address bar, known as the URL (uniform resource locator), contains this name. IPv4 was the original standard and consisted of 32 bits. Due to the rapid growth of the Internet, a 128-bit version, called IPv6, is replacing the old system.
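
You can watch this name-to-number lookup happen with a couple of lines of Python; once again, example.com is only an illustration:

    # Ask DNS which IP address a given name currently points to.
    import socket

    ip = socket.gethostbyname("example.com")
    print("example.com ->", ip)    # prints whatever IPv4 address DNS returns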

Why was the move from IPv4 to IPv6 necessary? Simple maths:

32 bits breaks down into 4 bytes of 8 bits. Each byte can take one of 2^8, or 256, values (a binary bit being 0 or 1). This gives a maximum of 256^4 = 2^32 (4,294,967,296) possible unique addresses. In effect the number of usable addresses is far lower because, for administrative reasons, the addresses are divided into smaller blocks allocated to RIRs (regional internet registries) - a bit like national telephone prefixes. The IPv6 system has 128 bits available, expanding the theoretical address space to 16 bytes, or 256^16 = 2^128 (about 3.4 x 10^38) permutations. Of course, the Internet Assigned Numbers Authority (IANA), the body charged with managing the internet address allocation system, does not anticipate this level of demand (something of the order of a billion billion billion addresses per person alive, which even the most pessimistic among us might consider overkill), and is using the extra capacity to rationalise the regional registry prefix system for better access.
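
The arithmetic is easy to check; a few lines of Python reproduce the figures above (the world-population estimate is our own round number):

    # Address-space sizes for the two standards.
    ipv4_total = 2 ** 32      # 4 bytes of 8 bits
    ipv6_total = 2 ** 128     # 16 bytes of 8 bits

    print(ipv4_total)                  # 4294967296
    print(f"{ipv6_total:.1e}")         # about 3.4e+38
    print(f"{ipv6_total / 7e9:.0e}")   # per person, assuming ~7 billion people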

 

For further reference we recommend the following sites:

 

http://www.internetsociety.org/

The Internet Society, ISOC, was founded in 1992, at the birth of the World Wide Web. It is a non-profit organisation which aims to promote the standards, education and policies needed to ensure the open development, evolution and use of the Internet for the benefit of all people throughout the world. It looks a bit like the UN, with offices in Washington DC (USA) and Geneva (Switzerland), and 'chapters' scattered around the globe. Check your local one out and let us know what they are up to.

 

http://www.ietf.org/

The Internet Engineering Task Force

Their declared aim is to make the Internet work better. To this end, they produce the technical standards and guidance documents (the RFC series) that aid people interested in the design, use and management of the Internet. The IETF actually predates ISOC, having held its first meeting in 1986; ISOC was founded in 1992 partly to give the IETF an organisational home.

 

Their website is a bit formal and bland, but makes good reading.

 

http://www.w3.org/

Founded in 1994 by Tim Berners-Lee, The World Wide Web Consortium (W3C) is an international community that develops open standards to ensure the long-term growth of the Web.

A popular school and reference site for those learning to make websites and to program is W3Schools (which, despite its name, is an independent site, not run by the W3C):

http://www.w3schools.com/

Posted by Sean on Tuesday, 5th March 2013 18:18

Great article!

A very good explanation of the basics of today's Internet - the World Wide Web.

