Hosting your website

Web pages may be retrieved from a local computer or from a remote web server. The web server may restrict access to a private network, e.g. a corporate intranet, or it may publish pages on the World Wide Web.
Web pages are requested and served from web servers using the Hypertext Transfer Protocol (HTTP). They can be viewed, or otherwise accessed, from a range of Internet-enabled devices of various sizes, including desktop computers, laptops, PDAs and cell phones.


A website is hosted on a computer system known as a web server, also called an HTTP server, which retrieves and delivers web pages in response to requests from users.
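This request/response cycle can be sketched in a few lines using only the Python standard library; the page content and path below are illustrative, and a real server would of course serve many pages to many clients:

```python
import http.client
import http.server
import threading

# Illustrative page content; a real server reads pages from disk.
PAGE = b"<html><body><h1>Hello</h1></body></html>"

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Every HTTP response carries a status line, headers, then a body.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

# Bind to an ephemeral local port and serve in the background.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A browser performs the equivalent of this request.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/index.html")
resp = conn.getresponse()
print(resp.status, resp.reason)  # 200 OK
body = resp.read()
server.shutdown()
```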

Before you publish, there are some important details you should know.

Even when construction of your web site is complete, your pages still reside on your computer, not on the Internet. Once you connect to the Internet, however, you can test all your links to external web sites.

You are now ready to publish your pages on the World Wide Web. To do this, you will need to copy your pages, along with all the graphic and music files they use, from your computer's hard drive to a web server.
The Internet service provider (ISP) that you use for Internet access might provide free web hosting to its customers.
If not, there are a number of companies that will provide server space at no charge. You will, however, be required to include a banner or icon somewhere on the page to promote the free hosting service or its sponsors.

On our website you will find a variety of information, including reviews, to help you find the best free web hosting services.

Although there are disadvantages to using free web hosting services, such services do provide an opportunity to learn and develop basic web publishing skills. Alternatively, there are many professional web hosting companies that offer inexpensive hosting packages with advanced features for under $10 per month.
When you register for web hosting services, you will be sent instructions on how to upload files to the web server.
The web host will also provide you with a user ID and password so that other people can't access or alter your files.

Uploading your pages
Copying files from your hard drive to the server is a simple process. The host site will prompt you for the name of the directory on your hard drive where your files are stored and for the names of the specific files to be uploaded. To avoid confusion, make certain that all files are saved on the server under the same file names that were used on your hard drive. The only software you require to upload files is a web browser such as Netscape or Internet Explorer. Alternatively, free FTP (File Transfer Protocol) software can be used.
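As a rough sketch, an FTP upload might look like the following; the host name, credentials and file names are placeholders you would replace with the details supplied by your hosting provider:

```python
import ftplib
import os

def upload_files(host, user, password, local_dir, filenames):
    """Copy the named files from local_dir to the web server,
    keeping the same file names so links between pages stay intact.
    All connection details are supplied by the hosting provider."""
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        for name in filenames:
            path = os.path.join(local_dir, name)
            with open(path, "rb") as f:
                # STOR uploads the file under the same name it had
                # on the hard drive.
                ftp.storbinary(f"STOR {name}", f)

# Example call (placeholder values, not a real account):
# upload_files("ftp.example-host.com", "myuser", "mypassword",
#              "C:/mysite", ["index.html", "logo.gif"])
```

Note how each file is stored under the name it had locally, which is exactly the "same file names" rule described above.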

Once you have uploaded all your files, you should test your page on the web server and make certain that it functions properly and that all files have uploaded correctly. It is also a good idea to test your page using a different computer to ensure that graphic files are being read from the server and not from your hard drive. 
Images, sound and video files are stored on the web server as separate files, and HTTP allows for the fact that once a web page is downloaded to a browser, related files such as images and stylesheets are likely to be requested as the page is processed. An HTTP web server will maintain a connection with the browser until all related resources have been requested and provided. Browsers usually render images along with the text and other material on the displayed web page.
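To illustrate, here is a small sketch of how the related resources can be discovered in a downloaded page, using Python's standard html.parser; the page markup is made up:

```python
from html.parser import HTMLParser

# A made-up page referencing a stylesheet and an image.
PAGE = ('<html><head><link rel="stylesheet" href="style.css"></head>'
        '<body><img src="logo.png"></body></html>')

class ResourceFinder(HTMLParser):
    """Collects the URLs of related resources a browser would
    request after downloading the page itself."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.resources.append(attrs["src"])
        elif tag == "link" and "href" in attrs:
            self.resources.append(attrs["href"])

finder = ResourceFinder()
finder.feed(PAGE)
print(finder.resources)  # ['style.css', 'logo.png']
```

Each of these URLs would then be fetched from the server with its own HTTP request, which is why testing from a second computer (as suggested above) catches files that were never uploaded.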

Common features
Although web server programs differ in detail, they all share some basic common features.

Every web server program operates by accepting HTTP requests from the network and providing an HTTP response to the requester. The HTTP response typically consists of an HTML document, but it can also be a raw text file, an image, or some other type of document; if an error is found in the client request, or occurs while trying to serve the request, the web server has to send an error response, which may include custom HTML or text messages to better explain the problem to end users.
Logging: web servers usually also have the capability of logging detailed information about client requests and server responses to log files; this allows the webmaster to collect statistics by running log analyzers on the log files.
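For example, a very small log analyzer might count response status codes in a Common Log Format access log; the log line below is invented for illustration:

```python
import re
from collections import Counter

# One invented access-log line in Common Log Format.
LOG_LINE = ('127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] '
            '"GET /index.html HTTP/1.0" 200 2326')

# Extract the request line, status code and response size.
pattern = re.compile(
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+)')

def summarize(lines):
    """Count how many responses were sent with each status code."""
    statuses = Counter()
    for line in lines:
        m = pattern.search(line)
        if m:
            statuses[m.group("status")] += 1
    return statuses

stats = summarize([LOG_LINE])
print(stats)  # Counter({'200': 1})
```

Real log analyzers compute much more (visitors, popular pages, bandwidth), but they all start from parsing lines like this.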
In practice, many web servers implement the following features as well.

Configurability of available features via configuration files or even an external user interface.
Authentication: optionally requesting a user name and password before allowing access to some or all kinds of resources.
Handling of not only static content (file content stored in the server's file system) but also dynamic content, by supporting one or more related interfaces (SSI, CGI, SCGI, FastCGI, JSP, PHP, ASP, ASP.NET, or a server API such as NSAPI or ISAPI).
Module support, allowing the extension of server capabilities by adding or modifying software modules that are linked into the server software or dynamically loaded (on demand) by the core server.
HTTPS support (via SSL or TLS) to allow secure (encrypted) connections to the server on the standard port 443 instead of the usual port 80.
Content compression (e.g. gzip encoding) to reduce the size of responses and lower bandwidth usage.
Virtual hosting, to serve many web sites using one IP address.
Large file support, to serve files larger than 2 GB on 32-bit operating systems.
Bandwidth throttling, to limit the speed of responses so as not to saturate the network and to be able to serve more clients.
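As one concrete example from the list above, content compression is easy to sketch with Python's gzip module; in a real server the compressed body would be sent together with a Content-Encoding: gzip response header:

```python
import gzip

# An illustrative, highly repetitive response body.
body = b"<html>" + b"hello world " * 200 + b"</html>"

# The server compresses the body before sending it...
compressed = gzip.compress(body)
print(len(body), "->", len(compressed))  # far fewer bytes on the wire

# ...and the browser transparently decompresses it on arrival.
restored = gzip.decompress(compressed)
```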

Path translation
Web servers usually translate the path component of a Uniform Resource Locator (URL) into a local file system resource. The URL path specified by the client is relative to the Web server's root directory.
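A hedged sketch of such path translation, assuming an example document root of /var/www/html; note that a real server must also reject paths that use ".." to escape the root:

```python
import os
import posixpath

DOCUMENT_ROOT = "/var/www/html"  # assumed example root directory

def translate_path(url_path):
    """Map the path component of a URL onto a file under the
    server's root directory (a simplified sketch)."""
    # Normalize the URL path, then join it onto the document root.
    clean = posixpath.normpath(url_path.lstrip("/"))
    full = os.path.normpath(os.path.join(DOCUMENT_ROOT, clean))
    # Refuse paths that escape the root (e.g. "/../etc/passwd").
    if not full.startswith(DOCUMENT_ROOT):
        raise ValueError("path escapes document root")
    return full

print(translate_path("/docs/index.html"))  # /var/www/html/docs/index.html
```

Production servers do considerably more here (index files, encodings, symlink policy), but the core idea is this mapping from URL space to file-system space.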

Origin of returned content
The origin of the content sent by the server is called:
static if it comes from an existing file on the file system;
dynamic if it is generated on the fly by another program, script or API called by the web server.
Serving static content is usually much faster (from 2 to 100 times) than serving dynamic content, especially if the latter involves data pulled from a database.
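The difference between the two origins can be sketched as follows; the file and page contents are purely illustrative:

```python
import datetime
import os
import tempfile

# Static origin: the bytes already exist on disk and are returned as-is.
fd, path = tempfile.mkstemp(suffix=".html")
with os.fdopen(fd, "wb") as f:
    f.write(b"<html><body>Static page</body></html>")
with open(path, "rb") as f:
    static_body = f.read()
os.unlink(path)

# Dynamic origin: the bytes are produced at request time by a program,
# so they can differ on every request (here they embed the current time).
def generate_page():
    now = datetime.datetime.now().isoformat()
    return f"<html><body>Generated at {now}</body></html>".encode()

dynamic_body = generate_page()
```

The extra work visible in generate_page (and, in practice, database queries behind it) is exactly why dynamic content is so much slower to serve.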

Web server programs must be able to serve requests quickly from more than one TCP/IP connection at a time.
The main key performance parameters (measured under a varying load of clients and requests per client) are:
number of requests per second (depending on the type of request, etc.);
latency time in milliseconds for each new connection or request;
throughput in bytes per second (depending on file size, cached or non cached content, available network bandwidth, etc.).
These three parameters vary noticeably with the number of active connections, so a fourth parameter is the concurrency level supported by a web server under a specific configuration.
Last but not least, the specific server model used to implement a web server program can constrain the performance and scalability that can be reached.
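These parameters are related to one another; as a back-of-the-envelope sketch with invented figures, using Little's law to connect concurrency, request rate and latency:

```python
# All figures below are invented for illustration.
requests_per_second = 500      # request rate the server sustains
avg_response_bytes = 20_000    # average size of one response

# Throughput in bytes/s follows from rate times response size.
throughput = requests_per_second * avg_response_bytes
print(throughput)  # 10000000 bytes/s, i.e. about 10 MB/s

# Little's law (L = lambda * W) ties the concurrency level L to the
# request rate lambda and the average time W a request spends in the
# server, so latency can be estimated from the other two parameters.
concurrency = 100  # active connections
latency_seconds = concurrency / requests_per_second
print(latency_seconds * 1000)  # 200.0 milliseconds per request
```

This is why the parameters must be measured together under varying load: pushing any one of them changes the others.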

Load limits
A web server (program) has defined load limits: it can handle only a limited number of concurrent client connections (usually between 2 and 60,000, by default between 500 and 1,000) per IP address (and IP port), and
it can serve only a certain maximum number of requests per second, depending on:
its own settings;
the HTTP request type;
the content origin (static or dynamic);
whether or not the served content is cached;
the hardware and software limits of the operating system on which it is running.
When a web server is near or over its limits, it becomes overloaded and thus unresponsive.

Overload causes
At any time, web servers can become overloaded because of:
too much legitimate web traffic (e.g. thousands or even millions of clients hitting the web site in a short interval of time);
DDoS (Distributed Denial of Service) attacks;
computer worms, which sometimes cause abnormal traffic from millions of infected (but uncoordinated) computers;
traffic from Internet bots that is not filtered or limited on large web sites with very few resources (bandwidth, etc.);
Internet (network) slowdowns, so that client requests are served more slowly and the number of open connections grows until server limits are reached;
partial unavailability of the web servers (computers) themselves, for example because of required or urgent maintenance or upgrades, hardware or software failures, or back-end (e.g. database) failures; in these cases the remaining web servers receive too much traffic and become overloaded.

