eXtropia: the open web technology company
Introduction to Microsoft DNA
Client Server Model Applications  

As technology advanced, connecting machines and sharing data became an important goal and a pressing reality for application developers.

Simple networks formed, and with them came new applications and new application architectures. Networking and resource sharing introduced larger and more complex problems into the development environment, and the inherent flaws of monolithic applications were becoming clear, so a new approach was devised to capture the nature of these new applications. And because applications had grown, so had the number of stages an application passed through when viewed abstractly.

Client-server applications became all the rage, and monolithic applications began to fade into the past like forgotten dinosaurs.

In the client-server model, applications were broken apart, distributing processing between client computers and server computers.

As client-server applications became feasible, layering the technology became more important. In the client-server model, the three layers of an application (presentation, logic, and data) could be isolated more easily. In fact, such isolation became crucial as scalability, distribution, and maintenance grew ever more complex.
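The isolation of the three layers can be sketched in a few lines of code. This is a minimal illustration (in Python, with hypothetical class and method names, not any Microsoft DNA API): each tier talks only to the tier beneath it, so any one of them can be replaced or scaled without touching the others.

```python
# A minimal sketch of three isolated application layers.
# All names here are illustrative, not part of any real framework.

class DataLayer:
    """Data tier: owns storage; nothing else touches it directly."""
    def __init__(self):
        self._orders = {}

    def save_order(self, order_id, item):
        self._orders[order_id] = item

    def load_order(self, order_id):
        return self._orders.get(order_id)


class LogicLayer:
    """Business tier: validates and applies rules; knows nothing of the UI."""
    def __init__(self, data):
        self._data = data

    def place_order(self, order_id, item):
        if not item:
            raise ValueError("order must name an item")
        self._data.save_order(order_id, item)


class PresentationLayer:
    """Presentation tier: formats input and output; knows nothing of storage."""
    def __init__(self, logic):
        self._logic = logic

    def handle_form(self, order_id, item):
        self._logic.place_order(order_id, item)
        return f"Order {order_id} accepted."


app = PresentationLayer(LogicLayer(DataLayer()))
print(app.handle_form(1, "widget"))  # prints "Order 1 accepted."
```

Because the presentation tier only ever calls the logic tier, swapping the dictionary in `DataLayer` for a networked database server changes nothing above it.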

Another factor in the separation of the layers was the data itself. As sharing data became essential to faster and wider information distribution, networked systems pushed applications to evolve into data sharers.

Rather than storing data locally, a client-server application stored it in a central repository where it could be accessed by the many clients who wished to "share" it.
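The central-repository idea can be sketched with a toy server: one process holds the data, and any number of clients read and write it over the network. This uses Python's standard socket library purely for illustration; the protocol (`SET`/`GET`), port, and functions are all made up for this sketch, and real client-server systems of the era used proper database servers rather than hand-rolled sockets.

```python
# A toy central repository: the server owns the data; clients share it.
import socket
import threading
import time

store = {}                  # the central repository, held only by the server
lock = threading.Lock()

def serve(port):
    srv = socket.socket()
    srv.bind(("127.0.0.1", port))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode().split()
            with lock:
                if request[0] == "SET":        # SET key value
                    store[request[1]] = request[2]
                    conn.sendall(b"OK")
                elif request[0] == "GET":      # GET key
                    conn.sendall(store.get(request[1], "?").encode())

def client(port, message):
    """One round trip from a client to the shared repository."""
    with socket.socket() as s:
        s.connect(("127.0.0.1", port))
        s.sendall(message.encode())
        return s.recv(1024).decode()

threading.Thread(target=serve, args=(5050,), daemon=True).start()
time.sleep(0.2)                     # give the server a moment to start
client(5050, "SET price 10")        # one client writes the shared value...
print(client(5050, "GET price"))    # ...another client reads it; prints "10"
```

The point of the sketch is that the data lives in exactly one place: every client sees the same `price` the moment it is written.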

The benefit of this architecture was that large numbers of users could store and retrieve important data in a consistent and stable manner, generally through a "fully loaded" application on the client machine. Order processing, accounting, internal systems, email, and database applications became the norm in the client-server era.

Traditional client-server applications enabled, and encouraged, developers to build feature-rich solutions that integrated key technologies behind a single point of access.

Typically a developer would focus on delivering the graphical user interface and on storing data in repositories that let users share it. Technologies like ODBC (Open Database Connectivity), Visual Basic, Visual C++, and MFC (Microsoft Foundation Classes) helped developers build data-sharing applications on short timescales.
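The data-access pattern that ODBC popularized (open a connection, issue SQL, fetch rows) can be sketched with Python's built-in `sqlite3` module standing in for an ODBC driver; the table and data here are invented for the example, and a real client-server application would connect to a shared database server rather than an in-memory database.

```python
# The connect / execute / fetch pattern ODBC made routine,
# with sqlite3 standing in for an ODBC driver.
import sqlite3

conn = sqlite3.connect(":memory:")   # a real app would open a shared server DSN
conn.execute("CREATE TABLE orders (id INTEGER, item TEXT)")
conn.execute("INSERT INTO orders VALUES (?, ?)", (1, "widget"))

# Any client holding a connection can now query the shared data.
for row in conn.execute("SELECT id, item FROM orders"):
    print(row)                       # prints (1, 'widget')
conn.close()
```

The same three steps (connect, execute, fetch) apply whatever driver sits underneath, which is precisely what made ODBC useful: the application code did not change when the database did.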
