Technologies for developing effective systems
  • Prepared by:

  • Vanessa Miranda

  • Annaliza Canaria

  • Albryan Canosa


Software engineering was spurred by the so-called software crisis of the 1960s, 1970s, and 1980s, which identified many of the problems of software development. Many software projects ran over budget and schedule. Some projects caused property damage. A few projects caused loss of life. The software crisis was originally defined in terms of productivity, but evolved to emphasize quality. Some used the term software crisis to refer to their inability to hire enough qualified programmers.

Peter G. Neumann has kept a contemporary list of software problems and disasters. The software crisis has slowly faded from view, because it is unrealistic to remain in crisis mode for more than 20 years. Software engineers are accepting that the problems of software engineering are truly difficult and that only hard work over many decades can solve them.


1. Structured Development

This was unquestionably the single most important advance prior to the ’80s. It provided the first truly systematic approach to software development. When combined with the 3GLs of the ’60s, it enabled huge improvements in productivity.

SD had an interesting side effect that was not really noticed at the time: applications became more reliable. The improvement went unnoticed because software was being used much more widely, so its remaining defects had much higher visibility to non-software people, and those users still regarded software as unreliable. In fact, though, reliability improved from about 150 defects/KLOC in the early ’60s to about 15 defects/KLOC by 1980.

SD was actually an umbrella term that covered a wide variety of software construction approaches. Nonetheless, they usually shared certain characteristics:

  • Graphical representation. Each of these fledgling methodologies had some form of graphical notation. The underlying principle was simply that a picture is worth a thousand words.

  • Functional isolation. The basic idea was that programs were composed of large numbers of algorithms of varying complexities that played together to solve a given problem. The notion of interacting algorithms was actually a pretty seminal one that arrived just as programs started to become too large for one person to handle in a reasonable time. Functional isolation formalized this idea in things like reusable function libraries, subsystems, and application layers.

  • Application programming interfaces (API). Isolated functionality still has to be accessed somehow. This led to the notion of an invariant interface that enabled all clients to access the functionality in the same way while enabling the implementation of that functionality to be modified without the clients knowing about the changes.

  • Programming by contract. This was a logical extension of APIs. The API itself became a contract between a service and its clients. The problem with earlier forays into this idea was that the contract is really about the semantics of the service, while the API only defined the syntax for accessing those semantics. The notion only started to become a serious contract when languages began to incorporate things such as assertions about behavior as part of the program unit. Still, it was a reasonable start for a very good idea.

  • Top-down development. The original idea here was to start with high-level, abstract user requirements and gradually refine them into more specific requirements that became more detailed and more specifically related to the computing environment. Top-down development also happened to map very nicely into functional decomposition, which we’ll get to in a moment.

  • Emergence of analysis and design. SD identified development activities other than just writing 3GL code. Analysis was a sort of hybrid between requirements elicitation, analysis, and specification in the customer’s domain and high-level software design in the developer’s domain. Design introduced a formal step where the developer provided a graphical description of the detailed software structure before hitting the keyboard to write 3GL code.
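The ideas of functional isolation, an invariant API, and a behavioral contract can be sketched together in a few lines of Python. This is an illustration, not the methodology itself: the names (`sort_records`, `_merge_sort`) are hypothetical, and the assertions stand in for the contract support later languages added.

```python
# A "service" isolates its implementation behind one invariant API.
def sort_records(records):
    """API contract: accepts integers, returns a new ascending-sorted list."""
    assert all(isinstance(r, int) for r in records), "precondition: integer records"
    result = _merge_sort(records)        # implementation detail, freely swappable
    assert result == sorted(records)     # postcondition: behavior, not just syntax
    return result

def _merge_sort(items):
    # Private helper: clients never call this directly, so it can be replaced
    # (e.g. by a library sort) without any client noticing the change.
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = _merge_sort(items[:mid]), _merge_sort(items[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

print(sort_records([3, 1, 2]))  # -> [1, 2, 3]
```

Clients depend only on `sort_records` and its stated behavior; the underscore prefix marks `_merge_sort` as an internal detail, which is the functional-isolation idea in miniature.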

2. Fourth generation languages

Fourth-generation languages attempt to make communicating with computers as much like the processes of thinking and talking to other people as possible. The problem is that the computer still only understands zeros and ones, so a compiler and interpreter must still convert the source code into the machine code that the computer can understand. Fourth-generation languages typically consist of English-like words and phrases. When they are implemented on microcomputers, some of these languages include graphic devices such as icons and onscreen push buttons for use during programming and when running the resulting application.

Many fourth-generation languages use Structured Query Language (SQL) as the basis for operations. SQL was developed at IBM to retrieve and manage information stored in relational databases. Eventually, it was adopted by the American National Standards Institute (ANSI) and later by the International Standards Organization (ISO) as a means of managing structured, factual data. Many database companies offer an SQL-type database because purchasers of such databases seek to optimize their investments by buying open databases, i.e., those offering the greatest compatibility with other systems. This means that the information systems are relatively independent of vendor, operating system, and computer platform.
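The English-like, declarative character of SQL can be seen in a minimal sketch using Python's standard-library sqlite3 module (the table and data are made up for illustration):

```python
import sqlite3

# SQL reads close to English: the programmer states WHAT data is wanted,
# not HOW to fetch it -- the defining trait of a fourth-generation language.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("Acme", 120.0), ("Acme", 80.0), ("Globex", 50.0)])

total = con.execute(
    "SELECT SUM(amount) FROM orders WHERE customer = 'Acme'"
).fetchone()[0]
print(total)  # -> 200.0
con.close()
```

Note that the query never mentions loops, indexes, or file layout; the database engine decides how to satisfy it, which is exactly the vendor- and platform-independence the paragraph describes.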

Examples of fourth-generation languages include PROLOG, an artificial intelligence language that applies rules to data to arrive at solutions; and OCCAM and PARLOG, both parallel-processing languages. Newer languages may combine SQL and other high-level languages. IBM's Sonnet is being modified to use sound rather than visual images as a computer interface.

3. Software prototyping

Software prototyping refers to the activity of creating prototypes of software applications, i.e., incomplete versions of the software program being developed. It is an activity that can occur in software development and is comparable to prototyping as known from other fields, such as mechanical engineering or manufacturing.

The original purpose of a prototype is to allow users of the software to evaluate developers' proposals for the design of the eventual product by actually trying them out, rather than having to interpret and evaluate the design based on descriptions. Prototyping can also be used by end users to describe and prove requirements that developers have not considered, and that can be a key factor in the commercial relationship between developers and their clients. Interaction design in particular makes heavy use of prototyping with that goal.


The practice of prototyping is one of the points Fred Brooks makes in his 1975 book The Mythical Man-Month and his 10-year anniversary article No Silver Bullet.

An early example of large-scale software prototyping was the implementation of NYU's Ada/ED translator for the Ada programming language.

The process of prototyping involves the following steps:

1. Identify basic requirements

2. Develop an initial prototype

3. Review

4. Revise and enhance the prototype

4. Computer-aided software engineering (CASE)

Is the scientific application of a set of tools and methods to a software system which is meant to result in high-quality, defect-free, and maintainable software products. It also refers to methods for the development of information systems together with automated tools that can be used in the software development process.

The term "computer-aided software engineering" (CASE) can refer to the software used for the automated development of systems software, i.e., computer code. The CASE functions include analysis, design, and programming. CASE tools automate methods for designing, documenting, and producing structured computer code in the desired programming language.

5. Object Oriented development

Is an extension of structured programming: Object Oriented development emphasizes the benefits of modular and reusable computer code and modeling real-world objects, just as structured programming emphasizes the benefits of properly nested structures. Object Oriented programming is 95 percent philosophy and 5 percent technology; programmers trained to think in object technology terms can use existing procedural languages to do many of the tasks that were once thought to require C++ or Smalltalk. Object Oriented concepts can be broken down into four properties.

  • inheritance

  • encapsulation

  • polymorphism

  • abstraction
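The four properties above can be shown in one short Python sketch (the `Shape` hierarchy is a hypothetical example, not from the original slides):

```python
class Shape:                      # abstraction: models the general idea "shape"
    def __init__(self, name):
        self._name = name         # encapsulation: state kept behind methods
    def area(self):
        raise NotImplementedError # each concrete shape must supply this
    def describe(self):
        return f"{self._name}: area {self.area()}"

class Square(Shape):              # inheritance: Square reuses Shape's code
    def __init__(self, side):
        super().__init__("square")
        self._side = side
    def area(self):
        return self._side ** 2

class Circle(Shape):              # a second subclass of the same abstraction
    def __init__(self, radius):
        super().__init__("circle")
        self._radius = radius
    def area(self):
        return 3.14159 * self._radius ** 2

# polymorphism: one call site, class-specific behavior for each object
for shape in (Square(2), Circle(1)):
    print(shape.describe())
```

As the paragraph suggests, nothing here requires a special language: the same four ideas can be imitated in procedural code, which is why object orientation is mostly a way of thinking rather than a technology.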

6. Client–server computing

Is a distributed computing model in which client applications request services from server processes. Clients and servers typically run on different computers interconnected by a computer network. Any use of the Internet, such as information retrieval from the World Wide Web, is an example of client–server computing. However, the term is generally applied to systems in which an organization runs programs with multiple components distributed among computers in a network. The concept is frequently associated with enterprise computing, which makes the computing resources of an organization available to every part of its operation.

The client's responsibility is usually to:

1. Handle the user interface.

2. Translate the user's request into the desired protocol.

3. Send the request to the server.

4. Wait for the server's response.

5. Translate the response into "human-readable" results.

6. Present the results to the user.

The server's functions include:

1. Listen for a client's query.

2. Process that query.

3. Return the results back to the client.

A typical client/server interaction goes like this:

1. The user runs client software to create a query.

2. The client connects to the server.

3. The client sends the query to the server.

4. The server analyzes the query.

5. The server computes the results of the query.

6. The server sends the results to the client.

7. The client presents the results to the user.

8. Repeat as necessary.
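The listen/process/return loop on the server side and the send/wait/present loop on the client side can be sketched with standard-library sockets. This is a minimal single-request illustration, not a production pattern; the upper-casing "query" is made up.

```python
import socket
import threading

# Server side: listen for a client's query, process it, return the results.
def serve_once(sock):
    conn, _ = sock.accept()
    query = conn.recv(1024).decode()     # 1. listen for the query
    result = query.upper()               # 2. process it (a stand-in computation)
    conn.sendall(result.encode())        # 3. return the results to the client
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# Client side: translate the request, send it, wait, present the results.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello server")          # send the request over the network
response = client.recv(1024).decode()    # wait for the server's response
print(response)  # -> HELLO SERVER
client.close()
server.close()
```

The two halves run on one machine here only for convenience; binding the server to a real network address instead of 127.0.0.1 gives the distributed arrangement the section describes.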

This client/server interaction is a lot like going to a French restaurant. At the restaurant, you (the user) are presented with a menu of choices by the waiter (the client). After making your selections, the waiter takes note of your choices, translates them into French, and presents them to the French chef (the server) in the kitchen. After the chef prepares your meal, the waiter returns with your dinner (the results). Hopefully, the waiter returns with the items you selected, but not always; sometimes things get "lost in the translation."

System Integration

Is the bringing together of the component subsystems into one system and ensuring that the subsystems function together as a system. In information technology, systems integration is the process of linking together different computing systems and software applications physically or functionally to act as a coordinated whole.

The system integrator brings together discrete systems utilizing a variety of techniques such as computer networking, enterprise application integration, business process management or manual programming.

Enterprise resource planning (ERP)

Systems integrate internal and external management information across an entire organization, embracing finance/accounting, manufacturing, sales and service, customer relationship management, etc. ERP systems automate this activity with an integrated software application. Their purpose is to facilitate the flow of information between all business functions inside the boundaries of the organization and manage the connections to outside stakeholders. ERP systems can run on a variety of computer hardware and network configurations, typically employing a database as a repository for information.

ERP (Enterprise Resource Planning) systems typically include the following characteristics:

  • An integrated system that operates in real time (or next to real time), without relying on periodic updates.

  • A common database, which supports all applications.

  • A consistent look and feel throughout each module.

  • Installation of the system without elaborate application/data integration by the Information Technology (IT) department.
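The "common database, which supports all applications" characteristic can be sketched with two illustrative modules sharing one store. The module names (`sales`, `finance`) and schema are hypothetical, and an in-memory sqlite3 database stands in for the ERP repository:

```python
import sqlite3

# One shared database backs every module, so data entered by one business
# function is immediately visible to the others -- no periodic updates.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE ledger (module TEXT, item TEXT, amount REAL)")

def sales_record_order(item, amount):
    """The (hypothetical) sales module writes into the common database."""
    db.execute("INSERT INTO ledger VALUES ('sales', ?, ?)", (item, amount))

def finance_total_revenue():
    """The (hypothetical) finance module reads the very same records."""
    return db.execute(
        "SELECT SUM(amount) FROM ledger WHERE module = 'sales'"
    ).fetchone()[0]

sales_record_order("t-shirt", 15.0)
sales_record_order("mug", 8.0)
print(finance_total_revenue())  # -> 23.0
```

Because both functions operate on the same `ledger` table, finance sees each sale in real time, which is the integration the bullet list describes.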

An ERP system selection methodology is a formal process for selecting an enterprise resource planning (ERP) system. Existing methodologies include:

  • SpecIT Independent Vendor Selection Management

  • Kuiper's funnel method

  • Dobrin's 3D decision support tool

  • Clarkson Potomac method

Middleware - is computer software that connects software components or applications. The software consists of a set of services that allows multiple processes running on one or more machines to interact. This technology evolved to provide for interoperability in support of the move to coherent distributed architectures, which are most often used to support and simplify complex distributed applications. It includes web servers, application servers, and similar tools that support application development and delivery. Middleware is especially integral to modern information technology based on XML, SOAP, Web services, and service-oriented architecture.

Inter-Organizational system (IOS)

Is one which allows the flow of information to be automated between organizations in order to reach a desired supply-chain management system, which enables the development of competitive organizations. This supports forecasting client needs and the delivery of products and services. IOS helps to better manage buyer-supplier relationships by encompassing the full depth of tasks associated with business processes company-wide. In doing these activities, an organization is able to increase productivity automatically, optimizing communication within all levels of the organization as well as between the organization and the supplier. For example, each t-shirt that is sold in a retail store is automatically communicated to the supplier, who will, in turn, ship more t-shirts to the retailer.
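The t-shirt example can be sketched as an automated message from retailer to supplier with no human step in between. Everything here (class names, the reorder threshold, the shipment quantity) is a made-up illustration of the idea, not a real IOS protocol:

```python
REORDER_THRESHOLD = 5  # illustrative stock level that triggers a reorder

class Supplier:
    def __init__(self):
        self.shipments = []
    def notify_sale(self, item, stock_left):
        # The automated inter-organizational message: the supplier reacts
        # to the retailer's sale without anyone placing a manual order.
        if stock_left < REORDER_THRESHOLD:
            self.shipments.append((item, REORDER_THRESHOLD - stock_left + 5))

class Retailer:
    def __init__(self, supplier, stock):
        self.supplier = supplier
        self.stock = stock
    def sell(self, item):
        self.stock[item] -= 1
        # Each sale is communicated to the supplier automatically.
        self.supplier.notify_sale(item, self.stock[item])

supplier = Supplier()
shop = Retailer(supplier, {"t-shirt": 6})
shop.sell("t-shirt")   # stock falls to 5: no reorder yet
shop.sell("t-shirt")   # stock falls to 4: supplier ships more automatically
print(supplier.shipments)  # -> [('t-shirt', 6)]
```

In a real IOS the `notify_sale` call would cross company boundaries over a network (e.g. EDI or a web service), but the flow of information is the same.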

Organizations might pursue an IOS for the following reasons:

  • Reduce the risk in the organization

  • Pursue economies of scale

  • Benefit from the exchange of technologies

  • Increase competitiveness

  • Overcome investment barriers

  • Encourage global communication

Application server

Can be either a software framework that provides a generalized approach to creating an application-server implementation, without regard to what the application functions are, or the server portion of a specific implementation instance. In either case, the server's function is dedicated to the efficient execution of procedures (programs, routines, scripts) for supporting its applied applications. This section focuses on software frameworks used for the development and deployment of application servers.

The term application server was originally used when discussing early client–server systems to differentiate servers that contain application logic and SQL services, and middleware servers, from other types of data servers.

Java is a programming language originally developed by James Gosling at Sun Microsystems (which has since merged into Oracle Corporation) and released in 1995 as a core component of Sun Microsystems' Java platform. The language derives much of its syntax from C and C++ but has a simpler object model and fewer low-level facilities. Java applications are typically compiled to bytecode (class files) that can run on any Java Virtual Machine (JVM) regardless of computer architecture. Java is a general-purpose, concurrent, class-based, object-oriented language that is specifically designed to have as few implementation dependencies as possible. It is intended to let application developers "write once, run anywhere", meaning that code that runs on one platform does not need to be recompiled to run on another. Java is currently one of the most popular programming languages in use, particularly for client-server web applications, with a reported 10 million users.

There were five primary goals in the creation of the Java language:

  • It should be "simple, object-oriented and familiar"

  • It should be "robust and secure"

  • It should be "architecture-neutral and portable"

  • It should execute with "high performance"

  • It should be "interpreted, threaded, and dynamic"

Web service

Is a method of communication between two electronic devices over the web. The W3C defines a "Web service" as "a software system designed to support interoperable machine-to-machine interaction over a network. It has an interface described in a machine-processable format (specifically Web Services Description Language, known by the acronym WSDL). Other systems interact with the Web service in a manner prescribed by its description using SOAP messages, typically conveyed using HTTP with an XML serialization in conjunction with other Web-related standards."
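The machine-to-machine interaction the W3C definition describes can be sketched with the Python standard library: one process exposes an HTTP endpoint that consumes and produces XML, and another process calls it. Plain XML stands in here for a full SOAP envelope, and the service, element names, and greeting logic are all made up for illustration.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from xml.etree import ElementTree

class EchoService(BaseHTTPRequestHandler):
    # A tiny "web service": accepts an XML request, returns an XML response.
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        name = ElementTree.fromstring(body).findtext("name")
        reply = f"<response><greeting>Hello, {name}</greeting></response>"
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(reply.encode())
    def log_message(self, *args):   # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), EchoService)   # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Another "machine" interacts with the service: XML over HTTP, no human UI.
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/",
    data=b"<request><name>world</name></request>",
    headers={"Content-Type": "text/xml"},
)
with urllib.request.urlopen(req) as resp:
    answer = ElementTree.fromstring(resp.read()).findtext("greeting")
print(answer)  # -> Hello, world
server.shutdown()
```

A real SOAP web service adds an envelope schema and a WSDL file describing the interface, but the shape of the exchange (a machine posting XML and parsing an XML reply) is the same.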
