topical media & game development
object-oriented programming
No approach to software development is likely to survive unless it solves
some of the real problems encountered in software engineering practice.
In this section we will examine how the
object-oriented approach is
related to the conceptions of the life-cycle of software
and what factors may motivate
the adoption of an
object-oriented approach to
software development.
Despite some variations in terminology,
there is a generally agreed-on
conception of the various phases in the development of a software product.
Roughly, a distinction can be made between a phase of
analysis,
which aims at specifying the requirements a product must meet,
a phase of
design, which must result in a conceptual view of the
architecture of the intended system, and a phase of
implementation,
covering coding, testing and, to some extent, also maintenance
activities.
See slide
[1-lifecycle].
No such consensus exists with respect to the exact relation between
these phases.
More specifically, there is a considerable variation in methods and guidelines
describing how to make the transition from one phase to another.
Another important issue is to determine what the products are exactly,
in terms of software and documentation, that must result from each phase.
The software life-cycle
- Analysis -- Conceptual Model, System Requirements
- Design -- System Design, Detailed Design
- Implementation -- Coding, Testing
As the number of software products failing to satisfy user needs has increased,
prototyping has become quite popular!
slide: The software life-cycle
The traditional conception of the software life-cycle is
known as the waterfall model, which prescribes a strictly sequential transition between
the successive phases, possibly in an iterative manner.
Strict regulations with respect to validation of the products
resulting from each phase may be imposed to avoid the risk of backtracking.
Such a rigid approach, however, may cause severe problems,
since it does not easily allow for modifying decisions taken earlier.
One important problem in this respect is that the needs of the users of a system
may change over time, invalidating the requirements laid down in an earlier
phase.
To some extent this problem may be avoided by better techniques
of eliciting the user requirements in the analysis phase,
for instance by developing a prototype.
Unfortunately, the problem of accommodating changing user needs
and adapting to changing circumstances (such as hardware)
seems to be of a more persistent nature,
which provides good reason to look at alternative software development models.
Software development models
The software engineering literature abounds
with descriptions of failing software projects and remedies proposed
to solve the problem of software not meeting user expectations.
User expectations may be succinctly characterized by the RAMP
requirements listed in
slide [1-requirements].
Reliability, adaptability, maintainability
and performance are not unreasonable demands in themselves.
However, opinions on how to satisfy these criteria
clearly diverge.
Requirements -- user needs are constantly evolving
- Reliability -- incremental development, reuse, synthesis
- Adaptability -- evolutionary prototyping
- Maintainability -- incremental development, synthesis
- Performance -- incremental development, reuse
slide: Requirements -- RAMP
[Bersoff91] and [Davis88] explain
how the choice of a particular
software development model may influence the
chances of successfully completing a software project.
As already mentioned, rapid throwaway prototyping
may help to elicit user needs at an early stage,
but does not help much in adapting to
evolving user requirements.
A better solution in this respect is to adopt a
method of evolutionary prototyping.
Depending on the technology used, however,
this may cause severe problems in
maintaining the integrity and robustness
of the system.
Less flexible but more reliable is an approach of
incremental development,
which proceeds by realizing those
parts of a system for which the user requirements
can be clearly specified.
Another means of adapting to changing user
requirements is to use a technique of
automated software synthesis.
However, such an approach works only if
the user requirements can be formalized easily.
This is not always very likely, unless the
application domain is sufficiently restricted.
A similar constraint applies to the reuse of software.
Only in familiar application domains is it
possible to anticipate how user requirements
may change and how to adapt the system appropriately.
Nevertheless, the reuse of software seems
a very promising technique with which to reduce the cost and time
involved in developing software products
without (in principle) sacrificing reliability
and performance.
See slide [1-development].
Software development models
- rapid throwaway prototyping -- quick and dirty
- incremental development -- slowly evolving
- evolutionary prototyping -- evolving requirements
- reusable software -- reduces cost and time
- automated software synthesis -- one level of abstraction higher
slide: Software development models
Two of the early advocates of
object-oriented technology, Cox and Meyer,
regard the reuse of software as the ultimate
solution to the software crisis.
However, the true solution is in my
opinion not so straightforward.
One problem is that tools and technologies
are needed to store and retrieve reusable
components.
That simple solutions do not suffice is illustrated
by an anecdote reported by Alan Kay
telling how difficult it was to find his way
in the Smalltalk class structure after
a significant change, despite the browsing
facilities offered by the Smalltalk system.
Another problem lies in the area of human factors.
The incentives for programmer productivity have too long
been directed at the number of lines of code
to make software reuse attractive.
This attitude is also encouraged in universities.
Moreover, the reuse of other students' work is
usually (not unjustifiably) punished instead of encouraged.
However, even a sufficiently large store of reusable software
will only allow us to build software
meeting the RAMP requirements stated above
if we have arrived at sufficiently
stable abstractions of the application domain.
In the following, we will explore how object-oriented
technology is motivated by problems
occurring in the respective phases of the
software life-cycle and how it contributes
to solving these problems.
Analysis
In academic environments software often seems to grow,
without a clear plan or explicit intention of
fulfilling some need or purpose,
except perhaps as a vehicle for research.
In contrast, industrial and business software
projects are usually undertaken
to meet some explicit goal or to satisfy some need.
One of the main problems in such situations,
from the point of view of the developers of the software,
is to extract the needs from the future users of the system
and later to negotiate the solutions proposed by the team.
The problem is primarily a problem of communication,
of bridging the gap between two worlds,
the world of domain expertise on the one hand
and that of expertise in the craft of software
development on the other.
In a number of publications
(Coad and Yourdon, 1991a; Wirfs-Brock et al., 1990; and Meyer, 1988)
object-oriented analysis has been proposed as
providing a solution to this problem of communication.
According to [CY90], object-oriented techniques
allow us to capture the system requirements in a model
that directly corresponds with a conceptual
model of the problem domain.
See slide [1-analysis].
Object-Oriented Analysis
- analysis = extracting the needs
- The problem domain -- complex reality
- Communication -- with domain experts
- Continual change -- user requirements
- Reuse -- of analysis results
slide: Object-oriented analysis
Another claim made by proponents of OOP
is that an object-oriented approach enables
a more seamless transition between the respective phases
of the software life-cycle.
If this claim holds, this would mean
that changing user requirements could be more easily
discussed in terms of the consequences of these changes
for the system,
and if accepted could in principle be more easily
propagated to the successive phases of development.
One of the basic ideas underlying object-oriented
analysis is that the abstractions arrived at
in developing a conceptual model of the problem domain
will remain stable over time.
Hence, rather than focusing on specific functional
requirements, attention should be given to
modeling the problem domain by means of high level abstractions.
Due to the stability of these abstractions, the results
of analysis are likely candidates for reuse.
The reality to be modeled in analysis is usually very complex.
[CY90] mention a number of
principles or mechanisms with which to manage complexity.
These show a great similarity to the abstraction
mechanisms mentioned earlier.
Personally, I do not feel entirely comfortable
with the characterization of the analysis phase
given by [CY90],
since to my mind
user needs and system requirements
are perhaps more conveniently phrased
in terms of functionality and constraints
than in terms of a model that may
simultaneously act as an architectural sketch of
the system that is to be developed.
However, I do agree with [CY90], and others,
that the products of analysis,
that is the documents describing user needs
and system requirements,
should as far as possible provide a conceptual model
of the domain to which these needs and
requirements are related.
Actually, I do consider the blurring of the distinction
between analysis and design,
and as we will see later, between design and implementation,
to be one of the attractive features of an object-oriented
approach.
Analysis methods
The phases of analysis and design
differ primarily in orientation:
during analysis the focus is on aspects of the problem
domain and the goal is to arrive at a description
of that domain to which the user and system requirements
can be related.
On the other hand, the design phase must result
in an architectural model of the system,
for which we can demonstrate that it fulfills the user
needs and the additional requirements expressed
as the result of analysis.
Analysis methods
- Functional Decomposition = Functions + Interfaces
- Data Flow Approach = Data Flow + Bubbles
- Information Modeling = Entities + Attributes + Relationships
- Object-Oriented = Objects + Inheritance + Message passing
slide: Analysis methods
[CY90] discuss a number of methods
that are commonly used in analysis
(see slide [1-methods]).
The choice of a particular method will often depend
upon circumstances of a more sociological nature.
For instance, the experience of a team
with a particular method is often a crucial
factor for success.
For this reason, perhaps, an eclectic method
combining the various approaches may
be preferable
(see, for instance, Rumbaugh et al., 1991).
However, it is doubtful whether such an approach
will have the same benefits as a purely
object-oriented approach.
See also section [methods].
I will briefly characterize the various methods
mentioned by [CY90]. For a more extensive
description and evaluation the reader is referred
to, for example, [Jones90].
The method of Functional Decomposition
aims at characterizing the steps that must be taken to
reach a particular goal.
These steps may be represented by functions
that may take arguments in order to deal
with data that is shared between the
successive steps of the computation.
In general, one can say that this method
offers little support for data hiding.
Another problem is that non-expert users
may not be familiar with viewing their problem
in terms of computation steps.
Also, the method does not result in descriptions
that are easily amenable to change.
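To make the flavor of this method a bit more concrete, the fragment below gives a minimal sketch in Java (the OrderProcessing example and its steps are invented for illustration, not taken from any of the cited sources): each step of the computation is a function, and the data shared between the steps is passed around openly, which illustrates why data hiding is hard to enforce in this style.

  import java.util.ArrayList;
  import java.util.List;

  class OrderProcessing {

      // Step 1: produce the shared data -- a plain list of prices,
      // visible and modifiable by every subsequent step.
      static List<Double> readPrices() {
          List<Double> prices = new ArrayList<>();
          prices.add(12.50);
          prices.add(7.95);
          return prices;
      }

      // Step 2: compute the total of the shared data.
      static double computeTotal(List<Double> prices) {
          double total = 0.0;
          for (double p : prices) total += p;
          return total;
      }

      // Step 3: report the result.
      static void report(double total) {
          System.out.println("Total: " + total);
      }

      public static void main(String[] args) {
          List<Double> prices = readPrices();  // nothing hides the data behind an interface
          report(computeTotal(prices));
      }
  }
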
The method indicated as the Data Flow Approach
aims at depicting the information flow
in a particular domain by means of arrows that
represent data and bubbles that represent
processes acting on these data.
Information Modeling
is a method that has become popular primarily
for developing information systems and applications
involving databases.
As a method, it aims at modeling the application domain
in terms of entities, which may have attributes,
and relations between entities.
An {\em object-oriented} approach to analysis
is very similar in nature to the information modeling
approach, at least with respect to its aim
of developing a conceptual model of the application domain.
However, in terms of their means, both methods differ
significantly.
The most important distinction between objects,
in the sense of OOP,
and entities, as used in information modeling,
to my mind lies in the capacity of objects
to embody actual behavior, whereas entities
are of a more passive nature.
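The contrast may be illustrated by a small, hypothetical Java fragment (the Account example is mine, chosen only for illustration): the entity is a passive record of attributes, whereas the object protects its state and embodies the behavior itself.

  // Information-modeling style: an entity is a passive record of attributes;
  // any behavior lives elsewhere, in the code that manipulates the entity.
  class AccountRecord {
      int id;
      double balance;
  }

  // Object-oriented style: the object embodies the behavior itself and
  // protects its state behind an interface.
  class Account {
      private final int id;
      private double balance;

      Account(int id, double initialBalance) {
          this.id = id;
          this.balance = initialBalance;
      }

      void deposit(double amount) {
          if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
          balance += amount;
      }

      double balance() { return balance; }
  }
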
Concluding this brief exploration of the analysis phase,
I think we may safely set as the goal
for every method of analysis
to arrive at stable abstractions,
that is, a conceptual model that is robust
with respect to evolving user requirements.
Also, we may state a preference for methods
which result in models that have a close
correspondence to the concepts and notions
used by the experts operating in the application
domain.
With respect to notation, UML (the Unified Modeling Language,
see Appendix [UML]) is the obvious choice.
How to apply UML in the various phases of object-oriented software construction
is an altogether different matter.
Design
In an object-oriented approach,
the distinction between analysis
and design
is primarily one of emphasis;
emphasis on modeling the reality of
the problem domain versus
emphasis on providing an architectural
model of a system that lends itself
to implementation.
One of the attractive features of such
an approach is the possibility of a seamless
transition between the respective phases
in the development of the software product.
The classical waterfall model can no
longer be considered as appropriate
for such an approach.
An alternative model, the
fountain model, is proposed by [Hend92].
This model allows for a more autonomous
development of software components,
within the constraints of a unifying framework.
The end goal of such a development process
may be viewed as a repository of reusable components.
A similar viewpoint was originally
proposed by [Cox86] and [Meyer88].
Object-Oriented Design
- design for maintenance and reuse!
Software quality
- correctness, robustness, extensibility, compatibility
Design projects
slide: Object-oriented design
In examining the primary goals of design,
[Meyer88] distinguishes between reusability,
quality and ease of maintenance.
Naturally, reusable software presupposes quality,
hence both quality and maintainability
are important design goals.
See slide [1-design].
In [Meyer88] a rough estimate is given of the shift in effort
between the phases of the software life-cycle,
brought about by an object-oriented approach.
Essentially, these figures show an increase
in the effort needed for design.
This is an immediate consequence
of the observation that the development
of reusable code is intrinsically more difficult.
To my mind, there is yet another reason for the extra
effort involved in design.
In practice it appears to be difficult and
time-consuming to arrive at the appropriate
abstract data types for a given application.
The implementation of these structures,
on the other hand, is usually straightforward.
This is another indication that the unit of
reuse should perhaps not be small pieces of code, but
rather (the design of) components that fit into
a larger framework.
From the perspective of software quality and maintenance,
the mechanisms of encapsulation
and inheritance may be characterized as powerful means
to control the complexity of the code needed to realize a system.
In [Meyer88] it is estimated that maintenance accounts for
70% of the actual cost of software.
Moreover, adaptive maintenance,
which is the adaptation to changing requirements,
accounts for a disproportionately large part
of the cost.
Of primary importance for maintenance,
in the sense of the correction of errors,
is the principle of locality
supported by encapsulation, data abstraction and hiding.
In contrast, inheritance is a feature that may
interfere with maintenance, since it often breaks
down the protection offered by encapsulation.
However, to cope with changing requirements,
inheritance provides both a convenient and relatively
safe mechanism.
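The following hypothetical Java fragment (the Counter classes are invented for illustration) sketches how inheritance may weaken the protection offered by encapsulation: the subclass depends on an internal detail of its parent, so a change that is local to the parent's implementation may silently break the child, even though ordinary clients of the public interface are unaffected.

  class Counter {
      protected int count = 0;          // exposed to subclasses, not to ordinary clients

      public void increment() { count++; }
      public int value() { return count; }
  }

  class LoggingCounter extends Counter {
      @Override
      public void increment() {
          count++;                      // reaches into the parent's representation
          System.out.println("count is now " + count);
      }
      // If Counter's author later replaces the int field by, say, a persistent store,
      // this subclass breaks, even though clients of Counter's public interface do not.
  }
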
Design assignments
Actually designing systems is a complex activity,
about which a lot can be said.
Nevertheless, to get a good feeling for what is involved
in designing a system it is best to gain some experience first.
In the remainder of this subsection, you will find the
descriptions of actual software engineering assignments.
The assignments have been given, in subsequent years, to groups consisting
of four or five CS2 students.
The groups had to accomplish the assignments in five weeks,
a total of 1000 man-hours.
That includes formulating the requirements, writing the design specification
and coding the implementation.
(For the first of the assignments, IDA, C++ was used with the
hush GUI library.
For the second, MASS, Java with Swing was used.)
In both cases we allowed for an iterative development cycle,
inspired by a Rapid Application Development (RAD) approach.
These assignments will be taken as a running example,
in the sense that most examples presented in the book
solve in one way or another the problems that may occur
when realizing the systems described in the assignments.
IDA
An Interior Design Assistant (IDA)
is a tool to support an interior design architect.
When designing the interior of a house or building,
the architect proceeds from the spatial layout
and a list of furniture items.
IDA must allow for
placing furniture in a room.
It will check for constraints; for example,
placing a chair upon a table will be prohibited.
For each design, IDA must be able to
give information with respect to pricing
and the time it takes to have the furniture
items delivered.
In addition to the design facilities, IDA must also
offer a showroom mode, in which
the various designs can be inspected and compared
with respect to price and delivery time.
slide: IDA
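As a first impression of what a design for IDA might look like, the sketch below gives one possible starting point, written in Java for uniformity with the other fragments in this section (the actual assignment used C++ with hush); all class and method names are hypothetical, and only the constraint checking, pricing and delivery-time facilities mentioned above are covered.

  import java.util.ArrayList;
  import java.util.List;

  class FurnitureItem {
      final String name;
      final double price;
      final int deliveryDays;
      FurnitureItem under;              // the item this one is placed upon, if any

      FurnitureItem(String name, double price, int deliveryDays) {
          this.name = name;
          this.price = price;
          this.deliveryDays = deliveryDays;
      }
  }

  class Room {
      private final List<FurnitureItem> items = new ArrayList<>();

      // Constraint check: for example, a chair may not be placed upon a table.
      boolean place(FurnitureItem item, FurnitureItem onTopOf) {
          if (onTopOf != null && !allowedUpon(item, onTopOf)) return false;
          item.under = onTopOf;
          items.add(item);
          return true;
      }

      private boolean allowedUpon(FurnitureItem item, FurnitureItem support) {
          return !(item.name.equals("chair") && support.name.equals("table"));
      }

      double totalPrice() {
          return items.stream().mapToDouble(i -> i.price).sum();
      }

      int deliveryTime() {              // the design is ready when the last item arrives
          return items.stream().mapToInt(i -> i.deliveryDays).max().orElse(0);
      }
  }
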
MASS
An Agenda Support System assists the user in maintaining a record of important events, dates and appointments. It
moreover offers the user various ways of inspecting his or her agenda, by giving an overview of important dates, an
indication of important dates on a calendar, and (more advanced) timely notification.
A Multi-user Agenda Support System extends a simple Agenda Support System by providing facilities for scheduling a
meeting, taking into account various constraints imposed by the agendas of the participants, as for example a special
event for which a participant already has an entry in his or her agenda.
A minimal Multi-user Agenda Support System must provide facilities for registering important dates for an arbitrary
number of users. It must, moreover, be able to give an overview of important dates for any individual user, and it must be
possible to schedule a meeting between an arbitrary subset of users that satisfies the time-constraints for each individual
in that particular group.
This minimal specification may be extended with input facilities, gadgets for presenting overviews and the possibility of
adding additional constraints. Nevertheless, as a piece of advice, when developing a Multi-user Agenda Support System, follow
the KISS principle: Keep It Simple ...
slide: MASS
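Again as a first impression only, the following Java sketch shows the core of the scheduling facility; the hour-granularity representation of an agenda and all names are assumptions made for illustration, not part of the assignment.

  import java.util.*;

  class Agenda {
      private final Set<Integer> busyHours = new HashSet<>();  // hours 0..23 already taken

      void addAppointment(int hour) { busyHours.add(hour); }
      boolean isFree(int hour) { return !busyHours.contains(hour); }
  }

  class Scheduler {
      // Returns the first hour of the day at which every participant is free,
      // or -1 if the time constraints cannot be satisfied.
      static int scheduleMeeting(List<Agenda> participants) {
          for (int hour = 9; hour < 17; hour++) {               // office hours only
              final int h = hour;
              if (participants.stream().allMatch(a -> a.isFree(h))) return hour;
          }
          return -1;
      }

      public static void main(String[] args) {
          Agenda alice = new Agenda();
          alice.addAppointment(9);
          Agenda bob = new Agenda();
          bob.addAppointment(10);
          System.out.println("meeting at " + scheduleMeeting(List.of(alice, bob)));
          // prints: meeting at 11, the first hour at which both are free
      }
  }
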
Implementation
In principle, the phase of implementation
follows on from the design phase.
In practice, however,
the products of design may often
only be regarded as
providing a post hoc justification
of the actual system.
As noted, for instance, in [HOB87],
an object-oriented approach may blur
the distinction between design and implementation,
even to the extent of reversing their actual order.
The most important distinction between design
and implementation is hence the level of abstraction
at which the structure of the system is described.
Design is meant to clarify the conceptual
structure of a system,
whereas the implementation must include all
the details needed for the system to run.
Whatever approach is followed, in the end
the design must serve both as a justification
and clarification of the actual implementation.
Design is of particular importance in projects
that require long-term maintenance.
Correcting errors or adapting the functionality of
the system on the basis of code alone
is not likely to succeed.
What may help, though, are tools that extract
explanatory information from the code.
Testing and maintenance
Errors may (and will) occur during the implementation
as well as later when the system is in operation.
Apart from the correction of errors,
other maintenance activities may be required,
as we have seen previously.
In [Knuth92], an amusing account is given of the errors
Knuth detected in the TeX program over a period of time.
These errors range from trivial typos to
errors on an algorithmic level.
See slide [1-errors].
Errors, bugs
TeX
- [A] -- algorithm awry
- [B] -- blunder
- [D] -- data structure debacle
- [F] -- forgotten function
- [L] -- language liability
- [M] -- mismatch between modules
- [R] -- reinforcement of robustness
- [S] -- surprises
- [T] -- a trivial typo
slide: TeX errors and bugs
An interesting and important question is to what extent
an object-oriented approach, and more
specifically an object-oriented implementation
language, is of help
in avoiding and correcting such errors.
The reader is encouraged to make a first guess,
and to verify that guess later.
As an interesting aside, the TeX system has
been implemented in a language system called Web.
The Web system allows one to merge code and explanatory text
in a single document, and to process that document
as either code or text.
In itself, this has nothing to do with object orientation,
but the technique of documentation supported
by the Web system is also suitable
for object-oriented programs.
We may note that the javadoc tool
realizes, for Java, some of the goals set for the Web system.
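As a small illustration of the idea, the (made-up) class below carries its documentation in javadoc comments; running the javadoc tool on the source file generates HTML documentation from these comments, much as the Web system produces a typeset description from annotated source.

  /**
   * A counter that can only go up; the class exists merely to illustrate how
   * javadoc keeps explanatory text next to the code it documents.
   */
  public class UpCounter {

      private int count = 0;

      /**
       * Increments the counter by one.
       *
       * @return the new value of the counter
       */
      public int increment() {
          return ++count;
      }
  }
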
Object-oriented language support
Operationally, encapsulation
and inheritance are considered to be the basic mechanisms
underlying the object-oriented approach.
These mechanisms have been realized in a number
of languages.
(See slide [1-languages].
See also chapter 5 for a more complete overview.)
Historically, Smalltalk is often considered to be the
most important object-oriented language.
It has served as an implementation vehicle
for a variety of applications (see, for instance, Pope, 1991).
No doubt, Smalltalk has contributed greatly to the
initial popularity
of the object-oriented approach,
yet its role is being taken over by C++ and Java, which jointly
have the largest community of users.
Smalltalk is a purely object-oriented language,
which means that every entity, including
integers, expressions and classes, is
regarded as an object.
The popularity of the Smalltalk language may be attributed
partly to the Smalltalk environment,
which allows the user to inspect the properties of all
the objects in the system
and which, moreover, contains a large
collection of reusable classes.
Together with the environment, Smalltalk
provides excellent support for fast prototyping.
The language Eiffel, described by [Meyer88],
may also be considered a pure object-oriented language,
pure in the sense that it provides classes
and inheritance as the main devices with which to
structure a program.
The major contribution of Eiffel is
its support for correctness constructs.
These include the possibility of specifying
pre- and post-conditions for methods, as well
as a class invariant, which may
be checked before and after each method invocation.
The Eiffel system comes with a number of libraries,
including libraries for graphics and window support,
and a collection of tools for browsing and the extraction
of documentation.
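Eiffel expresses such contracts directly in the language, by means of require, ensure and invariant clauses. To keep the fragments in this section in a single language, the sketch below merely approximates the idea in Java with assert statements (the Wallet example is invented, and assertions must be enabled with the -ea flag); it illustrates the principle, not Eiffel's actual notation.

  // Run with `java -ea` so that the assertions are actually checked.
  public class Wallet {

      private int cents;                // class invariant: cents >= 0

      private boolean invariant() { return cents >= 0; }

      public void deposit(int amount) {
          assert amount > 0 : "precondition: amount must be positive";
          int old = cents;
          cents += amount;
          assert cents == old + amount : "postcondition: balance increased by amount";
          assert invariant() : "class invariant violated";
      }

      public void withdraw(int amount) {
          assert amount > 0 && amount <= cents : "precondition violated";
          cents -= amount;
          assert invariant() : "class invariant violated";
      }
  }
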
The C++ language (Stroustrup, 1991) has a somewhat different history.
It was originally developed as an extension of
C with classes.
A primary design goal of C++ has been to develop
a powerful but efficient language.
In contrast to Smalltalk and Eiffel,
C++ is not a pure object-oriented language;
it is a hybrid language in the sense that it allows us
to use C-style functions as well as object-oriented
constructs involving classes and inheritance.
- Smalltalk -- a radical change in programming
- Eiffel -- a language with assertions
- C++ -- much more than a better C
  - the benefits of efficiency
- Java -- the dial-tone of the Internet
- DLP -- introduces logic into object orientation
  - development of knowledge-based systems
slide: Object-oriented languages
The newest, and perhaps most important, object-oriented
language around is Java,
which owes its popularity partly to its tight connection
with the Internet.
Java comes with a virtual machine that allows for running
Java programs (applets) in a browser,
in a so-called sandbox, which protects the user
from possibly malicious programs.
As the final language in this brief overview,
I wish to mention the distributed logic programming
language DLP (see Eliëns, 1992).
The DLP language combines logic programming
with object-oriented features and parallelism.
I mention it partly because the development
of this language was my first involvement with
OOP, and also because it demonstrates
that other paradigms of programming,
in particular logic programming,
may be fruitfully combined with OOP.
The language DLP provides a high level vehicle
for modeling knowledge-based systems
in an object-oriented way.
A more extensive introduction to the
Smalltalk, Eiffel, C++, Java and DLP languages
is given in the appendix.