Saturday, July 30, 2011

Overspecification

One of the key concepts in software engineering is the need to avoid overspecification. It is natural that, at some point in the decomposition of the specification for a system, the specifier resorts to a description of how to do it instead of a statement of what must be done. When you drill down into this, the problem is the difference between denotational semantics and axiomatic semantics. In the first, the problem to be decomposed is given in what is hopefully the most abstract algorithmic form possible by the specifier. The problem is that this assumes there is only one algorithm possible for the solution, and it locks the implementer into that algorithm.

The alternative is the axiomatic semantics of stating the pre- and post-conditions, as well as any invariants, in the required solution. This at once gives the implementer the full range of algorithm and implementation choices in the solution set. But at the same time it gives the implementer no direction as to how the solution can be achieved. Traditionally, the axiomatic method of specification has not been used in commercial work, largely because of the difficulty of making these statements about the required implementation. It is seen in formal methods, but the difficulty of applying formal methods is well known.
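The contrast can be sketched in a few lines of Python, using plain assertions as the pre- and post-conditions. This is a minimal illustration, not any formal-methods tool; the function and its checks are my own invention.

```python
from collections import Counter

def sort_items(xs):
    """Axiomatic specification of sorting: the 'what', not the 'how'."""
    # Pre-condition: the input is a list (the empty list is allowed).
    assert isinstance(xs, list)

    result = sorted(xs)  # the implementer may substitute ANY algorithm here

    # Post-condition: the output is in ascending order.
    assert all(a <= b for a, b in zip(result, result[1:]))
    # Invariant: the output is a permutation of the input (same multiset).
    assert Counter(result) == Counter(xs)
    return result
```

Nothing in the conditions names quicksort, mergesort, or any other algorithm; any implementation that satisfies them is acceptable, which is exactly the freedom the axiomatic style grants.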


Friday, July 29, 2011

A worked case study

For yucks, I am going to work through a trivial design alternative in two directions to see how this progresses. The problem to be solved is a simple lookup where the user enters a term and the system responds with a definition or some other static text associated with the term. There are two alternative architectures to be explored. The first is a standard PC application, where everything is loaded onto a personal computer. The second is a web application using a client-server pattern. I am interested in seeing how the progressive elaboration of the client-server pattern evolves as I drill down to the lower levels of design. I will be taking this much further than a typical architecture design, since after specifying the client-server pattern an architecture design would probably only go one level further down unless there were a quality sought that could not be assured at that level.

So the first cut is to recognize the client-server pattern. There is a client module, a server module, and a relationship, which is the internet. The next cut will be to provide a specification for each of those pieces.
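That first cut can be sketched directly in code. This is a deliberately naive Python sketch of the three pieces; the names are illustrative, and a plain function call stands in for the internet.

```python
# Server module: owns the static definitions.
DEFINITIONS = {
    "pattern": "A reusable solution to a recurring design problem.",
}

def server_lookup(term):
    """Resolve a term to its associated static text."""
    return DEFINITIONS.get(term, "No definition found.")

# Relationship: a stand-in for the internet between the two modules.
def transport(request):
    return server_lookup(request)

# Client module: accepts the user's term and presents the response.
def client(term):
    return transport(term.strip().lower())
```

Each piece is now a named unit that the next cut can elaborate independently, for example replacing `transport` with real HTTP without touching the client or server logic.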

Thursday, July 28, 2011

flows

[Bass2003] calls out that views for data flows and flows of control are projections of the views already discussed. I think this is an important point to make in lecture. Many informal architectural diagrams are data flow or control flow diagrams.

a linguistic approach to software architecture design

Discussions of software architecture depend upon the meaning of the words and symbols used by the people in the discussion. Like a human language, the tokens can be densely packed patterns or styles or they can be low-level primitives to show how a design can be refined. It seems to me that the same cognitive mechanisms are at work in both.

When I talk about a client-server system I conjure up an abstract concept, just as if I had said chair. We recognize both by a set of characteristics that define the concept. Yet to be helpful, both must be modified to assist with design. I may speak of a dining room chair or a lounge chair. I may talk about a web server and client, or about a pair on a proprietary network that uses a protocol other than HTTP/HTML. The concept lends itself to restriction and extension. In language we usually do this with adjectives. As a class concept, this would be done with sub-classes. I saw some references to work like this that I must check out.
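The adjective-as-sub-class idea can be shown in a few lines of Python. This is a toy illustration; the class names and the `protocol` attribute are mine, not drawn from any real framework.

```python
class Server:
    """The abstract concept, recognized by its defining characteristics."""
    protocol = None  # deliberately unspecified at this level of abstraction

class WebServer(Server):
    """Restriction of the concept, like the adjective in 'web server'."""
    protocol = "HTTP"

class ProprietaryServer(Server):
    """Extension onto a proprietary network with its own protocol."""
    protocol = "CUSTOM"
```

Every `WebServer` is still a `Server`, just as every dining room chair is still a chair; the sub-class narrows the concept without abandoning it.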

This is also highly consistent with the Lakoff/Johnson way of looking at these patterns as metaphors. We invoke the metaphor and then call out the ways in which they are extended into concrete forms or in which we define a new form by substituting one thing for another.

Problem Space and Design Space

In the design literature there are discussions of the problem space and the design space. I assume that a grad student at CSUS in software engineering or computer science will be new to these terms. Since the concepts bring forward several important aspects of the process of design they are worth discussion.

The problem space is the area of problem statement and analysis. After all, if this is to be a solution, there must have been a problem (or opportunity) that justified its creation. At least for me, I like to imagine this problem space as a line with inflections that represent the various needs, constraints and desires of the problem to be solved. During analysis, the line will not be distinct, but rather fuzzy and broad, with a great deal of imprecision regarding what is really needed. Ideally this line (or surface, if you want to think of it as two-dimensional) is completely drawn to a specific level of detail before design begins, but that is often not so.

What the designer attempts to do is create the ideal product: one whose capabilities are complementary to the requirements of the problem space. The fitness for purpose will be reflected in the gaps between the problem space and the solution space.

Architecture Tools

The design of a viable software architecture for a system depends upon its fitness for use. Unless and until we have a model which can perform a quantitative evaluation of a given architecture against its stated use, the evaluation of a proposed software architecture must depend upon human processes that are not algorithmic and are inherently non-deterministic. What I want to talk about is the schism, among the tools used in the design and evaluation of a software architecture, between those which have some hope of machine implementation and those which are far less likely, or impossible, to ever fully capture in any program.

Software engineering academics have been exploring software metrics for several decades and have come up with various measures that seem to show promise for building a predictive theory of software. Measures of cohesion and coupling, as well as cyclomatic complexity, are examples. While these were originally envisioned for code-level analysis, they have been used at higher levels of abstraction with some success. This is a well-recognized area of research and one which will continue to develop. It must be within the scope of the education of a well-educated software architect.
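As a taste of how mechanical these measures are, here is a rough approximation of McCabe's cyclomatic complexity in Python: one plus the number of branch points found by walking the parsed source. The set of node types counted is my simplification, not the full McCabe definition.

```python
import ast

def cyclomatic_complexity(source):
    """Approximate cyclomatic complexity: 1 + number of branch points."""
    branch_nodes = (ast.If, ast.IfExp, ast.For, ast.While,
                    ast.ExceptHandler, ast.BoolOp)
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, branch_nodes) for node in ast.walk(tree))
```

Straight-line code scores 1; each `if`, loop, exception handler, or boolean operator adds one. The point is that such a measure is fully algorithmic, in contrast to the human evaluation processes just discussed.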

But in contrast to these quantitative metrics, there is a great deal of attention paid to the methodologies that are to be used in the system's development life cycle as they relate to the creation of a software architecture. While there is a good case to be made that the study of these methodologies is more properly in the scope of management information science, I believe it is a mistake to completely separate the methodologies from the more quantitative aspects of software engineering. A multi-disciplinary approach is required since there must not be a bright line between software architecture and the project roles through which a software architect may rise. The development of a talented software architect must balance the "harder" engineering knowledge with the "softer" management knowledge to achieve the synergism that will result in the most capable worker.

Most software engineering education focuses on the lower-level aspects of design, specifically at the level of the object or module. This is necessary, since without an understanding of the very basic concepts of data structures, control structures, formal syntax and object-oriented methods, a true understanding of a software-oriented system is not possible. An understanding of the inherent limits of the tools of software engineering is needed, just as strength of materials is needed for a civil engineer or basic chemistry is needed for a chemical engineer. These are the hard stops that can be encountered, and the engineer must understand them if successful designs are to be created.

However, as the size and complexity of software systems grow, the layers of abstraction must also grow if the resulting system is to remain comprehensible. The literature is rife with case studies of systems that were created by individuals over an extended period of time, whose design is really known only by the creator and exists in no communicable artifact anywhere. With luck, there is some underling who is prepared to fill the role of the designer if that person leaves the organization. This illustrates several different ways in which software architecture begins to separate from the lower levels of design.

First, note that if you envision a system small enough to be the work of a single person, that suggests the person single-handedly designed and implemented the system. There was no separation of roles between the designer and the builder. It should be self-evident that this organizational structure limits the size and complexity of the system to be built and the time frames in which it can be built. Some exceptional people have built sophisticated systems in short periods of time, but for the purpose of an engineering discipline we ignore these outliers, since such feats tend to be difficult to repeat with any dependability. This is outside the realm of what we aspire to in the day-to-day world of engineering. We all hope that we will rise to these ranks, but it should never be the expectation that a sole person create something so extraordinary.

Once the problem becomes so large that it is not reasonable for a single individual to create the system that is required, it is natural that separation of concerns and focus on skill sets begin to create distinct roles, and that these roles are brought together into a team environment. Given the complexity of modern programming languages and the tools needed to use them, the role of the coder was long ago established. This often provides entry-level positions for the software engineer, since the specification for a module can be highly structured so as to leave relatively little latitude for the coder, while still allowing him to learn the business architecture of his client and the existing architecture of the system under construction (or maintenance).

Defining the role of the coder immediately creates a new role: that of the person who writes the specification for the module to be coded. The practice once was that this responsibility fell to a business analyst. It became their responsibility to gather requirements and specify the modules that needed to be created.

Since there would often be many coders, and often different levels of coders depending upon their capabilities, the project team would be sufficiently large that it required the talent of a team lead who would report to the client on management matters such as schedule and budget.

In many ways this naive team structure hasn't existed in exactly this form for a generation or more. It has been found wanting in the same way that the waterfall methodologies created in the formative stages of software engineering were inadequate to explain what was actually done, as opposed to serving as a helpful intellectual model of the supposed ideal. But before we explore the ways in which this was left behind, let's continue by looking at the legacy that these early methodologies left.

Much systems thought probably comes from work in the military-industrial complex. The large, mission-critical projects taken on by these large organizations inherited from the military a culture of taking a very large effort and deconstructing it into a set of smaller tasks with the needed oversight and control. Whatever else is said about the waterfall model, it has been the reference for how a large work team should be organized. I'll assume for now that you understand that model and plow on.

As befits their origin, systems development methodologies reflected a mechanistic attitude toward the creation of a software system, with all the inputs, processes and outputs neatly laid out in a graph reflecting the predecessor tasks and artifacts of each process and specifying their outputs. If the inputs and processes are correct, the outputs will be sufficient for the successor tasks. The entire process would flow as smoothly as a well-oiled machine.

The most basic assumption of a strictly enforced waterfall project plan is that everything that is needed to make important decisions is known at the conclusion of the requirements phase of the project. Details may need to be worked out but the ability to see the structure to be built is sufficient to allow for prediction of the cost and effort.

This assumption has been more wrong than right in practice. Successful projects seem to be the product of shrewd negotiators with enough experience to argue for sufficient resources in the absence of hard data to support them, to secure that funding from business managers, and to manage the project by limiting the scope to the money and time available, not to some theoretical document that adequately articulates all of the requirements for the product to be built.

This cynical discussion is tangential to the main point I will make, but important to provide the context in which software engineering takes place in the real world. To ignore the real world and embrace some model of perfectly logical business managers is about as realistic as an architect designing a building in which the wind will never exceed 10 mph; it is fantasy or art, not engineering. In engineering you substitute reality for desire and accept the limits, whether they come from logic or from the social sciences.

The second big fallacy in the well-developed waterfall methodologies of the past is that they assume the future is like the past: that the system to be created is sufficiently like other systems that all the tasks and artifacts can be predicted. Many failures are attributable to errors in the requirements gathering phase, and those errors are expensive to fix, if they can be fixed at all within the time and budget constraints. A sufficiently experienced team may know the needs of the client better than the client. In those cases, the project can be steered toward success even when the requirements gathering phase has technically been incomplete, inconsistent or incorrect. In many captive development shops, this has been the state for many years. The success may be attributed to the methodology, but in reality the success is because of the staff, not the tool.

This suggests another reason why the waterfall methodology is flawed. Business managers who must make hire and fire decisions must provide the workers to staff the project. Yet their ability to assess the capabilities of the untested workers is limited. Hence, just because someone fills a particular role on a project team is no guarantee they will do their job well. Some of this can be addressed by a good quality assurance program but often the organizations with many new workers, business managers with little experience staffing projects and a project with tight time and money constraints are the same organizations that have poor quality assurance programs. Again, the historical roots in the military-industrial complex are not carried over into a commercial environment since the imperative of the mission critical project means something wholly different in a military context than it does in most commerce.

There have been two very different reactions to the failure of these waterfall methodologies. One reaction is to impose a quality assurance program. By its nature, a quality assurance program requires artifacts against which verification and validation can be performed. For very large projects, these artifacts are complex and expensive to produce. Yet without them no QA can be performed. The somewhat logical response of management to project failures was to strengthen the quality processes, creating greater emphasis on the artifacts, or adding to the artifacts that must be created, in an attempt to detect project problems earlier and mount corrective action. It must be clear that this can become a self-reinforcing feedback loop. After a few cycles the systems development life cycle becomes a bureaucratic morass of paperwork to be filled out and documents to be created, which go to committees for review and approval before work can officially progress.

The frustration software engineers felt when caught in this kind of environment led to the Agile Manifesto, a cri de coeur from the software engineers who saw the folly of this progression. Here is one version:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

Here is how I interpret this declaration and what it opposes. The opposition to processes (i.e. methodologies) is explicit. A methodology is something that exists to promote the proper interactions of the individuals on the team. But it had become a straitjacket, forcing developers to ignore their instincts and shutting down debate rather than supporting it. This is particularly attractive to the Gen X crowd and the changing realities of software engineering decisions. The culture of top-down management and unidirectional communication no longer made sense in such a complex development environment. People needed to work more cooperatively, to exert greater thought, and to depend less on "tools", whether programming languages (which can lead to academic discussions among well-educated software engineers that are beside the point), rubrics (which were probably best adapted to technology from a prior generation at best), or project management systems (which sought to measure and categorize every hour expended on the development). There was an intuitive understanding that people needed to talk to each other and to develop a sense of shared commitment for success.

The second point of the manifesto was a reaction to the exceedingly long lag times between project initiation and the delivery of a working product. Even in the best of circumstances, the business reality can fundamentally change in that period of time. Consumers, commerce and government wanted to be more nimble and to be able to respond to changes more quickly. While human processes can be changed relatively quickly, automated processes were proving very difficult to change. Once created, software was effectively not modifiable. Even while the project was ongoing, responding to a change request was often contentious and difficult to assimilate without impact to the budget and schedule.

What the Agile Manifesto implicitly required was a form of iterative development where the delivery of working software was accelerated, even when the product was not necessarily something that completely solved the problem. One of the stated reasons is that since requirements gathering and documentation was so fallible, why do it at all? Why not create a prototype that would demonstrate what the developers believed was needed from conversations with the client and then demonstrate it? Clients respond more favorably, and more constructively, to a working prototype than they do to an abstract document that they are never sure they completely understand. The cycle of development can become much shorter, ideally measured in weeks, and the ultimate product developed by continually iterating, getting closer to the final product with each iteration.

The call for customer collaboration was a reaction to the inherent animosity that the unbridled waterfall methodology created between the client and the development organization. The model envisioned that the customer could collaborate on the creation of a requirements document that would act as a contract for development. Either explicitly or implicitly the client was asked to sign off on the requirements document. Inevitably developers would attempt to design the system against this document. Misunderstandings or errors of incompleteness, inaccuracy or inconsistency would eventually be found and the impact to the budget and schedule would lead to acrimonious discussions between the development organization and the client regarding the interpretation of the requirements document.

Agile wanted to sidestep this unhelpful dynamic by stressing the continuing role of the client throughout the development process. If the client could not, or would not, commit the proper resources to answer questions as they arose, instead of depending upon a requirements document that was never complete enough, then that failure would become a major indicator that the project was already in trouble long before code was created. A sense of shared commitment and responsibility has always been needed for successful projects. The Manifesto reminded everyone of it.

Systems developed under the waterfall methodologies were often brittle and unmodifiable. Attempts to create systems that were modifiable often led to very complex designs, as the developers attempted to make as much as possible easy to change. Inevitably these attempts failed: the complex design was difficult to deliver at an acceptable price, and there always seemed to be one more inflection point in the design that had not been handled.

What the Agile Manifesto stressed was the inevitability of change and the need of everyone involved in the development effort to remain flexible, expect that change will happen and respond to it with a client-centered acceptance instead of a reactionary and defensive posture.

At this point the clash between at least what SEI espouses for a development methodology and the Agile Manifesto is brought into the classroom. Students now are well indoctrinated into the Agile Manifesto and the need for iterative design. However, the creation of an architecture for a large software product requires a fair amount of Big Analysis Up Front (BAUF) in order to make the kinds of decisions that are needed for the first few decompositions. How can this be resolved?

So methodologies are an important part of the toolbox for large-system creation. It is unlikely that there are right and wrong methodologies in any absolute sense but rather drivers of the specific methodology that should be adopted by a specific project for a specific organization for the creation of a specific product. It must be driven by the risk factors of the effort, the organization and relationships between the developing organization and the client, and the novelty of the product to be created.

Besides these human process tools, there are more engineering oriented tools which assist with the technical aspects of design. They include tools to help in the task of architecture reconstruction, architecture presentation and documentation, and architecture design.

Procedural Tools for Architectural Analysis
ATAM, CBAM [Bass2003], SAAM, an earlier version of ATAM [Kazman94], quantified design space [Jum90][Hou91]


Tools for architecture reconstruction/recovery
Dali [Bass2003], Sneed's reengineering workbench [Sneed98], the software renovation factories of Verhoef and associates [Band97], the rearchitecting tool suite by Philips Research [Krikhaar99], Rigi Standard Form [Mueller93][Wong94]. [Bowman99] outlines a method similar to Dali for extracting architectural documentation from the code of an implemented system. Harris and associates outline a framework for architecture reconstruction using a combined bottom-up and top-down approach [Harris95]. [Guo99] outlines the semi-automatic architecture recovery method called ARM for systems that are designed and developed using patterns.

Tools for Architectural Design


Architecture Language Tools
Module Interconnection language (MIL),
Interface definition languages (IDL),

Tuesday, July 26, 2011

How do we capture qualities in use?

The method promoted by SEI for the collection of qualities in use is what they call a quality scenario. Before you can talk about the quality in use, you must first describe the use. This is traditionally done with a use case scenario. In this UML diagramming technique, the system is represented as a single box with the user interacting with it. The user interacts with the software system within the context of some task to be done that includes some real-world interaction, such as the need to look up a phone number so a phone call can be made. The user will search for the phone number in the machine by entering criteria such as, perhaps, name and location information. This may be sufficient for the machine to respond with the phone number. This highly abstract description of the interaction serves as the basis for further analysis as the functionality is broken down. What happens if there is more than one phone number? What if none is found? Last name first? How is location entered? And so on.

But more importantly, many aspects of the interaction are left unstated, allowing each reader to make their own determination about those unstated requirements. What is the response time? Do you enter the query and then walk away, knowing it may take a day or more for the answer to come back? (This is not unreasonable if the query must be handled by a human at the other end, as in some countries where good records are not kept or where security may limit who can know the phone number requested.) Or, as is more common, is the assumption that the response will be instantaneous? But even this can be subject to different interpretations. If you are doing call center work with a client on the line, already irate at some perceived fault, even a two-second response may be considered far from instantaneous. The business analyst must consider whether response time is something that warrants further investigation and documentation.

Let's say that in the case of the phone lookup, it turns out that what is needed is not a human lookup but an automated box to be connected to a predictive dialer on a call center system. In this case, the business model requires some predictability in the response to achieve ideal overall performance. In fact, the predictability may be more valuable than the speed: better results may be possible if the response is always 2 sec than if it is 0.2 sec for most lookups but 2 sec for some. That detail may make a significant difference in the decisions the designer will make. In this case, the business analyst must note this in the documentation of that use case scenario.

Note that in this last example the specific quality sought, predictable response time, was explicitly stated. But even more important, some measure for that quality was eventually stated. It may have been stated in very simple terms, such as "no response will be greater than 2 seconds," or it may have been stated in a more complex way, such as "99% of the responses will be 2 sec +/- 0.003 sec and fewer than 1 in 100,000 will be greater than 3 seconds." In either case, the way that a tester can construct a Boolean test to ensure that the quality is present is clearly suggested by the measure. In all cases, the stimulus and response for the quality measure match those of the use case scenario on which it is based.
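Those two measures translate directly into Boolean checks. Here is a sketch in Python, with the thresholds taken from the example measures above and any sample data invented purely for illustration:

```python
def meets_simple_measure(response_times, limit=2.0):
    """'No response will be greater than 2 seconds.'"""
    return max(response_times) <= limit

def meets_statistical_measure(response_times):
    """'99% of the responses will be 2 sec +/- 0.003 sec and fewer than
    1 in 100,000 will be greater than 3 seconds.'"""
    n = len(response_times)
    within_band = sum(abs(t - 2.0) <= 0.003 for t in response_times)
    over_three = sum(t > 3.0 for t in response_times)
    return within_band / n >= 0.99 and over_three / n < 1 / 100_000
```

Either function gives the tester an unambiguous pass/fail verdict, which is exactly what the quality scenario's measure is meant to enable.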

Earlier, we mentioned that the taxonomy of qualities in use is more academic than pragmatic. The reason for this is that in the human process of requirements elicitation it can easily lead to unproductive discussions of the taxonomy and the semantics of the words used to name these qualities. In the end, what is needed is a quality scenario that specifies the quality measure, whatever the quality is called. To avoid these unproductive segues, it is advisable to avoid becoming too concerned with resolving the semantic discrepancies that will come up in discussions of software qualities and instead drive toward the quantitative specification of the quality given some use case scenario.

Those qualities of the system that are important to the owner but not observable to the user lend themselves to the same treatment. The only difference is that many of these use case scenarios are rarely documented since the scope of the testing effort and extended product life cycle processes are rarely included in the project. Therefore, these specific use cases must be named and identified before the quality attribute can be specified.

The cost and other global attributes of the product do not lend themselves to this treatment. Since they depend on the collection of all the decisions together, they must be handled differently. Later in architecture analysis, we will see at least one technique that can be used that includes cost as a factor in architectural decision making.

What is Quality Software?

Like any discussion of quality, one must first recognize that quality is never a single attribute but a collection of attributes and their relative presence in the item of interest. When we speak of a car that we believe is of high quality, we generally do not mean one that has a beautiful interior and exterior with flawless fit and finish but suffers from poor acceleration and breakdowns. Yet a small, inexpensive car with relatively poor performance but a superb dependability record and prosaic but serviceable interior amenities can be seen as a car of quality, although it is usually called a good value. What the first example lacks is a good balancing of the various attributes in the vehicle, while the second offers good tradeoffs that allow its fitness for use to more nearly match our needs. So quality is not a single attribute but some collection of attributes that reflect the decisions a designer made during the design process.

Software is no different. Microsoft has properly taken heat for creating operating systems that seem to regularly crash causing confusion and pain for the consumers who must use them while simultaneously loading them with features that only some minority of the consumers want or need. The driver behind the feature bloat is the marketing engine which is interested in growing market share and selling new products to replace the older while the cause of the software errors is a rush to market and a corporate culture that does not value engineering excellence over profits. What results is a product that is shaped by the forces at work in the design and build. It is a combination of various attributes including price, dependability, functionality and performance in some relative levels. At least at the Software Engineering Institute at CMU, those attributes of software systems are called qualities.

There are different categories of qualities, and their treatment varies. For example, the one quality that has historically gotten most of the attention has been functionality. Since this is vital, the very essence of correct functioning, that is no surprise. It owes to the fact that in the decomposition from need to solution a software engineer must eventually express the solution in the formalism of a mathematical function, since that is what a machine will execute. Cost, however, is a much more difficult attribute to determine, since it is bound up in business processes, management, markets and economics. There is a group of qualities that can be grouped together under the heading of qualities in use. Performance, reliability, usability, security, and availability are qualities in use since they are directly observed by the user. There is another group of qualities that are not seen by the user but are important to the owner of the software system. They include buildability, maintainability, testability and modifiability, to name just a few.

What we have just started exploring is a taxonomy of software system qualities. (From Wikipedia: "Taxonomy (from Ancient Greek: τάξις taxis "arrangement" and νομία nomia "method") is the practice and science of classification. Taxonomy uses taxonomic units, known as taxa (singular taxon). In addition, the word is also used as a count noun: a taxonomy, or taxonomic scheme, is a particular classification ("the taxonomy of ..."), arranged in a hierarchical structure. Typically this is organized by supertype-subtype relationships, also called generalization-specialization relationships, or less formally, parent-child relationships. In such an inheritance relationship, the subtype by definition has the same properties, behaviors, and constraints as the supertype plus one or more additional properties, behaviors, or constraints. For example: car is a subtype of vehicle, so any car is also a vehicle, but not every vehicle is a car. Therefore a type needs to satisfy more constraints to be a car than to be a vehicle.")
It is always tempting to develop a complete taxonomy. However we will see later that while interesting from an academic perspective, it is unnecessary for the practice of requirements gathering.

Traditionally all of these qualities, except functionality, were grouped under the heading of non-functional requirements. While functional requirements have been gathered with some success in the past, non-functional requirements were largely ignored. We shall see that this has been a stumbling block to achieving Quality software since many design decisions are predicated more on the basis of the non-functional requirements than they are on functional requirements.

So the Quality (big Q) of a software system is the fitness for purpose of the system as measured by the gap between the various qualities (little q, or attributes) of the system and the client's needs.
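This definition of Quality as a gap can be made concrete with a small sketch. Everything here, from the attribute names to the 0-to-1 scoring scheme, is an illustrative assumption of mine, not an established metric:

```python
# Hypothetical sketch: Quality (big Q) as the gap between measured
# attribute values (little q) and the client's stated needs.
# Attribute names, targets and the scoring scale are all assumptions.

def quality_gap(needs, measured):
    """Return per-attribute shortfall: how far each measured quality
    falls below the client's need, on a 0..1 scale (0 = need met)."""
    gaps = {}
    for attr, target in needs.items():
        value = measured.get(attr, 0.0)
        gaps[attr] = max(0.0, (target - value) / target)
    return gaps

needs    = {"availability": 0.999, "usability": 0.8, "performance": 0.9}
measured = {"availability": 0.995, "usability": 0.9, "performance": 0.6}

gaps = quality_gap(needs, measured)
worst = max(gaps, key=gaps.get)  # the attribute with the largest gap
```

In this toy framing, fitness for purpose is a matter of driving the largest gaps toward zero, which previews the tradeoff discussions that come later.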

Monday, July 25, 2011

An Outline for a Course on Software Architecture

This is taken from the Shaw and Garlan book, Software Architectures: Perspectives on an Emerging Discipline

Introduction (3 lectures)
  • Orientation. What is the architectural level of software design, and how does it differ from intra-module programming?
  • What Is a Software Architecture? Constructing systems from modules. Some familiar kinds of architectures. Some common kinds of modules.
  • Classical Module Interconnection Languages. (from procedural to object-oriented)
Procedure Call (3 lectures)
  • Objects. Information hiding, abstract data types, and objects. Organizing systems by encapsulating design decisions, or "keeping secrets."
  • Modular Decomposition Issues. Keyword In Context, KWIC, problem introduction. Considering how to weigh the benefits of various decompositions for a system.
  • Formal Models. Basic notation of the Z Specification Language. The schema calculus.
Dataflow (4 lectures)
  • Batch Sequential and Pipeline Systems. Systems where data flow linearly through a sequence of discrete processing steps. Contrasts executing to completion at each step with continuous flow through a system and incremental processing
  • Tektronix Case Study. Specific example of a pipeline system used as part of a larger application. Example of formally modeling components and connectors.
  • Implementation Using Unix Pipes. The Unix paradigm connects independent processes by dataflow. The organization of the processes, and the style and tools for connection are substantially different.
  • Formal Models for Dataflow. Formal model of pipes and filters. Use of formalism to explain what a software architecture is and to analyze its properties.
Repositories (3 lectures)
  • Databases and Client-Server Systems. Databases and client-server systems use a centralized, persistent store of information. This contrasts with dataflow architectures.
  • Blackboard Systems. Sharing complex knowledge about a problem; making progress when you can't tell in advance what order to impose on the subproblems.
  • Architectural Evolution and Industry Issues. Historically, the requirements of users coupled with advancing technology have produced an architecture evolution from batch sequential systems through pipelines to repositories. Consideration of how industry deals with choices.
Events (2 lectures)
  • Models of Event Systems. Distinguishing implicit invocation from client-server communication and point-to-point message passing. Using formal models to define a general architecture which can be further specified as needed.
  • Implementations of Event Systems. Examines and compares two implementations which enable components to communicate via events. Presents alternatives for the underlying implementation of implicit invocation mechanisms.
Processes (2 lectures)
  • Communicating process architectures. Topologies and techniques for orchestrating multiple, independent but communicating processes to collectively solve a problem.
  • Formal Models for Processes. Introduction to CSP for modeling sequences of execution. Comparison between CSP and Z schema calculus.
Other Architectures (1 lecture)
  • Interpreters, Process Control and Heterogeneity. Two examples of architectures frequently found in practice. Examples of how "pure" architectures often appear combined in implemented systems.
Design (9 lectures)
  • Design Assistance. The selection of a software architecture should depend on the requirements of the application. This example of a system shows how to make the structural design of a user interface explicitly dependent on the functional requirements.
  • Classification of Architectural Constructs. Presentation of a partial taxonomy for architectural styles, components and connectors.
  • Interface Matching.
  • Aesop
  • UniCon
  • Heterogeneity and Mismatched Parts.
  • Information Architectures for Cyberspace.
  • Architectural Languages
  • Patterns and Pattern Languages. Identifying patterns in the use and combination of architectural styles and components, as well as recording and communicating the patterns in useful ways.
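One thing I would add while teaching the dataflow lectures above is a small code sketch of the pipes-and-filters idea. This is my own toy illustration, not material from Shaw and Garlan; each filter is a generator, so data flows through incrementally rather than executing to completion at each step:

```python
# Minimal pipes-and-filters sketch: each filter consumes an upstream
# iterable and yields transformed items, mimicking Unix pipes where
# independent stages are connected by dataflow.

def source(lines):
    for line in lines:
        yield line

def lowercase(stream):
    for line in stream:
        yield line.lower()

def drop_blank(stream):
    for line in stream:
        if line.strip():
            yield line

def pipeline(lines):
    # Compose the filters; nothing runs until the result is consumed.
    return list(drop_blank(lowercase(source(lines))))
```

The composition point is the architectural one: the filters know nothing about each other, only about the stream flowing between them.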
*************************
One section that must be added is the documentation of non-functional requirements. Without an agreed upon way to capture the requirements, it is difficult to explain what is meant by quality software or the power these requirements can exert over the shape of the system.

What's Wrong with Being a Progressive?

When I was growing up oh so long ago, we had liberals and conservatives. The liberals wanted to see a fairer world and the conservatives were primarily a party that favored the rich and those who wanted to be rich, that is, pro-business. But both sides were civil, there were interesting debates on the issues facing our society and I always thought most people had a relatively open mind. Back then no one would shun you because you were a self-professed progressive. Those days are long gone. But what is wrong with being a progressive now?

Conservative has taken on a much harder-edged meaning than the one I remember. I never met a conservative growing up who didn't believe in a better and different future from what we knew. We were putting men on the moon, there were new inventions every week, everyone seemed to want to be an American and join our parade to the future. Now conservatives not only want to stop all change forward, they want to actually go backwards on many issues. I am really shocked at the amount of money and effort groups are putting into resisting things like the Obama healthcare plan, gay marriage and mosques. Their attitude now seems to be that majority rules until we are the majority, and then we'll work to reverse every change the other side made when they had the majority. This is not progress, this is even worse than stagnation. This is just treading water until we run out of energy to stay afloat.

How did we get here? I do not doubt the sincerity of the positions held by the tea party folks. They truly believe in a small government and self-sufficiency. But it is as if they read Ayn Rand and never got beyond the fiction. While she makes a compelling plea for a libertarian form of government, she never had an ailing, widowed grandmother, or a worker who lost a limb in an industrial accident, as a main character. No, they are not worthy of her praise since they cannot aspire to the Nietzschean superman. You don't need much imagination to understand that such pure ideology just doesn't work in the real world. What you create in embracing this idealism is a return to the jungle and the worst human treatment found in the gilded era. This is not something that I aspire to, and most of the people who espouse this ideal would not want it either if they really had a chance to experience it.

I think that the main driver for this kind of reactionary position is a response to the extraordinary changes the US has experienced in the past 40 years. We went from envisioning a future so different from the present that would bring prosperity and fulfillment to everyone to fearing that our children will have shorter, less healthy lives than we did, a belief that our stagnating economy will not create the jobs needed to achieve full employment, and social changes that are at direct odds with the strict judeo-christian laws inculcated into society during the post war boom. In looking for the cause, people do not look into the mirror, they look at what has changed and blame that for their deteriorating status.

Chicago was a predominantly white city before WWII. The 1950s brought a flood of southern blacks to the city all looking for a better life in the north. What the white population saw was this black wave over the city and they were not ready to embrace integration. Even the rumor of a sale of a home to a new black owner would cause home prices to tumble as everyone scrambled to sell their house before prices fell. The speed with which this would happen was staggering. Entire neighborhoods would be converted from exclusively white to exclusively black in the span of a couple months. The fear was palpable. Of course this type of extreme reaction could not be sustained and eventually market forces worked their magic and integration, painful as it was for many, eventually began. There was prosperity for all and we all went forward together.

But things began to change under Reagan. While there is always a good argument that government can do harm as well as good, and in some cases more harm than good, the new government regulation of the 70s began to be challenged, if not rolled back. The first to go were many of the limits on financial institutions. This released a great deal of good, as many of the Glass-Steagall rules were from a different time and no longer applied to a world-integrated, almost internet financial world. Lowering tax rates became a holy quest and the right still worships at that altar even though most of them have no understanding of the dynamics that made some of those changes necessary or reasonable. It was good then, it is good now and it will be good in the future. Things became watered down to sound bites for the congregation that worshiped at the church of Reagan.

Along with all the other changes in the world, the war-ravaged nations of Europe entered the world markets in the late 70s. Bankers would queue up to lend them money and at first everyone had a good time. Americans saw a flood of new products enter their markets: quality cars, cheaper clothing, innovative appliances. But the downside of all this was to expose the weak underbelly of american manufacturing, which had gone soft in the easy days of selling to those war-ravaged countries who had no one else to turn to for heavy manufacturing and capital investment. Instead of investing the profits from those halcyon days in the 60s into innovative (progressive) techniques and more efficient processes, the money went to management salaries, worker salaries and shareholder dividends. Then, when faced with stiff competition from the new factories and able workforces around the world, american business could not always compete. (Yes, there were unfair trade practices, but on the whole I don't believe this changed the outcome.) Now, instead of enjoying the better value of the new appliances and cars, Americans found the less expensive clothing and other values of the imported goods from China helpful for maintaining their lifestyle in the face of a stalled american economic engine.

No one really saw this erosion of the american supremacy as everyone was fixated with the internet and all the other get-rich-quick schemes that seemed to flow endlessly out of New York. With the controls off the markets, they were free to soar to unbelievable highs but would now also go into free fall with staggering losses of wealth that hadn't been seen in a generation or more. It was like cowboy economics compared to the church bingo of the past.

Together with the sexual revolution and women's lib, the right decided that it was all just too much to bear. The problem is all caused because Washington just gets in the way of business, and if they just kill that beast everything will be great. So now every change of the past several generations is being questioned, and an earlier time that never existed is being held up as some sort of paragon of the "America" that can again be the beacon to the world. The concept that being American does not necessarily mean financial success, but rather freedom, fairness, and equality, doesn't seem to be enough anymore. Now they don't seem happy unless they kill all the progressives, and by that I mean anyone who doesn't fit into their vision of what that fictional past looks like in their mind. I just hope there is someone there that looks a little like me, and I fear for us all.

Friday, July 22, 2011

Software Architecture Design and Analysis:SEI's view

I have registered for SEI's Software Architecture Design and Analysis seminar Aug 10,11 in Pittsburgh. I need to refresh myself on their outline so I might as well bring you along for the ride.

Bass et al take up the issue of design in Chapter 7 of their book Software Architecture in Practice, 2nd ed. The chapter is written by Felix Bachmann. In his Evolutionary Delivery Life Cycle, a Preliminary Requirements Analysis results in a small handful of "architectural drivers, ... the highest priority business goals." (p155). Not all of the business goals may have an impact on the structure of the software system however so only those that will have an impact on the architecture are chosen.

One question that comes up in class that is difficult to answer is how to square the iterative methodology of the Evolutionary Delivery Life Cycle (and Agile methods in general) with the waterfall approach that is described elsewhere. In short, I cannot, except to say that at some point you need to move forward. Hopefully you won't make too many mistakes. But it is clear that some of the earliest architectural decisions constrain those that follow. Once made, they are difficult to revisit without risking a great deal of prior work. My answer has been that it requires judgment and experience to know. Ironically these are the two qualities of the team members that management seems to ignore when they advocate adoption of Agile methods.

The selection of the architectural drivers sets the stage for the beginning of the Attribute Driven Design methodology (ADD). One good aspect of ADD is that it puts the focus on the quality (non-functional) attributes desired of the system and not the functionality. I think this is key and I don't find a satisfying explanation in the text.

At the lowest level there are only functions. Yet what is desired is non-functional. How can this be? Somehow non-functional attributes emerge from a collection of functions. What the ADD does is focus the designer on those non-functional aspects of the system to be delivered and make them reason about how their selection of functional decisions will cause these non-functional attributes to emerge. But I digress from their text...

At the beginning of the design, the designer is faced with the software equivalent of a blank page; a single box with input, and output. Like a surgeon making the first cut, the designer must decompose this single module into two or more with relationships between and among them. The justification for this decomposition must be the furthering of some non-functional attribute.

The book doesn't explicitly say this but after the decomposition the non-functional attribute has either been satisfied by this decomposition or it follows one or more of the newly created modules. So the new modules will inherit a combination of functional and non-functional requirements from this decomposition. The clear implication here is that some combination of functional elements with the correct relations between them create the non-functional attribute. As the decomposition progresses, it eventually reaches a level of decomposition where the designer is no longer concerned that the attribute will not be achieved. One of two things will have happened by that time: either the design no longer pushes non-functional requirements onto decomposed modules leaving purely functional specifications for those modules; or the achievement of the non-functional requirements is self-evident, at least to the designer.

ADD stresses the choice of pattern or tactic to be used to achieve this decomposition. Since these patterns are known to further specific non-functional attributes, this makes sense. I think in my teaching I did not emphasize this quality-to-functional relationship that is seen in these patterns. From the standpoint of possible research and tool development this is an exciting aspect. While hardly mechanical, once the desired attributes are identified, the system can prompt with possible patterns/tactics that can be tried. The designer can pick the most likely candidate and have the system assist with moving functionality and qualities to the appropriate modules. Some modules will inherit all the functionality or qualities of their parent. But often the functionality or quality will attach to some subset.
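To make that tool idea concrete, here is a minimal sketch of how such prompting might work. The pattern-to-attribute mapping below is a toy assumption of mine, not SEI's catalog, and the pattern names are only for illustration:

```python
# Illustrative sketch: given the quality attributes the designer has
# prioritized, suggest candidate patterns known to promote them.
# The mapping is a toy assumption, not an authoritative catalog.

PATTERN_ATTRIBUTES = {
    "layers":            {"modifiability", "portability"},
    "pipes-filters":     {"modifiability", "reusability"},
    "client-server":     {"scalability", "availability"},
    "active-redundancy": {"availability", "reliability"},
}

def suggest_patterns(desired):
    """Rank candidate patterns by how many desired attributes each promotes."""
    desired = set(desired)
    scored = [(len(attrs & desired), name)
              for name, attrs in PATTERN_ATTRIBUTES.items()
              if attrs & desired]
    return [name for score, name in sorted(scored, reverse=True)]
```

Even this crude ranking captures the workflow ADD implies: attributes first, then candidate patterns, then the designer's judgment.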

Bachmann makes an astute observation on page 158 re the possibility that not all design proceeds strictly top-down. When it is a mature domain with experienced designers, there are probably few surprises that would cause a rethinking of prior decompositions. However this is a high bar and even mature domains when met with novel business requirements can pose challenges. In this case, the decision tree may proceed to a relatively low level (for architecture) before the designer becomes aware of a dead end or an unacceptable trade off. Or perhaps it is unclear whether or which decisions will lead to the achievement of a particular attribute. In those cases prototyping or some other form of contingent design is needed. The design space must be explored so as to ascertain that a particular choice will lead to the needed result.

This chapter helped me answer a question that came up that I did not give a satisfactory answer to: what is the difference/relationship between a tactic and a pattern? Tactic is defined in opposition to strategy. Where strategy is the overall plan which will encompass many operations, a tactic has to do with the execution, the doing. In software this will be the equivalent of implementation. So while architecture deals with the strategic element of the project, the decisions that commit to a course of action are the tactics. I would say that the choice of how to divide a given module, even when that module is the first one attacked, the entire system, is a tactical decision. If it is possible, the subsequent decomposition decisions will be at least roughly known and anticipated. However this definition of tactic doesn't get me where I need to be for this explanation.

On page 100 in chapter 5, Bachmann, Klein and Wood define tactic as "a design decision that influences the control of a quality attribute response". This at least gets us into the domain of software design. It also gives me the crack I need for my piton. If the tactic is the decision, that decision will result in some form. The pattern is the form that results. To apply a pattern is to make a tactical decision or to employ a tactic. I can forgive the students for not working their own way through these semantics.

In ADD, a pattern is chosen to enhance the attribute desired. But patterns rarely promote a single attribute and often diminish one while advancing another. Therefore each decision causes an evaluation of the tradeoffs between these attributes. If the designer is lucky, the attributes diminished by the choice are not critical to the success. But if they are, other patterns must be considered. The book offers the example of using an interpreted language to achieve modifiability. While the use of HTML does improve modifiability, it also comes with a performance price.
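A rough sketch of how such a tradeoff evaluation might be mechanized follows. The patterns, effect numbers and project weights are all illustrative assumptions of mine, not figures from the book:

```python
# Toy tradeoff evaluation: each candidate pattern both helps and hurts
# attributes; weight the effects by how critical each attribute is to
# this particular project. All numbers are illustrative assumptions.

PATTERN_EFFECTS = {
    # attribute -> effect in [-1, +1]; positive promotes, negative hurts
    "interpreter": {"modifiability": +0.8, "performance": -0.6},
    "cache":       {"performance": +0.7, "modifiability": -0.2},
}

def tradeoff_score(pattern, weights):
    """Weighted sum of a pattern's effects under this project's priorities."""
    effects = PATTERN_EFFECTS[pattern]
    return sum(weights.get(attr, 0.0) * effect
               for attr, effect in effects.items())

# A project where modifiability dominates favors the interpreter,
# accepting its performance price.
weights = {"modifiability": 1.0, "performance": 0.3}
```

The point is not the arithmetic but the discipline: the performance price of the interpreter only matters in proportion to how much this project values performance.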


Thursday, July 21, 2011

Software Architecture: Definitions

I'm trying to create a set of notes to use when teaching software architecture. This post is to work out how to define it.

Software architecture depends upon a metaphor with building architecture for its definition. Like a building, a software system is composed of many pieces in relation to each other. The architecture is not visible by looking at the individual pieces but only apparent when looking at the complete structure. Likewise, there are many ways to describe what is going on within programs or even small collections of programs (class, inheritance, is-called-by relationships, etc) but they only capture individual details, not the overall impression that is made by the complete assembly.

When trying to convey this point, the language gets difficult. We speak of architectural styles when we refer to a building as international style, or prairie style, gothic, etc. What we mean when we say style is that the structure embodies a set of attributes that together provide a category for the expressive elements of the structure. For example, to say that something is romanesque style immediately suggests that the windows do not have gothic points but are instead semicircular at the top. When we see this style of window, we can look for the other attributes. If it has all or at least most of the stylistic attributes of that category, we can say it is a member of that style.

But while the expressive elements of building architecture are relatively approachable for a lay person, software architecture is less so. For example, the 3 tiered architecture style is very familiar now. When we say that a system is 3 tiered, we immediately expect to find at least one, and usually many, clients that communicate with one or more servers via a shared communications channel, with the data for the system kept in a back-end database. The deployment of the client, server and database are the expressive elements that define this style. This is far less obvious than the shape of a window but workable as a way of discussing software systems. The student must first make this leap.

Sadly, things can go off the track very quickly here. The work on patterns of the 80s mirrors the idea of style. But the scale at which patterns can be used spans from the grand-architectural to the program level code style. Which of these patterns are architectural? Is a pattern any different than a style? To the second point, I say no, a style and a pattern are essentially the same thing but the context does make a difference.

When we are discussing architectural style, we are looking at the most apparent features of the assembly. This may be worked out more by trial and error than by any great theory. The attributes that make the style useful become the defining elements of the style. The shape of the romanesque and gothic windows were not mere ornamental choices, they were a direct result of the way the stresses of the structure are worked out and arranged to give the structure solidity. Likewise the 3 tiered architecture evolved as a way of making a more scalable system than the monolithic mainframe architectures of the past. In a 3 tiered arrangement, the hardware could be maximized for the task; a Sun server for the Oracle database, Windows servers providing application logic and an assortment of clients to support the machines used by the end user.

The central point made by the authors regarding patterns is that a pattern is a logical response to a recurring problem with a solution that has been used repeatedly and has known attributes. That is to say, if I apply pattern A to the problem, I can be assured that it will exhibit properties X, Y and Z. But if I were to use pattern B for this same problem, it would exhibit properties W, Y and Z. I would know this because of the amount of analysis that has previously been done using those patterns. If the properties W, X, Y and Z are the most important properties of the complete system that are desired by the client, then it is reasonable to call these architectural patterns or styles.

Another point that is confusing to the students is that this is distinctly different from an architectural view. Architecture is complex enough that it is impossible to describe in a single representation. For this discussion to make sense it is first important to realize that EVERY building and EVERY software system can be said to possess AN architecture; architecture is not something that can only exist if it is designed in. While a wigwam may not strike some as architecture, in this context it is as valid as a Frank Lloyd Wright prairie style home. The first is an example of an organic or historic style that is done unselfconsciously; the second is considered more of an art form which requires deliberation and conscious effort on the part of a designer who is not the inhabitant. Even the humble wigwam is difficult to depict on a single diagram and a modern skyscraper absolutely impossible.

Historically building architecture has evolved a standardized series of depictions which collectively can be considered the expression of the architecture. There is of course the scale model which helps the client get an idea of the overall shape and appearance of the intended structure. But that is insufficient for the many detailed decisions that the architect may need to convey to ensure that the project comes out as envisioned. There are framing plans, electrical plans, HVAC plans, plumbing plans, plot maps and a great deal more. Each of these plans is a particular view of the architecture. It represents an abstraction of the interest that stakeholder has in the project. That abstraction is a representation that enables that stakeholder to efficiently participate in the project.

Software projects are no different. To achieve some anticipated future modifications, a software designer may provide a particular class hierarchy that, if followed, will make an anticipated change later easy to do. To achieve performance, there may be a great deal of attention given to the particular hardware that the various software modules may be deployed to, or even consideration given to other software that may run, or be prevented from running, on that hardware so as to ensure the hardware availability needed to achieve that performance goal. The ways these design decisions are captured into artifacts to be shared with other members of the project team are each specific views of the architecture. No one view by itself completely captures the architecture, but collectively they do.

Another point that will inevitably come up in this discussion is the extent to which architecture is high level design. The answer must equivocate because yes, architecture is certainly high-level, but no, it may not always remain solely high level. For example, to achieve a particular usability goal, the team may actually need to experiment with completely coded UI designs and run human experiments to see how people react. This may not be a production grade system, most likely a prototype. But to be sure the final design will have the usability specified, the UI will be worked out to a relatively low level, leaving very few decisions for the implementors. This is generally not what people think of when they talk about high level design.

What is required is for the architect to determine which of the many attributes desired by the client will be the most critical to the project success and in particular which attributes will require careful tradeoff because of confounding.

I think if a student can be led through these points, the definition of software architecture is complete.


Wednesday, July 20, 2011

What is artificial intelligence? NYT 6feb2011 Richard Powers

This op ed piece was written before Watson's appearance on Jeopardy. While Jeopardy did offer a great challenge to artificial intelligence, I think comedy will up the stakes. When a computer can do what Jon Stewart or Stephen Colbert do with the daily news, I'll really be impressed. Until then I'll be inclined to view it as an exploration into how the human mind assimilates such a large, disparate collection of information for efficient retrieval. Intelligence is adaptation to a changing environment. The news changes every day, and to remain funny requires skills that I don't think we have any inkling of yet.

When is a tree not just a tree?

When it's a hazard.

I have been in conflict with my neighbor almost since the month we moved in. He's not a nice man. But lately it has mellowed into a form of benign ignorance since we have no reason to talk to each other. There is only one issue which will not go away and that is a large oak tree growing over my bedroom. From any angle, it appears that the tree is on my property but it is actually growing on his side of the fence at a 30 degree angle toward my property. Virtually the entire crown hangs over my property like some kind of pool umbrella.

Now comes the interesting part. Tree law is both well developed and maddeningly vague. There is a near absolute law that allows a neighbor to trim an offending branch at the lot line. Since the trunk of this tree passes over the lot line below the crown, it is tempting to argue that I can just cut it down at the lot line and be done with it. There are two problems: first, a more recent court decision limiting that right, and second, the fact that the tree is a protected species.

In the past 20 years a precedent was set that limits self-help, preventing any trimming that would harm the tree. I saw one site that goes so far as to suggest that even ruining the symmetry of the tree could count as damage. But more reasonably it limits the amount of trimming to 30% and prevents cutting any roots in a way that would cause harm. Since the trunk of this tree would still hang over the property, trimming would do little good in removing the hazard.

The tree is a river oak. That species requires a permit to trim after its trunk reaches 12" in diameter. The county came to take a look and agreed that it could be removed. However only the owner of the tree has the ability to do so.

When confronted with all this, the neighbor replied that the tree hasn't done any damage yet and therefore it is ok. I won't even discuss the speciousness of that position.

So on closer examination the law comes down to two points. Is the tree subject to premature failure? Of course an oak can live for many decades in the right environment. This one clearly cannot, due to its angle of growth, which places a great deal of stress on the roots, and its limited land. As a young tree it has a significant tap root that gives it stability. But as a tree of this species matures, the tap root is replaced by horizontal roots which cannot sustain this torque.

If there is no foreseeable damage to be caused by a tree falling, the law is not inclined to coerce action. However, when the premature failure of a tree has a clear target, it falls into a category known as a hazard tree. Since the trunk and major branches are inches from my roof, there can be no doubt that this is a high-value target should this tree fail.

Previously this same neighbor had a group of liquid amber trees, one of which was rubbing his gutter. Whenever there was a light breeze you could hear the gutter pop as the trunk rubbed it. During a storm it broke at that point, falling in my yard and damaging my gutter and fence. Since he claimed it an act of god, he refused to take any responsibility for it, despite the fact that he could have and should have known that allowing the tree to repeatedly rub against his gutter was injurious to the tree. By failing to properly maintain his tree he took on liability for its damage. However, the prospect of taking him to court over the damage wasn't worth the trouble.

This prior experience with him makes me wary of ignoring the potential hazard, and I am not willing to wait until the tree falls. Where I come from you avoid injury instead of waiting for it to occur and then arguing about it. Perhaps my neighbor comes from someplace outside the galaxy.

Tuesday, July 19, 2011

So called "Christian Economics"

Propriety limits my thoughts about Mr Gary North. If you don't know him, you are probably not the kind of "Christian conservative" he rallies. I put that term in quotes since I don't believe that, as he characterizes them, they are either Christian or truly conservative. But first a few words about Mr North and his points of view.

When the radical right was demonizing the public service unions in Wisconsin, Mr North was giving them cover with his claim that the Bible is opposed to organized labor, and especially to organized public employees. This kind of thinking comes out of the Christian Reconstructionist movement, which grew out of the philosophy of R. J. Rushdoony, who died in 2001. At least as characterized in the NYT, its adherents believe in the creation of a Christian theocracy under Old Testament rules that is highly libertarian. Until that day they want to home school their children and worship freely. While Mr North had his parting with Mr Rushdoony, he still believes that the Bible forbids welfare programs, opposes all forms of inflation, and requires a return to the gold standard. Here is a link to his economics text for home schooling "Christian" families:

When it comes to criticizing his positions I barely know where to begin. At the outset, it is offensive the way he claims the mantle of "Christian" and wraps himself in that flag. One of the most basic values in Christian education is charity, and that is nowhere to be seen in his writings. In fact he seems to continue the long tradition, begun with the Reformation, of finding ever more reasons to splinter the Christian community along progressively less spiritual lines. Where is the essence of giving to Caesar what is Caesar's in his economics? I'm sorry, but in my Sunday school we were taught the importance of our souls, not portfolio management. I've grown up believing that these are two irreconcilable worlds.

When we lived in small villages where we knew people personally, it was relatively easy to recognize those among us who were hurting and reach out to them with the help of the one institution that dominated our lives: the church. (Of course here I am talking about a Western European tradition that does not apply to all times and places. However, that is the model I was given and I have found how to adapt it to my intellectual needs.) We now live in a world of social strata keeping us comfortably removed from the poor people who support our needs. The cheap clothes we get at Macy's are made in Asian factories, and we hope that most of the workers who make them are not beaten when they make mistakes and waste materials. Or we walk on rugs made by poor South Asian women who, when they have no one to watch their children, may be forced to give those children small amounts of opium so they can work their 14-hour days without interruption. It is easy to think that we are enjoying the fruits of our own labor when we ignore the interconnectedness of our world. So unlike North and his ilk, I am willing to hesitate before distancing myself from the workers who keep me where I am.

Me and My Algorithm, Seth Freeman, NYT op-ed

This op-ed piece really has nothing to say about algorithms in the computer science sense of the word. Apparently in common usage "algorithm" now refers to those pieces of code that peek over your shoulder at your emails and offer suggestions for things they think you want. The author mentions that while reading an email from a friend that mentioned Holocaust deniers, the system helpfully offered ads for holistic dentists. This kind of simple word match and weighting is not what we think of when we are preparing to learn or teach algorithms. But the piece is fun for other reasons.
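To show just how crude that kind of matching is, here is a toy sketch of scoring ads by naive word overlap. All the names and keyword lists are invented for illustration, and no real ad system works exactly this way, but a sloppy prefix match like this one is enough to confuse "holocaust" with "holistic":

```python
def match_ads(email_text, ads):
    """Rank ads by a naive word-overlap score against the email text.

    An ad scores a point for each of its keywords whose first three
    letters match the start of some word in the email -- a crude fuzzy
    match with no understanding of context.
    """
    words = set(email_text.lower().split())
    scored = []
    for ad, keywords in ads.items():
        score = sum(1 for kw in keywords
                    if any(w.startswith(kw[:3]) for w in words))
        scored.append((score, ad))
    scored.sort(reverse=True)
    return [ad for score, ad in scored if score > 0]

# Hypothetical ad inventory with its trigger keywords.
ads = {
    "Holistic Dentists of Sacramento": ["holistic", "dentist"],
    "Discount Tires": ["tire", "wheel"],
}

# "holocaust" prefix-matches "holistic", "deniers" matches "dentist".
print(match_ads("my friend wrote about holocaust deniers", ads))
```

The point of the sketch is that nothing here resembles what a computer scientist means by an algorithm worth studying; it is just counting coincidental prefixes.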

When I taught this spring, one of the things that made the class animated was the discussion of whether cloud based email services should be doing this kind of analysis on our emails. I could argue that it was merely a machine and that no human ever saw their words but it gave them little comfort. They were profoundly disturbed by this. There is a clear collision between the desire to monetize a free service with new and improved advertising and people's sense of privacy.

In the end the article doesn't even begin to suggest the depths to which modern advertising can go in gathering, analyzing and categorizing us so as to better match their pitches to our needs and ability to pay. Someday soon some poor schmuck will wake up to find that the ads next to his emails are no longer selling Benzes but Corollas, since he just got sacked last week. Ain't that a kick in the pants.

Modernist Cuisine: The Art and Science of Cooking by Dr Nathan Myhrvold

I think I need to reconsider how I apportion my portfolio. When the NYT reviewed this mammoth work 9mar2011 it carried a retail price of $625 and an online price of $467.62. But Amazon shows it out of stock, and others who are willing to part with it are asking $1200. Who knew that there was this good a market in the arbitrage of objects that are said to be obsolete?

If I had the time, money and space, I'd surely be the first in line to get this book. But at 6 volumes, 2438 pages, 1522 recipes, and just under 40 lbs, I'll pass. Yes, it is impressive that they spent 5 years and many millions of dollars with a staff of 46 to create this encyclopedic survey of food. But no matter the incredible photos or the cutting edge research, this will mostly find professionals or wannabes who will buy it and then ponder whether to invest in a rotary evaporator to try out a recipe. For me, I'll just hope that one of my friends will buy it and let me come over and look at the pictures. Seriously, can I come over and look at the pictures?



Researchers show how a car's electronics can be taken over remotely; nyt 10mar2011

I found this article chilling. Computer scientists from UC San Diego and U Wash reported that a hacker with only moderate skills could gain remote access to someone's car and take over basic functions, including control of the engine. While no known cases of this kind of hacking have been reported, that is cold comfort. It isn't so much that I am afraid someone will cause my car to hit the brakes when I'm going 80 down 80 (I'd never do that) or even pop the lock and make off with my car when it's parked in public. What I find most upsetting is that entire staffs of engineers worked on a system that had cell phone connectivity and never stopped to think about the kind of security the software needed to avoid this kind of malicious tampering. No one. At least no one that management listened to.

Losing control of our cars is bad enough. But this past year has seen a flurry of reports questioning the security of major portions of our infrastructure, like the electric grid and transportation systems. Ghansah at school has the right idea after all. Software security awareness and response is a wave that is coming ashore.

David Brooks in Today's NYT: The Road Not Taken

David Brooks is my hero. He is smart, articulate and not completely conservative. Today he writes about how the repubs are squandering an opportunity to do some real good for the country. I am no fan of them, but when they have a good idea I'll agree. So before my house gets egged, let me explain.

To have a social safety net we must fund it. I myself am very comfortable paying the taxes needed to fund it. However there are many people out there who are short sighted enough to believe they are self sufficient and will never need the net. For them, it becomes an issue of paying for someone else who cannot (or will not in their opinion) manage their affairs properly. They probably took Atlas Shrugged as their own personal morality play. I can't change their opinion. But the congress people they send to Washington already know that these constituents will be the first to clamor for government assistance when a tornado blows down their home or a hurricane washes away their second home on a barrier island. Their extreme position that they don't need the federal government is the worst kind of hypocrisy.

Here is where there is an argument that should be made but is not. There is no doubt that the federal government oversees a very significant flow of cash. Where the repubs could get a rise out of even me is the efficiency with which this money is used. There are far too many overlapping federal agencies that oversee sprawling empires and often lose sight of their core missions. Congress too is guilty of passing far too many laws that are poorly written, and of assigning the oversight of those laws to agencies it then refuses to properly fund. This must stop. Our system of government in 4 year increments will never allow for the kind of sweeping changes that are needed to rationalize the federal government's mission. And an environment in which a significant proportion of Americans don't even understand the federal government's contribution to our economy and society will never accept it, no matter the size.

So to see us on the brink of a bi-partisan deal that would bring us closer to a balanced budget and right even a few wrongs in those sprawling agencies and vast rule books and then pull back is just galling. What is wrong with those people anyway?


Monday, July 18, 2011

Too Much Programming Too Soon? by Mark Guzdial and Judy Robertson

OK, I'm a little behind in my reading. But this is just as important as the day I read it. I'm talking about the article of this name in the March 2010 CACM, where the authors discuss Mark's blog post "How We Teach Introductory Computer Science is Wrong". (http://cacm.acm.org/blogs/blog-cacm/45725-how-we-teach-introductory-computer-science-is-wrong/fulltext) Both the article and the post are an indictment of the practice of minimal instruction in the teaching of programming. The research cited suggests that showing fully worked examples of programs significantly improves the speed and quality with which students learn how to program.

At the risk of reading too much into this, I have always had the feeling that the teaching of programming has a storied history as a form of hazing. Perhaps because no one really knows how best to teach it, or we all think that somehow we can teach it by lecturing and talking about the atomic syntax. But the reality is that students are always flummoxed at figuring out how to assemble the pieces. This is "minimally guided instruction". You would think that the textbooks would do the job of providing the worked examples this article suggests, but I have been disappointed in most of the texts I've looked at. Ironically this is exactly the same suggestion that I have gotten from advanced students. So what could be going on in the heads of these students that makes showing them worked examples help them learn? I have an idea.

Since the 1980s there has been way too much ink spilt over the discussion of patterns. While I think some of the discussion of pattern languages is a bit over the top, there can be no doubt that there is something very important in patterns. Instead of teaching the core instructions of the language with their formal syntax, what we really need to do is teach basic problems solved by computers and the patterns of language that solve those problems. I believe what this does is allow the student to learn by induction. By showing how to solve one problem we reasonably expect the student to learn how to solve a similar problem. If we show how to read a number from the keyboard, we certainly expect them to follow that pattern when they must read another number. For the gifted students, we expect them to figure out how other data types are read. This sounds very close to the thesis of one of my favorite books: Metaphors We Live By.
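In Python terms, the "read a number from the keyboard" pattern might look like this. This is my own sketch, not an example from Mark's article; the first function is the fully worked example shown in class, and the second is what we hope the student produces by induction, the same pattern with only the data type swapped out:

```python
# The fully worked example shown in class: read an integer from the
# keyboard, retrying until the input parses.
def read_int(prompt):
    while True:
        text = input(prompt)
        try:
            return int(text)
        except ValueError:
            print("Please enter a whole number.")

# What we hope the student produces by induction: the identical pattern
# (loop, prompt, try/except), with int swapped for float.
def read_float(prompt):
    while True:
        text = input(prompt)
        try:
            return float(text)
        except ValueError:
            print("Please enter a number.")
```

The gifted student sees that nothing else in the pattern changed and can extrapolate to any type with a parsing function, which is exactly the kind of learning by induction described above.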

Metaphors We Live By, written by Lakoff and Johnson, has gone through many printings since it was published in 1980 and is, imho, a book everyone should read. The central point is that metaphor is not merely a poetic device but a key insight into how we learn new things. This fits perfectly with Mark's suggestion that we should show a complete working program first, ask the student to make a small change, and then quickly advance the distance between the example and the problem.

But then that's just me...

College The Easy Way, Bob Herbert NYT, 5Mar2011

In this column Bob summarizes the book "Academically Adrift: Limited Learning on College Campuses" by Profs Richard Arum of NYU and Josipa Roksa of the University of Virginia. He quotes the book, "Many students come to college not only poorly prepared by prior schooling for highly demanding academic tasks that ideally lie in front of them, but--more troubling still--they enter college with attitudes, norms, values, and behaviors that are often at odds with academic commitment."

Wow.

I don't know that I agree with this dire assessment of the current student. Both as a new undergraduate oh so many years ago and teaching relatively new undergraduates this past year, I don't see this vast change in the youth of America. However I can see where these authors get their thesis. According to Bob, "the authors cite empirical work showing that the average amounts of time spent studying by college students has dropped by more than 50 percent since the early 1960s." The study is available at highered.ssrc.org. I am not going to take the time to verify it, but my suspicion is that some societal trends are reflected in these numbers, not some fundamental change in the students.

First, I believe that far more high school students aspire to attend college than in the 1960s. There has been a drumbeat for my entire life telling young people that the key to a better life is education. This has had an effect on student and parent alike in their willingness to dig deeper to afford the inexorably rising cost of education. So demand, I suspect, is much higher now than it was back then.

Second, the natural response of educational institutions to an increase in demand in a market that has been relatively inelastic is to increase supply. At least here in California, there is no shortage of educational options, from community colleges to mediocre universities to some of the best schools in the world. They range from publicly financed state schools to for-profit institutions to prestigious private universities. Let's not kid ourselves, these institutions compete against each other whether they want to admit it or not. And internally they are very concerned about how many they graduate. What student or parent will find attractive a school that graduates a very small percentage of its accepted students? Once you are in, there is a lot of pressure to get the student out with a degree.

Has there been a diminishment of the critical reasoning skills of our students in this period? I confess to having an inclination to agree. For what I need to teach, critical reasoning and abstract thinking are vital. I can accept that not everyone will have those skills, but I am already working with a class of students who have presumably demonstrated this ability over many semesters of work. I am not sure I always see it demonstrated in their written product. I have attributed that more to the lack of the kind of deep intellectual writing required to succeed in academia or a profession tilted toward the liberal arts. I have always thought it was a reflection of the oral tradition TV, radio and the internet give us, a shift away from the book, academic paper and dissertation of the academic tradition.

But maybe they're right and I'm wrong.

Here we go again...

While I posted a few things several years ago, I will start again. In this period of time I completed my master's in software engineering and haltingly started a new career in teaching. This spring I taught two sections of introduction to computer science at a community college and one section of software architecture, graduate level, at the nearby university. I will use this blog to gather my thoughts about these experiences while I have the time over the summer and lay down some short essays on topics I may want to pick up again later.

Dear reader, I must warn you that my interests range far and wide. I will do my best to give each posting a title that will clue you into what that one will be about. In particular, I will be cleaning out my stash of accumulated clippings and using this blog to record the clipping reference and why I found it interesting. I can only hope you will also find a few interesting.

And so it begins (again)...