February 23 - Keynote titled OSGi in the Enterprise: Agility, Modularity, and Architecture’s Paradox
March 22 - 25 - Tutorial on Modular Architecture
June 14 - 17 - Sessions titled Turtles and Architecture and Patterns of Modular Architecture
July 26 - 30 - Two sessions on rich mobile applications and one on agile development, plus a half-day tutorial on software process improvement.
Yesterday’s conclusion of Programming with the Stars at Agile 2009 was an awesome event. A huge thank you to Jeff Nielsen and Joshua Kerievsky for coordinating the event and pulling together the great list of judges and contestants. Thanks as well to the judges and contestants themselves, all of whom made quite a commitment to the event. The final day of competition was the freestyle day, which meant the contestants could choose their own exercise and were given six minutes to strut their stuff.
The first pair of James Grenning and Kenrick Chien opted for a learning day. They provided a great overview of how a pair might learn from each other through some simple exercises. On numerous occasions, they jumped at the opportunity to suck up to the judges. The highlight came when Grenning, who works on a Mac, jumped into a virtual machine running Windows in an attempt to get into the good graces of Judge Newkirk, who works at Microsoft, by stating:
Finally an opportunity to work in a Microsoft environment
Judge Marick, however, found a kernel of truth in this playful attempt to suck up, commenting on the important role VMs will play in development environments and how developers will leverage them in building software going forward.
Next up was Gerard Meszaros and Ola Ellnestam, who put on an excellent display of pairing. In the final judging session, Judge Marick jokingly expressed his concern over the obvious corrupt nature of the judges. The final judge voting gave the Meszaros pair a slight lead over Grenning and Chien. But the competition wasn’t complete.
To conclude, the audience got a vote, and while the votes were being tallied, a bonus round commenced. Each pair was required to disable their mouse and, using only keystrokes, create a program that printed “Hello Programming with the Stars” to the console. While it appeared Grenning and Chien finished their coding exercise first, it took them a bit longer to execute the program. Possibly because C was their language of choice? Maybe, maybe not. Judge Marick noted that while the two contestants were completing the bonus round, he and Judge Newkirk tried the same exercise using Ruby, but that it took them longer than either of the two groups of contestants. He explained this pretty clearly by joking,
It’s clear that we’ve now proven what statically typed language advocates have been arguing in favor of for quite some time. Java and C are much better languages for mission critical applications.
In the end, it was a great week for Programming with the Stars, which was ultimately won by the Meszaros and Ellnestam pair. Here’s hoping that Agile 2010 serves as host to this fun competition!
After a great first day, day three at Agile 2009 is a wrap. I have some general observations about the conference, but I’ll save those until next week to offer an overall recap. I missed out on most of Day 2 due to a pretty full meeting schedule. I was hoping Day 3 for me would begin with listening to Neal talk about Emergent Design and Evolutionary Architecture, but unfortunately I wasn’t able to make that session. I’ll have to track him down.
I started the day attending the third session of Programming with the Stars. Jeff and Joshua kickstarted today’s competition with a song written by Joshua titled “50 Ways to Leave Your Debugger”, a play on the Paul Simon song. Yes, they sang it. No, it wasn’t pretty. Sorry guys.
There were only three of the original six teams remaining, and each team was provided six minutes to complete their agile development task today (as compared to the three and a half minutes on day 1). It was pretty obvious that the pairs were starting to figure out the Stars competition, as they put on a much more polished performance today. For those who care about completely irrelevant statistics, there were 2 Macs and what appeared to be a Dell netbook left in the competition.
The programming topic today was story test driven development. For those not familiar with story TDD, it’s essentially starting off development with high level customer or acceptance tests instead of low level unit tests. Gerard and Ola put on a great performance and were the clear winners of the day, meaning they were granted immunity from getting cut. That brought it down to James and Kenrick versus J.B. and Llewellyn, and the initial crowd vote was close enough to require two separate recounts. Eventually James and Kenrick won out, pitting them against Gerard and Ola for the final day of competition. Some highlights of the competition include:
J.B.’s comment on their inability to finish the exercise in stating, “If we’d have done it in Ruby like we wanted, we would’ve gotten done.”
James’s great explanation of the home automation story he and Kenrick would be coding, along with very clear descriptions of exactly what they were doing at all times.
Gerard simulating a completely random experience when drawing their story from a stack of cards. Only later was it revealed that each card contained the same story.
The final day of competition promises to be an exciting event.

Later this afternoon, I took in a session titled “Scrum and CMMI: From Good to Great - Are You Ready-Ready to be Done-Done.” I was expecting a bit more information on the relationship between Scrum and CMMI, but this session didn’t follow the title all that closely. The session thesis was pretty simple, though the numbers were hard to believe and the concept not crystal clear: by institutionalizing Scrum across all project teams, two teams within an organization were able to quadruple their productivity. The intent of the session was to explain what they did to achieve such dramatic gains. That’s cooler than CMMI anyway, right?
Jeff Sutherland wasn’t listed as a speaker, but there were times when he hijacked the discussion and made some “salesy” claims about Scrum increasing productivity and decreasing time-to-market without offering a lot of insight as to how. From what I gathered, though, they did this through two very simple concepts - ensuring they were Ready to start a Sprint and were able to declare themselves Done after a Sprint. I never got a good sense of the attributes surrounding Ready, except that the team took great care in preparing the product backlog. Done was pretty easy to understand - basically, all tests must pass. There were some very useful tidbits of information, which include the following:
Along with a few great hallway conversations, it was a good third day of the conference. Looking forward to tomorrow.
The first day of Agile 2009 is in the books. While it’s great that the conference is packed with amazing sessions, it also means that it’s not possible to take it all in. Today’s highlights for me included Programming with the Stars and the Software Craftsmanship tutorial led by UncleBob.
The Stars contestants did a great job and it was interesting to see the pairs work through some simple problems in such a short timeframe. It’s amazing what they were able to accomplish in only three and a half minutes. Languages represented in the contest included Java, Groovy and C. It’ll be a lot of fun to watch this play out through the remainder of the week.
Immediately after Stars, it was time to talk about the craft of software development. UncleBob is always an entertaining speaker who brings a lot of energy and passion to the stage. He began by highlighting his three laws of agile development (while also citing the sources from which he culled these nuggets of wisdom, which I have omitted here).
He then went on to talk about general best practices, including some pretty thought-provoking comments. He spoke of incremental improvement, never being blocked, avoiding big redesigns, how going fast demands going well, and the importance of clean code. The discussion certainly resonated with me, and the wry humor with which he presented the material kept the atmosphere light. Two of his quotes caught my attention.
When talking about the importance of TDD versus fancy architecture and design documents, he stated:
If you have a great design, you’ll still be afraid to change the code. If you have a suite of tests, you won’t.
This isn’t to imply that design isn’t valuable, but that our emphasis on spending a lot of time designing the system up front might be a bit misguided. Instead, the real measure of progress is working code, and a rich suite of tests gives us the courage to ensure that code remains clean. Then, when talking about testing, he stated:
Manual tests are like me handing you the source code without a computer and telling you to execute it.
He acknowledged that some manual tests are important, but that most manual testing should be exploratory. Acceptance tests, unit tests, and many other forms of tests can, and definitely should, be automated. All in all, a great kickoff day for what stands to be a great week.
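The automation point can be sketched in a few lines of Java (the discount rule and all names here are hypothetical, purely for illustration): an automated check is executable intent that a machine can rerun on every change, where a manual test requires a person to run the program and eyeball the output.

```java
public class PriceTest {
    // The code under test: a hypothetical discount rule, in integer cents.
    static long discountedCents(long cents) {
        return cents >= 10_000 ? cents * 90 / 100 : cents;
    }

    public static void main(String[] args) {
        // Automated checks a machine can rerun on every change,
        // replacing a manual "run it and inspect the output" step.
        check(discountedCents(5_000) == 5_000);   // below threshold: no discount
        check(discountedCents(10_000) == 9_000);  // at threshold: 10% off
        System.out.println("all checks passed");
    }

    static void check(boolean ok) {
        if (!ok) throw new AssertionError("check failed");
    }
}
```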
There are lots of benefits to modularity, some of which I discussed when introducing modularity patterns. But here’s a simple example, which serves as a prelude to some upcoming posts explaining a few of the patterns.
In the diagram at right, the top left quadrant shows a sample system with a relatively complex class structure. When change occurs within a single class, shown in red in the bottom left quadrant, understanding the impact of that change is difficult. The change can propagate to any class dependent on the class highlighted in red, so assessing its impact requires that we analyze the complete class structure. The ripple effect appears significant, and change instills fear.
But if the system is modular, with classes allocated to modules as shown in the bottom right quadrant, then the impact of change can be isolated to a discrete set of modules. This makes it much easier to identify which modules contain classes that might also change, as shown in the top right quadrant. Change is isolated to classes within modules that depend on the module containing the changing class.
This is a simple example, but it serves as evidence of the need for modular architecture, and illustrates one reason why modularity is so important. Modularity makes understanding the system easier. It makes maintaining the system easier. And it makes reusing system modules much more likely. As systems grow in size and complexity, it’s imperative that we design more modular software. That means we need a module system for the Java platform. It means that module system shouldn’t be shielded from enterprise developers. And it means we need to understand the patterns that are going to provide the guidance necessary in helping us design more modular software.
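The change-impact reasoning above can be made concrete with a small Java sketch (the module names and dependency graph are invented for illustration): instead of analyzing the full class structure, we walk the much smaller module dependency graph to find every module a change could ripple to.

```java
import java.util.*;

public class ChangeImpact {
    // Hypothetical module dependency graph: each key depends on its values.
    static final Map<String, List<String>> deps = Map.of(
        "billing", List.of("domain"),
        "reporting", List.of("domain"),
        "web", List.of("billing", "reporting")
    );

    // Modules affected by a change to 'changed': every module that
    // depends on it, directly or transitively (computed as a fixed point).
    static Set<String> impacted(String changed) {
        Set<String> result = new TreeSet<>();
        boolean grew = true;
        while (grew) {
            grew = false;
            for (var e : deps.entrySet()) {
                if (result.contains(e.getKey())) continue;
                for (String d : e.getValue()) {
                    if (d.equals(changed) || result.contains(d)) {
                        result.add(e.getKey());
                        grew = true;
                    }
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(impacted("domain"));   // [billing, reporting, web]
        System.out.println(impacted("billing"));  // [web]
    }
}
```

A change to the widely depended-upon "domain" module can ripple everywhere; a change to "billing" is contained, and we learn that from module-level dependencies alone.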
This fall, I’ll be speaking at a few conferences on agile architecture. Interestingly, a lot of the ideas expressed in these talks are not new to me. I gave a session titled From Code to Architecture at the now defunct SD Expo back in 2006 (give or take a year). Later that year, I gave a talk on Agile Architecture at SD Best Practices. Now, however, the technology is starting to catch up to the concepts I discuss in the presentations, making them easier to apply. The conferences I’m attending and the sessions I’m leading are listed below:
Session details vary slightly, but the general theme is consistent. To get a feel for the theme, take a look at a few of the blog entries where I’ve expressed my ideas surrounding agile architecture. If you happen to be attending any of these conferences, please feel free to seek me out. I’m always interested in a conversation on the topic. The blog entries are listed below.
In The Two Faces of Modularity & OSGi, I talked about the OSGi runtime and development models. The development model has two facets - a programming model and design paradigm - that impact how organizations will use OSGi to build more modular applications.
In Reuse: Is the Dream Dead, I talked about the failed promise of reuse. In stating that maximizing reuse minimizes use, I examined significant impediments to reuse and use - component dependencies and context dependencies.
Here, I talk about the benefits of modularity, focusing exclusively on the OSGi design paradigm. I review modularity’s relationship to reuse and touch on how modularity helps ease maintenance and improve extensibility. To wrap up, I present some modularity patterns that help balance module weight and granularity and make modules more reusable and useable, while also making the system easier to understand, maintain, and extend.
Almost all principles and patterns that aid in software design and architecture address logical design. Identifying the methods of a class, relationships between classes, and a system package structure are all examples of logical design. Since most principles and patterns emphasize logical design, it’s no surprise that the majority of developers spend their time dealing only with logical design issues. Other examples of logical design include deciding if a class should be a Singleton, determining if an operation should be abstract, or deciding if you should inherit from a class versus contain it. Developers live in the code, and are constantly dealing with logical design issues.
Making good use of object oriented design principles and patterns is important. Accommodating the complex behaviors required by most business applications is a challenging task, and failing to create a flexible class structure can have a negative impact on future growth and extensibility. But logical design is just one piece of the software design and architecture puzzle. The other piece is physical design. Physical design represents the deployable units composing a software system. Physical design is just as important as logical design, and physical design is all about modularity.
Almost all discussions on modularity mention reuse as a prime advantage. But there are other advantages, of course. There is a reduction in complexity that helps ease the maintenance effort. Cohesive modules encapsulate behavior and expose it only through well-defined interfaces. Because modules are cohesive, change is isolated to the implementation details of a module. Because behavior is exposed through interfaces, new modules containing alternative implementations can be developed without modifying existing modules. There are other benefits to modularity that extend beyond design to runtime, too. But from the design perspective, modularity helps increase reuse, ease maintenance, and increase extensibility.
Time for an example. Assume we define an interface to decouple client classes from all classes implementing the interface. In doing this, it’s easy to create new implementations of the interface without impacting other areas of the system. The principle surrounding this idea is the Open Closed Principle - systems should be open for extension but closed for modification. Logical design makes extending the system easier, but it’s also only half of the equation. The other half is how we choose to modularize the system.
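A minimal Java sketch of the idea, using hypothetical names: clients depend only on the interface, so the system is extended by adding a new implementation class rather than by modifying existing code.

```java
// Clients depend on this interface, never on a concrete implementation.
interface TaxCalculator {
    long taxOn(long cents);
}

class UsTax implements TaxCalculator {
    public long taxOn(long cents) { return cents * 7 / 100; }
}

// The extension point at work: a new implementation is added without
// touching TaxCalculator, UsTax, or any client code (open for extension,
// closed for modification).
class SwedishTax implements TaxCalculator {
    public long taxOn(long cents) { return cents * 25 / 100; }
}

public class Checkout {
    // A client: works with any implementation, present or future.
    static long totalCents(long cents, TaxCalculator tax) {
        return cents + tax.taxOn(cents);
    }

    public static void main(String[] args) {
        System.out.println(totalCents(10_000, new UsTax()));      // 10700
        System.out.println(totalCents(10_000, new SwedishTax())); // 12500
    }
}
```

The logical design question ends here; the physical design question - which module each of these classes lives in - is still wide open.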
Let’s assume the interface we’ve created has three different implementations, and that each of the implementation classes has underlying dependencies on other classes. We’re faced with a contentious decision. On one hand, grouping all the classes into a single module guarantees that change is isolated to that one module (i.e., easier to use and maintain). If anything changes, we’ll only have one module to worry about. Yet this decision results in a coarse-grained and heavyweight module (i.e., harder to reuse), and a desire to reuse a subset of that module’s behavior leaves us with one of two choices: duplicate code, or refactor the module into multiple lighter-weight and finer-grained modules. In general, logical design impacts extensibility, while physical design impacts reusability and usability. Well…this is only partially true.
As we refactor a coarse-grained and heavyweight module into something finer-grained and lighter weight, we’re faced with a set of tradeoffs. In addition to increased reusability, our understanding of the system architecture increases! We gain the ability to visualize subsystems and identify the impact of change at a level of abstraction above classes. In the example, grouping all classes into a single module may isolate change to that single module, but understanding the impact of the change is more difficult. With modules, we can assess the impact of change not only among classes, but among modules as well.
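The tradeoff can be sketched with hypothetical OSGi bundle manifests (the bundle and package names are invented, and the # lines are annotations for the reader, not valid manifest syntax). The coarse-grained bundle forces every client to resolve the dependencies of all three implementations; after refactoring, a client resolves only the implementation it actually uses.

```manifest
# Coarse-grained: interface plus all three implementations in one bundle.
# Reusing any one implementation drags in the dependencies of all three.
Bundle-SymbolicName: com.example.payment
Export-Package: com.example.payment
Import-Package: com.example.db, com.example.http, com.example.xml

# Finer-grained: a lightweight API bundle holding only the interface...
Bundle-SymbolicName: com.example.payment.api
Export-Package: com.example.payment

# ...and one bundle per implementation, each carrying only its own
# dependencies. More modules to manage, but far less to pull in.
Bundle-SymbolicName: com.example.payment.db
Import-Package: com.example.payment, com.example.db
```

Note that the cost shows up immediately: one module became several, and the wiring between them is now something clients have to manage.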
Unfortunately, if modules become too lightweight and fine-grained, we’re faced with the dilemma of an explosion in module and context dependencies. Modules depend on other modules and require extensive configuration to deal with context dependencies! Overall, as the number of dependencies increases, modules become more complex and difficult to use, leading us to the corollary we presented in Reuse: Is the Dream Dead:
Maximizing reuse complicates use.
Creating lighter weight and finer-grained modules increases reuse but also increases module and context dependencies, while creating fatter modules decreases dependencies but also decreases reuse. Modules that are too lightweight provide minimal value and may require other modules to be useful. Modules that are too heavyweight are difficult to reuse because they do more than what the client needs. Of course, there are other challenges beyond reuse.
Coarse-grained and heavyweight modules may do a good job of encapsulating change to a single module, but understanding the impact of change is more difficult. Conversely, fine-grained and lightweight modules make it easier to understand the impact of change, but even a small change can ripple across many modules. How do we deal with this problem, especially for large software systems where these challenges are even more pronounced?
Large software systems are inherently more complex to develop and maintain than smaller systems. In addition to increasing reuse, breaking a large system into modules makes the system easier to understand. By understanding the behaviors contained within a module, and the dependencies that exist between modules, it’s easier to identify and assess the ramifications of change.
For instance, software modules with few incoming dependencies are easier to change than software modules with many incoming dependencies. Likewise, software modules with few outgoing dependencies are much easier to reuse than software modules with many outgoing dependencies. This tension surrounding module weight and granularity is an important factor to consider when designing software modules.
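This incoming/outgoing tension is essentially what Robert C. Martin’s stability metrics capture. A small Java sketch (the scoring and example values are hypothetical): with afferent couplings Ca (incoming dependencies) and efferent couplings Ce (outgoing dependencies), instability I = Ce / (Ca + Ce) ranges from 0 (many depend on it, so change with care) to 1 (depends on many, but cheap to change).

```java
public class Instability {
    // I = Ce / (Ca + Ce), Robert C. Martin's instability metric.
    // 0.0 = maximally stable, 1.0 = maximally instable.
    static double instability(int afferent, int efferent) {
        // Treat an isolated module (no dependencies either way) as stable.
        if (afferent + efferent == 0) return 0.0;
        return (double) efferent / (afferent + efferent);
    }

    public static void main(String[] args) {
        // A hypothetical "domain" module: many incoming deps, no outgoing.
        System.out.println(instability(8, 0)); // 0.0 -> hard to change safely
        // A hypothetical "web" module: no incoming deps, many outgoing.
        System.out.println(instability(0, 5)); // 1.0 -> cheap to change
    }
}
```

Counting dependencies this way gives us a rough dial for the weight/granularity balancing act the patterns below address.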
Today, frameworks like OSGi aid in designing modular software systems. While these frameworks can enforce runtime modularity, they cannot guarantee that we’ve modularized the system correctly. Correct modularization of any software system is contextual and temporal: it depends on the project, and the natural shifts that occur throughout a project impact modularity. I examine our willingness to embrace this idea when discussing Agile Architecture.
Modularity patterns provide guidance and wisdom in helping design modular software. They explain ways that we can minimize dependencies while maximizing reuse potential. They help balance module weight and granularity to make a system easier to understand, maintain, and extend. For those who have attempted to design modular software, it’s common for modularity to drive the logical design decisions, which I briefly explained when discussing the SOLID design principles.
Below is a list of modularity patterns that I’ve used on past projects. These patterns also build atop proven object-oriented design concepts. Right now, the list includes only the pattern descriptions, with no detailed explanation. Hopefully, you can infer the general intent of each pattern from its name and brief description. Some may be intuitively obvious, others less so. In the meantime, I welcome your feedback, your questions, and your criticisms, as I work to provide more detail going forward.