The Ones Most Adaptable to Change Survive.

By Mimi Dionne posted 11-02-2011 23:34


Sustaining adequate quality control while assembling your electronic records management system (ERMS) is a huge piece of the implementation. Given the slightest provocation, your ERMS will break, and that has dangerous long-term repercussions for the team. Once quality drops below what users expect of the software, it is terrifically difficult for the reputations of the software and the implementation team to recover from perceived low quality.

How do teams sustain the internal quality of software development?

  • Sustainable pace
  • Early identification of internal quality problems
  • Close collaboration
  • Refactoring
  • Small batches of work
  • Defining technically done
  • Potentially shippable product increments
  • Single work queue

What is a sustainable pace? According to the Agile Manifesto, “Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.” In other words, there is danger in taking on too much work.  Teams can’t deliver quality and quantity at a fast pace indefinitely.  So, retooling occurs—but sometimes it isn’t easy to see the long-term implications of small retooling.

So, what can you use to identify internal quality issues early?

  • Unit test frameworks
  • Static code analysis
  • Continuous integration

What’s a unit test? Here’s what it’s not. A test is not a unit test if:

  • It talks to the database
  • It communicates across the network
  • It touches the file system
  • It can’t run at the same time as any of your other unit tests
  • You have to do special things to your environment (such as editing configuration files) to run it
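To make the positive definition concrete, here is a minimal sketch of a real unit test using Python's built-in unittest framework. The RetentionSchedule class is hypothetical, invented here for illustration; notice the tests touch no database, no network, and no file system.

```python
import unittest

class RetentionSchedule:
    """Hypothetical ERMS domain object: maps record types to retention years."""
    def __init__(self):
        self._rules = {}

    def set_retention(self, record_type, years):
        if years < 0:
            raise ValueError("retention period cannot be negative")
        self._rules[record_type] = years

    def years_for(self, record_type):
        return self._rules.get(record_type, 0)

class TestRetentionSchedule(unittest.TestCase):
    """No database, no network, no file system: just the object under test."""
    def test_set_and_read_retention(self):
        schedule = RetentionSchedule()
        schedule.set_retention("invoice", 7)
        self.assertEqual(schedule.years_for("invoice"), 7)

    def test_unknown_record_type_defaults_to_zero(self):
        self.assertEqual(RetentionSchedule().years_for("memo"), 0)

    def test_negative_retention_rejected(self):
        with self.assertRaises(ValueError):
            RetentionSchedule().set_retention("invoice", -1)

# Run with: python -m unittest <this_file>
```

Because the tests are this isolated, they run in milliseconds and can run in parallel with every other unit test on the team.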

What’s static code analysis? Activities that range from validating code-formatting rules to finding design issues in the code. What’s continuous integration? All members of the team work from a single, common source repository, executing an automated build and verification process on small batches of work.
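For illustration, here is a tiny static-analysis rule written with Python's standard-library ast module: flag functions that lack a docstring. This is a toy check, one assumed rule among the hundreds that real analysis tools apply, from formatting all the way up to design problems.

```python
import ast

def functions_missing_docstrings(source):
    """Return the names of functions in `source` that lack a docstring.

    A toy static-analysis rule: the code is inspected without being run.
    """
    tree = ast.parse(source)
    missing = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and ast.get_docstring(node) is None:
            missing.append(node.name)
    return missing

sample = '''
def documented():
    "Explains itself."
    return 1

def undocumented():
    return 2
'''
print(functions_missing_docstrings(sample))  # → ['undocumented']
```

Checks like this run in continuous integration on every commit, so quality problems surface while the offending change is still small.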

Close collaboration means physical closeness—as in, the implementation team should sit within 40 feet of each other. Refactoring means improving the internal structure of existing code without changing its external behavior; it is how you keep the system malleable as your Power Users interact with the software and amend requirements in small batches (how often do end users say to you in the middle of a project, “now I need it to do this”?). Defining “done” is a significant accomplishment, and certain steps can get you there:

  • Brainstorm the taxonomy, folksonomy, ontologies. Write down object definitions essential for release.
  • Separate out objects or artifacts that cannot be updated or finished.
  • Capture obstacles—be honest. Identify problem areas in the hierarchies (and the departments that own them). Remember, ERMSs are highly political. Be gentle with everyone involved. As implementation leader, you’re looking to resolve issues so that your project becomes more predictable.
  • Team commitments: the whole team must conform to the Definition of Done established by group consensus. Story board it. 
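The steps above can be sketched as a machine-checkable checklist. This is a hedged illustration: the criteria and the story below are invented, and your team's real Definition of Done must come from group consensus, not from this sketch.

```python
# Hypothetical Definition of Done for an ERMS story; the criteria
# listed here are invented for illustration.
DEFINITION_OF_DONE = [
    "taxonomy terms reviewed",
    "unit tests passing",
    "integrated into architecture",
    "documentation updated",
]

def is_done(story):
    """A story is done only when every team-agreed criterion is met."""
    return all(story.get(criterion, False) for criterion in DEFINITION_OF_DONE)

story = {
    "title": "Invoice retention rules",
    "taxonomy terms reviewed": True,
    "unit tests passing": True,
    "integrated into architecture": True,
    "documentation updated": False,  # one unmet criterion blocks "done"
}
print(is_done(story))  # → False
```

The point of making the checklist explicit is that “done” stops being a matter of opinion: either every criterion is met or the story stays on the board.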

Remember, you want thin, vertical slices of functionality—they allow the team to develop an ERMS that is fully tested, integrated into the system architecture, and documented. Finally, create a single work queue. Prioritize the work into high, medium, and low (or similar). Don’t forget to make room in the work breakdown structure (WBS) for the three varieties of bugs: enhancements, or requests for improvement; defects, or system faults; and fires, or defects that must be fixed immediately because of their impact.
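A single work queue ordered by the three bug varieties above might look like this minimal Python sketch (the class and the work items are invented for illustration):

```python
import heapq
import itertools

# Priority by bug variety: fires before defects before enhancements.
PRIORITY = {"fire": 0, "defect": 1, "enhancement": 2}

class WorkQueue:
    """Single work queue: one ordered list the whole team pulls from."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tiebreak within a priority

    def add(self, variety, description):
        entry = (PRIORITY[variety], next(self._counter), variety, description)
        heapq.heappush(self._heap, entry)

    def next_item(self):
        _, _, variety, description = heapq.heappop(self._heap)
        return variety, description

queue = WorkQueue()
queue.add("enhancement", "nicer folder icons")
queue.add("fire", "retention rule deletes active records")
queue.add("defect", "search misses scanned PDFs")
print(queue.next_item())  # → ('fire', 'retention rule deletes active records')
```

Because there is exactly one queue, nobody has to debate which list matters most: the team always pulls the next item off the top.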

Above all, create a beautiful progress dashboard for your Project Sponsor (read Tufte first before you design it).  Raise eyebrows: cite lines of code amended, quantify complexity, and number events.  Establish a formula that calculates technical debt and transform it into a pictorial overview.  If you need help with this, give me a call.
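There is no canonical technical-debt formula, so treat the following as one illustrative weighted-sum sketch: the indicator names and weights are assumptions to be tuned with your team, not a standard.

```python
# Illustrative technical-debt score; the weights and inputs are
# assumptions, not a standard formula. Tune them with your team.
WEIGHTS = {
    "open_defects": 3.0,
    "untested_modules": 2.0,
    "overdue_refactorings": 1.5,
}

def technical_debt_score(metrics):
    """Weighted sum of debt indicators; a higher score means more debt."""
    return sum(WEIGHTS[name] * metrics.get(name, 0) for name in WEIGHTS)

snapshot = {"open_defects": 4, "untested_modules": 6, "overdue_refactorings": 2}
print(technical_debt_score(snapshot))  # → 27.0
```

Plot that score sprint over sprint and you have the pictorial overview: a single line the Project Sponsor can watch rise or fall.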

Thanks to author Chris Sterling. See my previous post for book details.

#agile #scrum #SharePoint #ElectronicRecordsManagement #softwaredevelopment