Category Archives: Designing software

To estimate or not to estimate? Not the right question…


Over the past few years I’ve seen an increase in articles and posts about whether or not to do estimation (of cost, schedule, and effort) for software development projects. This is especially true when agile/iterative methods are used to develop software for which requirements are not readily known in advance.  There are actual “movements” set up to prove that estimating in and of itself is bad for software development.  At the same time, I’ve done more and more work for clients related to software benchmarking (to find best-in-class methods, tools, and combinations to develop software) and estimation (including price-to-win estimating).  I’m now convinced that “To estimate or not to estimate?” is simply the wrong question, or at least a premature question, for many companies.

Estimation is as fundamental to software development (and to any other development project or program) as ingredients are to cooking or oxygen is to life.  We might wish to discard or discredit the practice of estimation as an inconvenience, or even as the reason for software “failures.”  (Sidenote: The Standish Group’s annual CHAOS reports cite the lack of “on-time” and “on-budget” software delivery as rationale for declaring project failure; both factors would disappear if estimating were eliminated.)  The truth, however, is that C-level executives need a level of confidence (based on estimates) to bound their investment in new initiatives, no matter how much faith or confidence the executives have in the development teams’ ability to deliver.  In my humble opinion, project managers MUST develop the skills to do solid, reliable project estimates if they are to survive (and thrive).  But this is where things often fall apart: estimation is not treated as a discipline based on solid data, in part because some organizations estimate haphazardly using bad data, poor models, flawed assumptions, and premature input values taken as fact, among other factors.

This does not include those organizations where the mere notion of a project (a temporary endeavor intended to deliver an identified product, outcome, or service, such as a piece of software) is like a foreign language.  When I teach courses based on the Project Management Institute’s Project Management Body of Knowledge (PMBOK(R)), it’s not uncommon to find IT pros who profess that project management is not needed because their work is bounded solely by calendar months and the number of full-time staffers.  The idea that work should be managed towards a specified outcome (with goals, objectives, timelines, milestones, deliverables, and a formal end) just doesn’t fit their paradigm, even for those involved in developing advanced technology solutions.  I exclude these companies because projects (and estimates of cost and schedule) are beyond their comprehension, as are productivity, project comparisons, and process improvement.

Given the premise that “to estimate or not to estimate” is the wrong (or at least a premature) question – then what are the right ones?  Here’s a short list:

  • If we do an estimate, do we know the correct input variables (and values) to use?  (i.e., some idea of scope, non-functional requirements, constraints, goals, project environment, etc.)  Garbage in equals garbage out.
  • When estimating, do we have access to correct and appropriate historical data on which to rely? (i.e., does the historical completed project information accurately depict what actually happened on the project? Often up to 40% of true project work effort is not recorded – or it is recorded inconsistently.)  Incomplete or incorrect historical data make for poor comparisons, and even worse estimates.
  • Are the estimating models we propose appropriate for the industry and application?  (i.e., in construction, it would be folly to use a home-building model for a hospital or bridge construction project; so too with software.)  Every model, no matter how advanced, needs to be calibrated for the organization using it (a toy calibration sketch follows this list).
  • Do we know enough about the object of estimation? (i.e., if it is simply an idea about an outcome without any idea of component programs or projects, a “guess”timate or rough-order-of-magnitude may be the only possibility until more data are known.)
  • Is the estimating exercise paid mere “lip service” by management?  (i.e., does management summarily cut every estimate in half, or dictate due dates that override those of professional estimators?)
  • Does the organization take (software) measurement seriously?  (i.e., how are project measures and metrics collected? If ad hoc and inconsistent, without formal processes or procedures to validate the quality of project data, then estimating will likely be equally inconsistent.)
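
To make the calibration and historical-data points above concrete, here is a minimal sketch, in Python, of how an organization might fit a simple parametric effort model (Effort = a * Size^b) to its own completed projects rather than borrowing someone else’s industry constants.  The project numbers and the model form are illustrative assumptions only, not a recommendation of any particular estimating tool.

# Toy calibration of a parametric effort model: Effort = a * Size^b.
# The "historical" projects below are invented purely for illustration.
import math

# (functional size in FP, actual recorded effort in person-hours) of completed projects
history = [(250, 2100), (400, 3900), (620, 7000), (900, 11500), (1500, 22000)]

# Fit log(Effort) = log(a) + b * log(Size) by ordinary least squares
xs = [math.log(size) for size, _ in history]
ys = [math.log(effort) for _, effort in history]
n = len(history)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum((x - mean_x) ** 2 for x in xs)
a = math.exp(mean_y - b * mean_x)

def estimate_effort(size_fp):
    """Estimated effort (person-hours) for a project of size_fp function points."""
    return a * size_fp ** b

print(f"Calibrated model: Effort = {a:.2f} * Size^{b:.2f}")
print(f"Estimate for an 800 FP project: {estimate_effort(800):,.0f} person-hours")

Garbage in, garbage out applies directly here: if 40% of the real effort behind those historical data points was never recorded, the calibrated constants (and every estimate they produce) will be systematically low.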

These are just a few of the important questions that need to be addressed before we attempt to estimate and rely on the results of the practice.  When estimating is done without proper planning, discipline, and consistency, the results will be unreliable or, even worse, downright wrong.

In IT as in life, if you’re going to invest in an endeavor (such as estimating), take the time to do it right the first time, or don’t bother doing it at all.  And that really answers the question of “to estimate or not to estimate.”

What other questions are critical to ask?  What do YOU think?


10 Steps to Better Metrics… for Everyone


The more things change, the more they stay the same – especially when it comes to initiatives that involve cultural change. Measurement is a perfect example – and I’m not talking purely about “software metrics,” rather measurement in any industry.

When you take a business that has traditionally “flown by the seat of its pants” (in other words, it is a monopoly of sorts or it has made money in spite of itself) and start to keep track of what’s going on, people have issues.  The first step is often to measure anything that moves (data that are easy to capture) and then try to draw some sort of conclusion or action plan from it.  In IT (Information Technology), the landscape is littered with discarded data from failed measurement initiatives.  Data in and of themselves are not bad, IF the data are used appropriately and in the right context.

I recently wrote the following article for Projects at Work based on concepts I first observed nearly 20 years ago, and they are as valid today as ever before.

As a consultant, I LOVE to work with companies who want to succeed with measurement. If you are tasked with starting metrics for your company, give me a call – maybe I can give you some ideas to save you time and money – and succeed with metrics!

Send me an email or leave a comment – measurement is too important to leave to chance.  (Let me know if you’d like a full copy of this article!)

Cheers,
Carol

ProjectsAtWork: 10 Steps to Better Metrics (July 2015)

QSM (Quantitative Software Management) 2014 Research Almanac Published this week!


Over the years I’ve been privileged to have articles included in compendiums with industry thought leaders whose work I’ve admired.  This week, I had another proud moment as my article was featured in the QSM Software Almanac: 2014 Research Edition along with a veritable who’s who of QSM.

This is the first almanac produced by QSM, Inc. since 2006, and it features 200+ pages of relevant, research-based articles drawing on the SLIM(R) database of over 10,000 completed and validated software projects.

Download your FREE copy of the 2014 Almanac by clicking this link.


Combining Soft Skills and Hard Tools for Better Software


One of the more interesting topics in software development (at least from my perspective) is the culture of the industry.  Seldom does one find an industry burgeoning with linguistics majors, philosophers, artists, engineers (all types – classically trained to self-named), scientists, politicians, and sales people – all working on the same team in the same IT department.

This creates incredible diversity and richness, and it sometimes leads to astounding leaps and bounds in innovation and technological advancement, but it can also create challenges in basic workplace behavior.  This post looks at the often-overlooked soft skills (empathy, leadership, respect, communication, and other non-technical skills) together with technical competencies as an “opportunity” (aka a challenge or obstacle to overcome).

It was published first on the Project Management Institute (PMI) Knowledge Shelf, which was recently opened to the general non-PMI public.


Added bonus here:  in my post I referenced the 2013 commencement address that Australian comedian/actor Tim Minchin gave at the University of Western Australia (available on YouTube).  He shares nine recommendations with the graduates; my favorite, and the one I quoted, is #7: Define yourself by something you love!  I believe it’s worth the watch/listen if you need to take a break and just sit back and think about soft skills during your technical day. (Warning to the meek of heart: it’s irreverent, offensive, and, IMHO, bang on in its core sentiments.  If you’re offended, I apologize in advance!)

If you’d like a pdf copy of the post above, please leave me a comment with your email address!  (And even if you don’t, I’d love your opinion!)

Have a great week!

Carol

Latest installment of Ask Carol: No Matter What… in Project Management, Size Matters


Just wanted to share with you my latest installment on the QSM website blog: my “Dear Carol” advice column.  Enjoy!


Here is the link to the rest of the article:  http://www.qsm.com/blog/2013/ask-carol-no-matter-what-project-management-size-matters

Measurement and IT – Friends or Frenemies?


I confess, I am a software metrics ‘geek’… but I am not a zealot!  I agree that we desperately need measures to make sense of what we are doing in software development and to find pockets of excellence (and opportunities for improvement), but it has to be done properly!

Most (process) improvement models, whether they pertain to software and systems, manufacturing, or even children and people, attest to the power of measurement; these include the CMMI(R) (Capability Maturity Model Integration) and SPICE (Software Process Improvement and Capability dEtermination) models.

But we often approach what seems to be a simple concept, “measure the work output and divide it by the inputs,” ass-backwards (pardon my French!).

Anyone who has been involved with software metrics, function points, or CMMI/SPICE gone bad can point to the residual damage of overzealous management (and the supporting consultants) leaving a path of destruction in their wake.  I think that Measurement and IT are sometimes the perfect illustration of the term “Virtual Frenemies” (I’ll lay claim to it!) when it comes to poorly designed software metrics programs.  (The concepts can be compatible, but you need proper planning and open-minded participants! Read on…)

Wikipedia (yes, I know it is not the best source!) defines “frenemy” (alternately spelled “frienemy”) as:

a portmanteau of “friend” and “enemy” that can refer to either an enemy disguised as a friend or someone who’s both a friend and a rival. The term is used to describe personal, geopolitical, and commercial relationships both among individuals and groups or institutions. The word has appeared in print as early as 1953.

Measurement as a concept can be good. Measure what you want to improve (and measure it objectively, consistently, and then ensure causality can be shown) and improve it.

IT as a concept can be good. Software runs our world and makes life easier. IT’s all good.

The problem comes when someone (or some team) looks at these two “good” concepts, says “let’s put them together,” makes the introduction, and then walks away.  “Be sure to show us good results and where we can do even better!” is the edict.

Left to its own devices, measurement can wreak havoc and run roughshod over IT: the wrong things are measured (“just measure it all with source lines of code or FP and see what comes out”), effort is spent measuring those wrong things (“just get the numbers together and we’ll figure out the rest later”), the data don’t correlate properly (“now how can we make sense of what we collected?”), and misinformation abounds (“just plot what we have, it’s gotta tell us something we can use”).

In the process, the people working diligently (most of the time!) in IT get slammed by data they didn’t participate in collecting, and which often illustrates their “performance” in a detrimental way.  Involvement in the metrics program design, on the part of the teams who will be measured, is often sparse (or an afterthought), yet the teams are expected to embrace measurement and commit to changing whatever practices the resultant metrics analysis says they need to improve.

This happens often when a single measure or metric is used across the board to measure disparate types of work (using function points to measure work that has nothing to do with software development is like using construction square feet to measure landscaping projects!)

Is it any wonder that the software and systems industries are loath to embrace and take part in the latest “enterprise-wide” measurement initiative? Fool me once, shame on you… fool me twice, shame on me.

What is the solution to resolving this “Frenemies” situation between Measurement and IT?  Planning, communication, multiple metrics and a solid approach (don’t bring in the metrics consultants yet!) are the way.

Just because something is not simple to measure does not make it not worth measuring – and measuring properly.

For example, I know of a major initiative where a customer wants to measure the productivity of SAP-related projects to gain an understanding of how the cost per FP tracks on their projects compared to other (dissimilar) software projects and across the industry.

Their suppliers point out that function points (a measure of software functionality) do not work well for configurations (this is true) or for integration work (this is true), and that it can take a lot of effort to count FP for large SAP implementations (this can be true).  However, that does not mean that productivity cannot be measured at all!  (If all you have is a hammer, everything might look like a nail.)

It will require planning and design effort to arrive at a measurement approach that equitably and consistently tracks productivity across these “unique” types of projects. While this is non-trivial, the insight and benefits to the business will far exceed the effort.  Resistance on the part of suppliers to being measured (especially in anticipation of an unfair assessment based on a single metric!) is justified, but a good measurement approach, one that fairly allocates the different types of effort into different buckets with different measures (see the sketch below), is definitely attainable (and desired by the business).
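
As a sketch of what “different buckets using different measures” could look like, here is a small Python illustration in which each type of SAP-related work is normalized by its own size measure.  The bucket names, measures, and figures are hypothetical assumptions for illustration, not a prescription for any particular supplier or program.

# Hypothetical split of one SAP program's effort into buckets, each normalized
# by its own size measure, so a single metric is never forced onto work it doesn't fit.
buckets = {
    # bucket name: (recorded effort in person-hours, size value, unit of size)
    "custom development":       (4800, 600, "function points"),
    "configuration":            (3200, 450, "configured transactions"),
    "integration / interfaces": (2100,  35, "interfaces"),
}

for name, (effort_hours, size, unit) in buckets.items():
    delivery_rate = effort_hours / size
    print(f"{name}: {delivery_rate:,.1f} hours per {unit[:-1]}")  # crude singularization

Each bucket can then be benchmarked only against history that was measured the same way, which removes the “unfair single metric” objection while still giving the business a view of where its money is invested.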

The results of knowing where and how the money is “invested” in these projects will lead to higher levels of understanding on both sides, and open up discussions about how to better deliver!  The business might even realize where they can improve to make such projects more productive!

Watch my next few posts for further details about how to set up a fair and balanced software measurement program.

What do you think?  Are measurement and IT doomed to be frenemies forever? What is your experience?

Have a good week!
Carol

Walking on Eggshells – a Normal part of Business?


We have all been there – walking on eggshells to avoid outbursts from  a boss, co-worker, or client.  So we skirt the issue, pretend the bad behavior doesn’t exist, ignore the problem, and spend extra time planning how to present an issue so that the person in question doesn’t explode.

While I know that this type of behavior is rampant in business (I’ve experienced it more than once!), it has serious (and expensive) consequences in the IT industry.  The repercussions of having to “walk on eggshells” to avoid someone’s potential wrath range from minor “oversights” to full-scale project failure.

The Challenger disaster is one such failure, where group-think and avoidance of conflict ended up costing lives and millions of dollars.  A video chronicling the group-think behavior exists, and companies do take steps to address such behaviors. This is all good.

The Walking on Eggshells “syndrome”

Aside from the pressures of group-think, which encourage teams to conform and cooperate with a single mindset, the “walking on eggshells” syndrome is seldom documented or even discussed.

We all know at least one offender in our workplace!  The offending person may be a narcissist, a control freak, a bully or just plain immature. No matter the clinical diagnosis, our boardrooms and our production labs suffer greatly from their presence.

How much time and money could be saved by confronting these people and addressing the cost of their ‘verbal diarrhea’? Here are the types of situations I mean:

  • Not raising issues:  “How can we possibly bring up the design flaw for this software now?  The project sponsor will yell and rant if the project is late. Remember how he “freaked out” at the status meeting last month?  Keep going and we’ll address this as an enhancement.”  Cost: could be significant.
  • Cutting corners: “There is no way we can finish the project within the approved budget.  We had no idea that xxx would be so complex, but the steering committee will fire us if we ask for more money. Let’s just cut testing so we can finish the project.”  Cost: could be critical if public safety is at stake.
  • Perpetuating the myth that project plans are right:  “The project plan was based on incomplete data that seemed right six months ago. Now we know more, but with a fixed budget and schedule, we’re stuck. The client will explode if we bring up that the plans were flawed. Let’s just do what we can.” Cost: Corporate learning will never happen (Einstein: insanity is doing the same things over and over and expecting different results.)
  • Skirting accountability: “We messed up on the schedule because we had to program that module 3 times.  I couldn’t understand what they wanted until this time, but we can’t tell Bob because you know how he gets.  I hate it when he yells in a staff meeting.” Cost: could range from minor to significant.
  • Wasting time and money: “This project is doomed – the users told us they won’t use the system even if we finish it. The rationale and vision for the project are no longer correct (it’s a dog!) but if we tell the boss about it, you know how she will blame us. I don’t think it is even worth trying to explain it.  Just keep going.”  Cost:  priceless – time and money could be better spent on REAL work!

What a colossal waste of time, money, energy, and morale “walking on eggshells” is to a business!  It is not an easy situation to fix or address – especially when we are talking about people in power who behave badly.

Walking on eggshells should never be part of the way of doing business!  What is the solution?

If YOU were the king of your IT kingdom, what would YOU do?  I’ll publish a summary of responses – add your comments below – or send them privately to me by email to dekkers (at) qualityplustech (dot) com.

Have a productive week!

Carol

Technology at the Speed of Sound… Catch up at Light Speed at LSSC11!


I have to admit that I have not programmed a computer for many years, and I have no idea how to write Visual Basic, Java, or .NET code.  (Yes, I have done raw HTML and could still manage in SAS if I had to!)

And I can also admit that the proliferation of models and methods, from Agile, Scrum, Kanban, Lean, Six Sigma, and Extreme Programming to CMMI(R) (Capability Maturity Model Integration) and manifesto-driven systems development, has me daunted.

Which ones are interchangeable, which are compatible, which are contradictory, and (OMG) where should one start to learn how to avoid the pitfalls?

It’s all a matter of diving in to each and learning the ins-and-outs and best practices — except I know that there are as many opinions about best practices as there are models.

Okay, I can already hear you saying:  “Carol, as a software measurement/function point analysis specialist and a PMP (Project Management Professional), you probably don’t need to know much about any of these.”  Except that I do!

My clients ask me about these and other emerging topics, and they expect that I know the best course of action to take with Lean/Kanban/Agile methods for their company.  And I simply do not have the spare time to read all the books, experiment, fail, try again, and then decipher what is hype and what is real in this space!

So what is a sane, intelligent, forward-thinking IT professional like me to do when faced with such an overwhelming mountain of information, so that I can properly advise my clients?

The answer is: ATTEND the Lean Software & Systems Conference 2011 (#LSSC11)!

No other conference in the same timeframe offers more than this growing event!

Being held 3-6 May 2011 at the Hyatt Long Beach in Southern California, this annual conference boasts over 90 speakers across three days, covering a wide range of topics.

See the full program at http://lssc11.leanssc.org

All given by practitioners and experts, for practitioners!  The majority of presenters have real-world, hands-on experience in the trenches with Kanban, Lean, Agile, and Scrum, and lived to tell about it!

Why not catch up to technology racing at the speed of sound by accelerating your learning to light speed at LSSC11?

There is no similar conference offering the breadth or depth of topics, experience sharing, or real case study results in the Kanban and Lean software development space.

And the May 5, 2011 Brickell Key Awards for excellence in the advancement of Lean in software development will recognize some outstanding nominees working in the area, some of whom are professionals just like you!

I know that I will catch up on these major topics and more – in record time – by attending LSSC11.  If you are an IT professional, don’t you owe it to yourself to check it out?

Read about the Program and the Speakers at LSSC11 and register today!

Wishing you an eventful and productive week and I hope to see you in Long Beach on May 3, 2011!

Regards,
Carol


There’s no time for “One of these days is none of these days” in IT


Let’s face it: an investment in software development can be one of the most expensive and risk-laden endeavors a corporation can make, especially when requirements volatility and uncertainty are part of the mix.  In medieval times (for software development, this would be pre-2000), customers and developers often jostled over requirements and clashed on issues of flexibility during strict waterfall development.  Then, after months of protracted requirements discovery, the construction process would frequently be disrupted by “unanticipated change”.

Fortunately, for everyone involved, innovative techniques flexible to change and offering shorter, more iterative delivery cycles emerged.  Today, a myriad of choices exist that embrace flexibility and deliver high-quality software more effectively and efficiently than ever thanks to agile, XP, scrum, CMMI, project management and other concepts.  Additionally, methods that worked well in other industries have been adapted and tailored for use in IT (six sigma, kaizen, TQM, theory of constraints, just-in-time, Kanban, etc.).

A few of the most fundamental yet simple concepts of Lean/Kanban/Just-in-Time development (and other methods) include minimizing waste (non-value-added activities), removing roadblocks and delays, and limiting work-in-progress (WIP).

If we consider software development as a “pipeline” of constantly flowing work packets, any unresolved blockage can end up clogging the entire system.  As such, when life intervenes (as it always does) to interrupt or delay the flow of a work packet, approaches such as Kanban highlight potential blockages before they clog the entire flow.  Teams can then identify remedial actions so that the right work is done at the right time, freeing up resources to concentrate on work where they can be highly productive.  Moreover, while waterfall development commonly fills the pipeline beyond capacity (without realizing it), Kanban limits the input stream to match the available capacity.  Once work is actively in the pipe, a critical delay (“one of these days…”) signals trouble, and the work can be removed (“none of these days”) until it can be productively worked again, or triaged to prevent ongoing delays.  If we are serious about reducing waste (non-value-added activity) and delays, IT should never tolerate the proverbial “one of these days is none of these days” delays in process.
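
As an informal sketch of the “limit the input stream to match capacity” idea (my own toy illustration in Python, not code from any real Kanban tool), the column below simply refuses a new pull once its WIP limit is reached, so a bottleneck becomes visible instead of silently overfilling the pipeline.

# Toy illustration of a WIP-limited Kanban column (an assumed example, not a real tool).
class KanbanColumn:
    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.items = []

    def pull(self, work_item):
        """Pull a work item into this column only if the WIP limit allows it."""
        if len(self.items) >= self.wip_limit:
            print(f"[{self.name}] WIP limit {self.wip_limit} reached; '{work_item}' waits upstream")
            return False
        self.items.append(work_item)
        return True

    def finish(self, work_item):
        """Completing an item frees capacity, letting the next item flow."""
        self.items.remove(work_item)

develop = KanbanColumn("Develop", wip_limit=2)
for item in ("feature A", "feature B", "feature C"):
    develop.pull(item)        # the third pull is refused: the bottleneck is visible
develop.finish("feature A")   # finishing work frees capacity...
develop.pull("feature C")     # ...and the waiting item flows again

Waterfall planning, by contrast, tends to load the whole pipeline up front and only discovers the overload when dates start to slip.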

Unexpected events and delays are a natural part of life (and software development).  The difference with Kanban and JIT is that there is a conscious response to bottlenecks and work stoppage rather than panic or blame reactions (OMG!  What do we do now?) typical of waterfall development.

Can Kanban co-exist with Agile development?

I believe so (and perhaps this is the attitude of a naive believer in both).  Can Kanban co-exist with Waterfall development? Again, I believe the answer is yes.  What do you think?

Don’t forget to register (and tell your management to register!) for the upcoming FREE knowledge webinar featuring LSSC11 chairperson, Kanban expert, and author David Anderson on Lean Decision Making, January 18, 2011, from noon to 1 pm Pacific Standard Time (3-4 pm Eastern Standard Time). Here’s what this FREE webinar is all about:

Lean thinking assumes it is economically better to build a culture of trust in an organization of empowered individuals to accelerate decision-making and shorten lead times than to mitigate risk with bureaucracy and delayed delivery. Understand the economic model controlling the forces of risk, uncertainty, business value, quality, cost and schedule that affect decision-making and process policy setting. This webinar will show how lean thinking can translate into process and policy decisions more closely aligned with business goals.

Register today to reserve your webinar place!

Why not make 2011 the year that you make a difference in the world of software development by getting up to speed on Lean approaches?  Plan to join us 3-6 May 2011 in Long Beach, CA, USA for the Lean Software and Systems Conference (LSSC11) – register today!

Carol

What’s the (function) point of Measurement?


It’s been more than 30 years since “function point analysis”  emerged in IT and yet most of the industry either: a) has never heard of it; b) has a misguided idea of what function points are; or c) was the victim of a botched software measurement program based on function points.

Today I’d simply like to clear up some common misconceptions about what function points are and what they are NOT. Future postings will get into the nuts and bolts of function points and how to use them; this is simply a starting point.

What’s a function point?

A “function point” (FP) is a unit of measure that gauges the functional size of a piece of software.  (I published a primer on function points titled Managing (the Size of) Your Projects – A Project Management Look at Function Points in the February 1999 issue of CrossTalk, the Journal of Defense Software Engineering, from which the following is excerpted):

“FPs measure the size of a software project’s work output or work product rather than measure technology-laden features such as lines of code (LOC). FPs evaluate the functional user requirements that are supported or delivered by the software. In simplest terms, FPs measure what the software must do from an external, user perspective, irrespective of how the software is constructed. Similar to the way that a building’s square measurement reflects the floor plan size, FPs reflect the size of the  software’s functional user requirements…

However, to know only the square foot size of a building is insufficient to manage a construction project. Obviously, the construction of a 20,000 square-foot airplane hangar will be different from a 20,000 square-foot office building. In the same manner, to know only the FP size of a system is insufficient to manage a system development project: A 2,000 FP client-server financial project will be quite different from a 2,000 FP aircraft avionics project.”

In short, function points are an ISO-standardized measure that provides an objective number reflecting the size of what the software will do from an external “user” perspective (a “user” is any person, thing, other application software, hardware, or department: anything that sends data to or receives data from the software).  Function points offer a common denominator for comparing different types of software construction, whereby cost per FP and effort hours per FP can be determined.  This is similar to cost per square foot or effort per square foot in construction.  However, it is critical to know that function points are only part of what is needed to do proper performance measurement or project estimating.
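
To make “cost per FP and effort hours per FP” concrete, here is a tiny worked example in Python; the figures are invented for illustration and are not industry benchmarks.

# Hypothetical completed project, sized in function points (FP); all numbers are made up.
size_fp = 2000            # counted functional size
effort_hours = 14000      # total recorded project effort
cost_dollars = 1_400_000  # total project cost

hours_per_fp = effort_hours / size_fp   # delivery rate: 7.0 hours per FP
cost_per_fp = cost_dollars / size_fp    # unit cost: $700 per FP

print(f"Delivery rate: {hours_per_fp:.1f} hours/FP, unit cost: ${cost_per_fp:,.0f}/FP")

As the excerpt above cautions, such unit rates only become meaningful when compared against projects of a similar type; 7 hours per FP on a client-server financial system and 7 hours per FP on aircraft avionics are not the same achievement.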

To read the full article, click on the title Managing (the Size of) Your Projects – A Project Management Look at Function Points.

To your successful projects!

Carol

Carol Dekkers
email: dekkers@qualityplustech.com
http://www.qualityplustech.com/

Carol Dekkers provides realistic, honest, and transparent approaches to software measurement, software estimating, process improvement and scope management.  Call her office (727 393 6048) or email her (dekkers@qualityplustech.com) for a free initial consultation on how to get started to solve your IT project management and development issues.

For more information on northernSCOPE(TM), visit www.fisma.fi (English pages), and for upcoming training in Tampa, Florida (April 26-30, 2010), visit www.qualityplustech.com.

Contact Carol to keynote your upcoming event – her style translates technical matters into digestible soundbites, with straightforward and honest advice that works in the real world!
=======Copyright 2010, Carol Dekkers ALL RIGHTS RESERVED =======