The human factor in project management resource planning: Thomas Schlereth dealt with this in Part 1 of our series on resource management. In this continuation, by contrast, the focus is on the "colleague computer" and how it supports the PM. But the human effect also plays a role in project management software, as we will see ...
Planning of any kind is complex, hard to predict and volatile, and thus almost a matter of luck. So it seems obvious to hand this task to a computer and its software, because the task is extensive, requires a lot of data and therefore a lot of calculations.
The first solution approach is simple visualization. The planning and its updating are done entirely by a human being and merely visualized in a computer program. At most, the simplest mathematical operations such as summation or conditional formatting (in the Excel world) are added. For the planning itself this is irrelevant, because the actual content comes entirely from humans, apart from a simple sum per period. The only advantage of this kind of planning is its comprehensibility (for humans) through the visual representation.
Sounds simple? It is, and it is also very common: everybody has seen capacity planning with Excel® in a company. Each employee occupies a row, and the columns represent weeks or months. The project planner then enters in the columns how many hours, or what percentage of the employee's available time, is utilized. The quality of the value in such a cell depends entirely on the skills of the person typing it in: there is no mathematical basis for this number. It is an estimate.
At the next stage of computer-aided planning, no simple cross tables are filled in; instead, known data are recorded as data records (in databases). From these records, real computer programs can generate different views of the project in tabular or graphical form. The data is thus assembled and presented in ever new ways, but it still remains data. It is not information that motivates action. That is left to the person who interprets the data and draws the right conclusions for action. The pure visualization machine now becomes a data storage and visualization machine.
The assumption here is: if as many people as possible enter as much data as possible into this central database, the possible insights and the resulting positive actions will keep improving. In practice this regularly turns out to be a fallacy, because: "A lot of data remains a lot of data."
Algorithms transform data into information
Some aspects of the design of such helping algorithms deserve emphasis here.
The first point is how difficult it is to predict the dimensions of a future job. In the relatively simple environment of a factory, or in vacation planning, this is easy: the vacation ends, in terms of dates, exactly when the vacation is over. The quality of the vacation plays only a subordinate role in planning.
A new design step in mechanical engineering, the development of a computer program, or predicting when management will make a decision is a greater challenge. We have to admit that there are things we cannot predict, at least not accurately.
But we are not completely blind or ignorant! We may not be able to predict the work, the duration and therefore the time required and the workload exactly, but we can approximately. We may not know exactly how long it takes to develop a computer program, but we do know that it doesn't happen in a minute. And we also know, on the other hand, that it will not take 10,000 years.
So we can estimate an interval and thus an imprecise value for start, end, duration and effort. In planning, we don't say "this will take exactly 46 hours"; we estimate "30 to 60 hours." This is the interval.
For the computer, i.e. the software that is supposed to process this, it is not so easy. There is not just one value to calculate (46 hours), but several (30, 31, 32, 33 and so on). Because of this imprecise specification, the dates for the end of the work, and for the start of any following work, also become imprecise in interlinked planning. The start is then not exactly on June 3, but occurs sometime between June 3 and July 1.
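As a minimal sketch of this propagation (my own illustration, not Can Do's actual algorithm), consider how an imprecise duration turns the successor's start date into a window:

```python
# Sketch: a fixed start plus an uncertain duration (here 30-60 days)
# yields a start *window* for the dependent task, not a single date.
from datetime import date, timedelta

def successor_start_window(start, min_days, max_days):
    """Return the earliest and latest possible start of the follow-up task."""
    return (start + timedelta(days=min_days), start + timedelta(days=max_days))

earliest, latest = successor_start_window(date(2023, 6, 3), 30, 60)
print(earliest, latest)  # 2023-07-03 2023-08-02
```

Every further dependency in the chain widens (or at least inherits) such a window, which is exactly why precise single-date arithmetic no longer suffices.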
This creates a dangerous combination for the algorithm. The work package can start on June 3 and last 30 days, or start on June 4 and last 30 days, or 31 days, and so on.
So the combinations multiply. With several dependencies in one project plan, possibly involving other work in other projects, the number of combinations can quickly grow exponentially, which sooner or later overwhelms the machine.
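A quick back-of-the-envelope calculation (with invented numbers) shows how fast this escalates:

```python
# Hypothetical illustration: each work package contributes
# (start options x duration options) cases; linked packages multiply.
start_options = 3      # e.g. the package may start on June 3, 4 or 5
duration_options = 31  # e.g. 30 to 60 days inclusive
per_package = start_options * duration_options  # 93 cases for one package

# With n interdependent packages, the case count grows exponentially:
for n in (1, 2, 5, 10):
    print(n, "packages:", per_package ** n, "combinations")
```

Two packages already mean 8,649 combinations; ten mean roughly 4.8 x 10^19, far beyond brute-force enumeration.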
Carl Friedrich Gauss helps once again
The scenario described above assumes that every value in the estimation interval is equally probable. That is, for a simple work package estimated at five to ten days, each case (5, 6, 7 days and so on) would be equally likely. In practice this is not so. The probability that a computer program will take one minute to develop is as small as the probability that it will take 1,000 years. The same holds for the work package: in reality, 5 days and 10 days are not very realistic. The mean value, 7.5 days, is more probable.
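A simple way to model this (my own stand-in for the bell-shaped weighting described here) is a triangular distribution: values near the middle of the interval get the highest weight, the extremes get none.

```python
# Sketch (assumption: triangular weighting as a simple stand-in for a
# bell curve): durations near the middle of the 5-10 day interval are
# more likely than the extremes.
def triangular_weight(d, lo=5, hi=10):
    mode = (lo + hi) / 2  # 7.5 days, the most probable value
    if d < lo or d > hi:
        return 0.0
    if d <= mode:
        return (d - lo) / (mode - lo)
    return (hi - d) / (hi - mode)

for d in (5, 6, 7, 8, 9, 10):
    print(d, "days -> weight", round(triangular_weight(d), 2))
```

The exact shape of the curve matters less here than the principle: the simulation should not treat 5 days and 7.5 days as equally likely.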
It is also possible that the real duration lies outside the interval, for example 3 days or 12 days. This could be extended almost infinitely, up to the case that the work package is not done at all (zero days) or runs on forever.
However, these considerations only add complexity and can be ignored here.
In the case of so-called demands, meaning entire future projects that have not yet been clearly defined, the scenario of "not happening at all" can occur. Here, a basic probability of occurrence is used in a simulation, i.e.: "How likely is it that the project will be realized at all?" The possibilities range from 0% (guaranteed not to be realized) to 100%.
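In the simplest form, such an occurrence probability can be used to weight a demand's expected load (a sketch with invented project names and numbers):

```python
# Sketch: weighting future "demands" by their probability of occurrence.
# All names and values here are invented for illustration.
demands = [
    {"name": "Project X", "hours": 400, "p_occurrence": 0.6},
    {"name": "Project Y", "hours": 1000, "p_occurrence": 0.2},
]

# Expected capacity demand across the portfolio of uncertain projects
expected_load = sum(d["hours"] * d["p_occurrence"] for d in demands)
print(expected_load)  # 440.0 hours
```

A demand with 0% occurrence contributes nothing to expected load; one with 100% contributes its full effort, matching the range described above.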
Applied here, the Gaussian normal distribution would measure, for each job, the gap between the plan (the estimate) and reality (how long it really took) and derive a confidence level. In a central computer system, all work packages originally recorded as taking 5-10 days would thus be compared against the reality that actually occurred.
It might turn out that most packages with these parameters actually took 7.5 days to complete. This allows a conclusion about all future 5-10 day packages, namely that they will probably take 7.5 days. But it may just as well turn out that the most common case is not 7.5 days, but 9 days.
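The calibration idea can be sketched in a few lines (the historical durations below are invented for illustration):

```python
# Sketch: learn from history which actual duration is most common for
# packages that were estimated at "5-10 days".
from collections import Counter

# Hypothetical actuals of past packages estimated at 5-10 days
actual_durations = [7, 8, 9, 9, 9, 10, 9, 8, 9, 7]

most_common, count = Counter(actual_durations).most_common(1)[0]
print(f"Packages estimated at 5-10 days most often took {most_common} days")
```

Here the empirical mode is 9 days rather than the interval midpoint of 7.5, exactly the kind of surprise the text describes.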
Statistically, this is all logical and can be applied to comparable intervals (intervals with the same relative width, e.g. where the minimum is 50% of the maximum). But is it true then?
No, it is an approximate value that is significantly influenced by the people who estimate and do the work. Some people estimate optimistically; others prefer to play it safe. They may think 5-10 days but tell the project planner 8-10 days or 10-15 days (or whatever). The same principle applies to planners. An employee's effort estimate is readily dismissed as "overstated" and scheduled shorter or lower. At some point the employee knows this and deliberately estimates a higher value, since the "target" will be cut anyway. The planned value is thus driven less by technical or empirical considerations and more by human feelings.
However, modern simulation software can certainly handle such models. Multiple, combinatorial simulation has proven to be the best approach: each case in the interval is simply calculated. Simplified, the procedure is: first it is assumed that the package lasts 5 days. Is there a problem (yes or no)? Then everything is calculated again, this time with 6 days. Again it is checked: is there a problem, yes or no? And so on.
Each problem found is then weighted with its probability. In our example, the problem at 5 days is weighted somewhat less than at 7 or 8 days. This does not yield the expected duration (that would be a coincidence), but a percentage risk between 0% (all cases work) and 100% (no case works). The result is information for the project manager to evaluate. With a risk of 75%, he will also take into account the reliability of the person, the work itself and other environmental influences the computer cannot know. Or he simply takes the risk.
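The case-by-case check with weighting can be sketched as follows (the weights, the 8-day deadline and the problem rule are my own assumptions, not Can Do's engine):

```python
# Minimal sketch of weighted case enumeration: try every possible
# duration, and let each "problem" case contribute its probability
# weight to the overall risk.
weights = {5: 0.05, 6: 0.15, 7: 0.25, 8: 0.25, 9: 0.20, 10: 0.10}  # assumed

def has_problem(duration_days, deadline_days=8):
    # Hypothetical rule: durations beyond 8 days violate the deadline
    return duration_days > deadline_days

risk = sum(w for d, w in weights.items() if has_problem(d))
print(f"Risk of a deadline violation: {risk:.0%}")  # 9- and 10-day cases fail
```

The result is not a predicted duration but a risk figure, here 30%, which the project manager then interprets in light of everything the machine cannot know.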
This results in an ingenious solution that combines mathematics with people's experience and perception. Unfortunately, there is still one problem: the sheer volume.
We all learned about the R0 value during the Corona pandemic, and we also brushed up on exponential growth from school (although everyone knows the example of the grains of rice on the chessboard).
The same can happen in the scenario above, in resource planning. If two imprecisely planned work packages run in parallel with the same resource, all conceivable combinations must be calculated. Let's take the following scenario (as a rather difficult mental exercise):
A package starts sometime in week 23 (yes, even the start of a work package or project can be imprecise) and lasts 5 to 10 days. Mr. Smith is scheduled to spend 30% of his time on it.
The parallel package starts between week 23 and week 25 and lasts 4 to 8 weeks. Mr. Smith is scheduled at 80% of his time for this package. Can Mr. Smith manage both packages? The answer is yes: he can organize his work in these two packages in such a way that he completes both jobs on time and without being overloaded. Admittedly, I had Can Do calculate this for me; a manual calculation would be too time-consuming.
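To give a feeling for the amount of calculation involved, here is a deliberately naive Monte Carlo sketch of the scenario (my own simplified model, not the Can Do calculation). It flags any day-level overlap as a potential overload; a real engine goes further and checks whether the work can be redistributed within the packages, which is why the actual answer for Mr. Smith can still be "yes".

```python
# Naive Monte Carlo sketch: sample both packages' uncertain starts and
# durations (in working days relative to the start of week 23) and check
# whether they overlap at all (30% + 80% > 100% if worked simultaneously).
import random

random.seed(1)

def overlap_case():
    # Package A: starts on day 0-4 of week 23, lasts 5-10 working days
    a_start = random.randint(0, 4)
    a_end = a_start + random.randint(5, 10)
    # Package B: starts in week 23-25 (day 0-14), lasts 4-8 weeks
    b_start = random.randint(0, 14)
    b_end = b_start + random.randint(4, 8) * 5
    return a_start < b_end and b_start < a_end

trials = 10_000
risk = sum(overlap_case() for _ in range(trials)) / trials
print(f"Share of sampled scenarios with a potential overload: {risk:.0%}")
```

Even this toy version needs thousands of samples for two packages and one person, which illustrates why a portfolio-scale calculation is a job for the machine.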
And these are just two packages with one resource. Now imagine a portfolio with 200 projects, all imprecisely planned, with whole departments scheduled (rather than individuals). At this point, at the latest, resource planning becomes a great deal of work for the PM, and genuinely complicated.
This is where a computer program helps that lets the planner enter imprecise values. The computer calculates all possible combinations and uses them to determine the "probable" risk of a problem, such as a deadline violation, ranging from 0% (guaranteed no problem) to 100% (guaranteed to go wrong). For results between 40% and 60%, the information is not clear-cut for the planner. Here, the human (or an artificial intelligence) must still decide whether it is worth intervening or not.
Nobody can foresee the future exactly, and the further into the future one "looks", the more imprecise the planning becomes. So it is absolutely logical for planning software to allow imprecise values and to perform probability calculations that turn data into information for the planner. However, this is only a "horizontal" planning approach, because predictability depends not only on the time horizon but also on the granularity of the planning. How fine-grained does the planner want to look into the future?
Digression: deepening the human factor in estimates of the future
It is obvious: the human effect in the scenarios described must not be underestimated. Everyone involved in such planning has vested interests, and they differ completely.
The overriding goal of realizing a great project within the given framework conditions is often not the primary motivation of the people acting. Here, the company's error culture and the experience and professionalism of the project managers play essential roles. Specialists in projects sometimes simply don't care what the overall goal of the project is and what the benefit is for the company and its shareholders.
Managers, in turn, often don't care about the pressure employees have to endure in projects (or in any work). They just don't want trouble with higher management, or simply want to score points there. Basically, there are two poles in this situation: the superior project manager wants to implement the project with as little effort, cost and time as possible. The employee in the project wants to complete his task on time, but above all properly, without being under constant stress.
Therefore, there are good and bad project managers, just as there are good and bad project employees. The assessment is a question of perspective. The scenario described above and the considerations listed here can be smoothed out somewhat by the software. The fact that the personal estimate of the employee (who is often also the person performing the work) flows directly into the planning creates something like trust. The employee's inability, or refusal, to give an absolutely binding commitment (pressure) is also accounted for in the planning. The employee is thus taken seriously.
People often say that imprecise planning is closer to reality. This is certainly true, but above all it creates an improved foundation of trust between the actors involved. Thanks to the risk determination described above and its probability, the personal assessment of managers and employees in the current project becomes important again. If the risk is 75%, the (good) project manager will draw the employee's attention to it and ask: "Can you still make it?"
How this complex software calculates the risk is not really important to the people involved. Rather, the machine points to a situation early on and prompts a human assessment by the people in this game. This demonstrably raises the quality of the project and of how people work together.
To believe that accurate planning in a complex world comes about because some important manager has entered a desired value in some Excel® cell is no longer up to date. This piecework mentality of an all-knowing manager on the one hand and subordinates on the other, driven by unconditional obedience and the will to fulfill the specification, cannot work in a global economy characterized by a shortage of skilled workers.
But the purely free, relaxed life of highly qualified specialists does not correspond to reality either. In a global economy characterized by competition and speed, sales, targets and results must be delivered, otherwise a business ends quickly because the market does not buy "old" products. And if no products or services are sold, no money comes into the till, from which the specialists' salaries are ultimately paid.
In all probability, the truth lies somewhere in the middle, which would bring us back to Carl Friedrich Gauss and his normal distribution.
Admittedly, this installment of our series on resource management turned out a bit theoretical and math-heavy. But the next part will be more practical, when we discuss the different ways of project planning.
You already want to know everything about hybrid project management, Can Do and resource management under human aspects? Let us advise you without obligation - just get in touch!
Overview of our blog post series:
- Part 1: Is this chaos really necessary? Planning of people
- Part 2: What can PM software do? (this post)
- Part 3: Types of Project planning
- Part 4: The individual in resource management
- Part 5: The evaluation of problems in projects