From software development to construction to logistics and finance, every company has projects that need planning, managing and monitoring. But the tools we use to do that are often complex, designed for specialists and don’t do as much as they could to warn about potential problems. Could AI-powered decision support systems and automation make more of your projects successful by reducing costs and mistakes, analyzing risks, making things more efficient or keeping things on time and on budget?
Here is an early look at how artificial intelligence, machine learning and predictive analytics could affect project outcomes in the years to come.
Thinking about risk
Managing a project well takes more than just making a great plan in advance and sticking to it. Interdependencies within your project and external changes make outcomes unpredictable. Estimates and many forecasts are at best intuition; at worst, guesses and handwaving. Modern management techniques such as agile and continuous delivery aim to reduce uncertainty by working incrementally, but that still doesn’t guarantee final delivery. Portfolio management selects a mix of projects that balance risk and reward (because it’s hard to stay competitive if you only play it safe), but that means assessing risk accurately, which is hard.
We’re also prone to what Aptage CEO John Heintz calls “hope-based planning.”
“It's natural: We're to some degree all optimistic; we all see the positive path forward, the way this could work, and we don't have evidence to prove it can't work, so we hope it's going to go the way we want it to,” Heintz says.
Aptage uses machine learning to predict the outcomes of projects using data you already have, such as the planned start and end date of various phases of the project (and, if you have them, estimates about any backlogs) to learn the completion rate of the team and predict the likelihood of delivering on time. Estimates are always uncertain, so you can put in upper and lower bounds for how long tasks will take (or the software can model it using the golden ratio). You also need to put in some information on the source of risks: “Don't just blame the last person holding the problem; figure out what's going wrong,” as Heintz puts it.
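Aptage hasn’t published its models, but the general technique behind this kind of forecasting (Monte Carlo simulation over upper- and lower-bounded task estimates) can be sketched in a few lines. Everything below is a hypothetical illustration; the function names, the backlog, and the deadline are invented:

```python
import random

def simulate_completion(tasks, trials=10_000):
    """Sample each task's duration from a triangular distribution
    (lower bound, most likely, upper bound) and sum across tasks.
    Returns the list of simulated total durations in days."""
    totals = []
    for _ in range(trials):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks))
    return totals

def on_time_probability(totals, deadline_days):
    """Fraction of simulated runs that finish by the deadline."""
    return sum(t <= deadline_days for t in totals) / len(totals)

# Hypothetical backlog: (lower bound, most likely, upper bound) in days.
backlog = [(2, 3, 8), (1, 2, 5), (4, 6, 14), (3, 5, 10)]
totals = simulate_completion(backlog)
print(f"P(done in 25 days) = {on_time_probability(totals, 25):.2f}")
```

Running more trials tightens the estimate; the asymmetric bounds are what capture the “estimates are always uncertain” point, since the worst case is usually much further from the most-likely case than the best case is.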
That’s information most teams will have, he suggests. “Teams that don't have a tremendously rigorous process can still use our tool right away. If a team has a backlog that is seven things written on a napkin, we can still give them some help. If a team has a complete best-case/worst-case analysis and work breakdown structure for the whole project, we can give even more advice,” Heintz says.
Aptage uses visualisations of confidence, feasibility and whether the risk is going up or down over time to help you switch between what Heintz calls fast and slow thinking. “We had to create these visuals because we need to connect with the fast-thinking, intuitive mind to help people see things in a way that lets them make good intuitive decisions. If the project starts getting a whole lot more red, the lizard brain should have some fear. Maybe we still decide to go forward with that project, but we’ve thought about it and we’ve been triggered to think about the right things. ‘That might hurt but we have a safety net; if we have to spend 20% more on this project, we'll still have a good probability [of success]; let's take the risk.’”
The algorithms and models Aptage uses were designed for software development but also fit construction projects. The first (rather basic) integration is with Jira, and in time Heintz hopes to put the visualisations inside tools project teams use on a daily basis such as Microsoft Project, Primavera construction planning, Trello, or even feature roadmaps and KPI dashboards in Salesforce or Power BI. “If I put a task into Trello or ServiceNow, when might I expect it to be done, and with what level of confidence? Give me the 90 percent confidence date.”
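A “90 percent confidence date” is just an empirical percentile over simulated outcomes. As a hypothetical sketch (the sampling distribution and numbers here are invented, not Aptage’s):

```python
import random

def confidence_date(samples, confidence=0.90):
    """Return the duration that `confidence` of simulated runs finish
    within, i.e. the empirical percentile of sampled completion times."""
    ordered = sorted(samples)
    idx = min(int(confidence * len(ordered)), len(ordered) - 1)
    return ordered[idx]

# Hypothetical simulated completion times (days) for one piece of work.
samples = [random.triangular(3, 12, 5) for _ in range(10_000)]
print(f"90% confidence: done within {confidence_date(samples):.1f} days")
```

The point of quoting the 90th percentile rather than the average is exactly the anti-“hope-based planning” argument: the average is the date you hope for, the 90th percentile is the date you can promise.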
Generally, the goal is to avoid getting to the end of a project and being surprised. “If you knew today that the project was at 60 percent risk of not being done and done well, what would you do differently? We point you to the source of the risk and what you might do about it.”
Aptage won’t fix problem projects but it should warn you about them, Heintz says. “It’s the ability to collaborate and talk about ‘This is a high-risk project and we’re managing it well’ versus ‘We don’t know what the risk is; we’re just going to promise it’s going to be done and at the last minute we might say no.’”
Some of the AI tools businesses are already adopting, such as predictive maintenance, can help make projects more efficient and reliable, says Lance Olsen, director in Microsoft’s Cloud AI team. “One of the most common things that can throw a project out of whack in terms of its schedule and risk is unforeseen failure in systems that you’re relying on in the project.”
In general, he believes that AI will be most helpful by removing risk in projects, “whether that’s prediction for the project up front or removing risk in the execution.” That’s going to make projects more efficient: “There’s so much uncertainty and how we deal with it now is that we create giant buffers,” he says.
AI can already help track progress and performance, especially if you take a broad view of what project management is, suggests Nadya Duke Boone, director of product management for platform at New Relic, which recently added what it calls applied intelligence to its performance monitoring tools. “A lot of project management is going on where there’s nobody who officially has that title or role and there's a new set of project management tools we don't think of consciously that way.”
With incremental projects, success isn’t always determined at the end of a project; it’s more likely to be on-going KPIs around quality and reliability. “Customers asking ‘Are we on track?’ are using metrics like how many deployments can I do? How reliable are my deployments? Am I having regression errors? Is my performance where I want it to be?” Boone says. “AI could identify slowly ramping trends in that stream of data that are significant but hard to see — or easy for humans to ignore even if they see them.”
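One simple, classical way to surface the “slowly ramping trends” Boone describes is to fit a least-squares slope to the metric stream and flag drift that is small per interval but consistent in direction. This is a minimal sketch, not New Relic’s implementation; the threshold and the sample series are invented:

```python
def slope(values):
    """Ordinary least-squares slope of a metric series against time index."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    var = sum((x - mean_x) ** 2 for x in range(n))
    return cov / var

def flag_slow_ramp(series, threshold=0.5):
    """Flag the kind of trend humans miss: tiny week-over-week change,
    but always in the same direction."""
    return abs(slope(series)) > threshold

# Hypothetical weekly p95 response times (ms), creeping up ~2 ms a week.
weekly_p95 = [210, 211, 214, 215, 218, 219, 222, 224]
print(flag_slow_ramp(weekly_p95))  # prints True for this climbing series
```

A production system would do this over rolling windows and account for seasonality, but even this toy version catches a degradation no single daily standup would notice.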
New Relic Applied Intelligence includes a tool called Radar that looks for these patterns and issues, suggests actions and learns from what users pick up on. “It’s helping us find things like a week-over-week slowdown; something trending in the wrong direction where we wouldn’t have caught it because the project is so focused on day to day,” she says.
AI, Boone suggests, could help make sure we pay attention to bad news. “I've known cases where metrics have shown things are trending in the wrong direction but it's hard for people to acknowledge that.”
It can also help with automation, leaving project managers more time to actually manage. “A large part of AI in any industry right now is removing the work that's tedious and letting the humans focus on the part machines don't do well. So much work in project management is not number crunching; it’s about do we have clear goals, is everyone moving in the same direction and is their work coordinated.”
New Relic, for example, uses a chatbot to remind managers to manage capitalizable hours. Boone also speculates that natural language analysis of how people phrase status updates might help to determine how confident they are about progress.
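Boone’s speculation can be illustrated with the crudest possible version: counting hedging words against confident ones. A real system would use a trained language model; the word lists and scoring rule here are entirely made up for illustration:

```python
# Hypothetical word lists; a production system would learn these.
HEDGES = {"hopefully", "should", "might", "probably", "trying", "mostly"}
CONFIDENT = {"done", "shipped", "complete", "verified", "tested"}

def confidence_score(update):
    """Crude proxy for how confident a status update sounds:
    (confident words - hedging words) / total words."""
    words = update.lower().replace(",", " ").split()
    if not words:
        return 0.0
    hedges = sum(w in HEDGES for w in words)
    confident = sum(w in CONFIDENT for w in words)
    return (confident - hedges) / len(words)

print(confidence_score("Feature is done, tested and shipped"))
print(confidence_score("Hopefully we should mostly be done, trying to wrap up"))
```

The two example updates describe roughly the same state of the work, which is the point: the signal is in the phrasing, not the content.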
“Reducing monotonous and time-consuming tasks that aren’t necessarily high value but show up in every project” doesn’t just free up time, Olsen points out; it also reduces errors.
Rick McEachern, vice president of business development at Software AG, sees robotic process automation (RPA) taking over a lot of the mundane, repetitive, high-volume tasks from project managers, like merging data from different systems to coordinate deliveries and other logistics and updating case management systems. “There's a ton of work that could be done as far as transferring data, moving it around between different systems, handling mass emails, doing reporting and file and document processing,” McEachern says. “Robots are great at that sort of activity.”
“You could have robots looking at different updates and status reports and data and doing alerting if a file was supposed to be here on a certain date and it didn't arrive,” McEachern suggests. “If someone hasn't submitted their latest estimate, you might use a robot with the case management system to ping them to say, ‘It's two days away from deadline and I'm going to send you an hourly reminder’. And when they upload the new schedule, I can invoke a robot to extract the data I care about and put it into the master project schedule.”
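The deadline-watching robot McEachern describes is simple rule-based logic at heart. As a hedged sketch (the deliverable names, dates, and two-day reminder window are all hypothetical; a real RPA bot would also send the emails and update the case system):

```python
from datetime import date, timedelta

def overdue_alerts(expected, received, today):
    """Return alert messages for deliverables that are overdue, or close
    enough to deadline to warrant reminders."""
    alerts = []
    for name, deadline in expected.items():
        if name in received:
            continue
        if today > deadline:
            alerts.append(f"{name}: OVERDUE (was due {deadline})")
        elif deadline - today <= timedelta(days=2):
            alerts.append(f"{name}: due {deadline}, sending hourly reminders")
    return alerts

# Hypothetical schedule: deliverable -> deadline.
expected = {"latest_estimate.xlsx": date(2024, 5, 10),
            "site_survey.pdf": date(2024, 5, 20)}
received = {"site_survey.pdf"}
for alert in overdue_alerts(expected, received, today=date(2024, 5, 9)):
    print(alert)
```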
Predicting and experimenting
RPA could also be useful for resource optimization and scheduling, if you can define business rules or create data models that the robots can use to evaluate and report exceptions. That could be particularly useful for transportation and logistics, McEachern says; “There are lots of different optimizations where you can use machine learning, like least cost routing for minimizing fuel costs or optimizing loads.”
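Least-cost routing in its simplest form is a shortest-path search over a cost-weighted graph. A minimal sketch using Dijkstra’s algorithm, with an invented depot network and fuel costs (real routing engines handle time windows, capacities and far larger graphs):

```python
import heapq

def least_cost_route(graph, start, goal):
    """Dijkstra's shortest path over edge weights, read here as fuel cost.
    graph: {node: [(neighbor, cost), ...]}. Returns (total_cost, path)."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue,
                               (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical depot network with fuel cost per leg.
network = {"depot": [("A", 40), ("B", 25)],
           "A": [("customer", 30)],
           "B": [("A", 10), ("customer", 60)]}
cost, path = least_cost_route(network, "depot", "customer")
print(cost, "->".join(path))  # prints 65.0 depot->B->A->customer
```

Note that the cheapest route here has three legs, beating both two-leg options; that kind of counterintuitive result is exactly where automated optimization earns its keep.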
“You can use forecasting as a way to fine-tune project execution and have fewer misses,” Olsen says, but if you really want to use AI to improve projects you have to look for ways to experiment and improve. “These kinds of practices are going to be what really differentiate organizations in the next five years. These are systems of intelligence and CIOs have to design these to drive the rate of experimentation.”
“Part of that loop is saying, ‘We got 75 percent accuracy last month; what about the other 25 percent? Maybe weather is a predictor for us; let’s get a weather feed and add it to the model and maybe we get to 80 percent.’ Look at your rate of experimentation and your rate of learning as a key metric for success on your projects. How are you being systematic about getting that learning and driving more experiments?” Olsen says.
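Treating the rate of learning as a first-class metric can be as lightweight as logging each experiment. A hypothetical sketch of such a log, using the accuracy figures from Olsen’s example:

```python
# Minimal experiment log: record each model iteration and track
# accuracy gained per experiment since the baseline.
experiments = []

def record(description, accuracy):
    experiments.append({"description": description, "accuracy": accuracy})

def learning_rate():
    """Accuracy gained per experiment since the baseline run."""
    if len(experiments) < 2:
        return 0.0
    gain = experiments[-1]["accuracy"] - experiments[0]["accuracy"]
    return gain / (len(experiments) - 1)

record("baseline model", 0.75)
record("baseline plus weather feed", 0.80)
print(learning_rate())
```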
It’s tempting to think that you could use machine learning to make forecasts about which projects will succeed and fail, but that could be a long way off, he warns.
“What we see people focusing on today is components within the project; the resources and progress against the resources, or the health or performance of resources. As that matures, a logical next step is to go up a level and look at the whole project itself — how can we make that more efficient?” That will mean gathering detailed information about a lot of projects, he warns.
“To get predictions about projects you have to be capturing data about the project itself and feeding that into a model and saying what were the anomalies or common traits in the project that made it successful?” Olsen says.
Boone also sounds a note of caution on how accurate predictions can be, noting that some project tasks are easier to estimate than others because they’re more repetitious. When working as a project manager, she found experienced electrical engineering estimators were very accurate, and not just because of how long the construction industry has had to build up knowledge.
“The difference between engineering and building software is that we don’t know the unit of work; in engineering that's a foot of conduit or a yard of concrete but we just don't have that in software. Laying asphalt is laying asphalt no matter where and when you’re doing it but even adding a column to a database can be very different depending on where you are in the project and who is doing it,” Boone says.
For large, complex projects with a lot of staff where the final goal is something repeatable, like setting up a new data center or moving applications to a container platform, Boone believes there could be enough data for machine learning to identify outliers, anomalies or correlations. “Here are three interesting correlations we’ve found; you might want to dig into them. That’s treating AI as a partner to the project manager and letting them apply their human-level and emotional intelligence.”
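Surfacing “interesting correlations” across past projects can start with nothing fancier than Pearson correlation between project metrics and outcomes. A hedged sketch; the metrics, the eight-project dataset, and the 0.7 threshold are all invented for illustration:

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length metric series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def interesting_correlations(metrics, outcome, threshold=0.7):
    """Surface metric/outcome pairs a project manager might dig into."""
    return [(name, round(pearson(values, outcome), 2))
            for name, values in metrics.items()
            if abs(pearson(values, outcome)) >= threshold]

# Hypothetical data from eight past migration projects.
metrics = {"team_size":     [6, 4, 7, 5, 6, 4, 7, 5],
           "scope_changes": [1, 4, 2, 8, 1, 6, 7, 3]}
overrun_weeks = [0, 3, 1, 9, 0, 5, 7, 2]
print(interesting_correlations(metrics, overrun_weeks))
```

In this toy dataset only scope changes clear the threshold; the tool’s job ends at pointing, and the human’s job (deciding whether the correlation is causal or coincidental) begins, which is the partnership Boone describes.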