Game Theory in Management

Modelling Business Decisions and their Consequences

Recent Posts

Things Change, Remain the Same

“Oh, What I Could Tell You About The Future!”

Relevant Information and Portfolio Management

Stanly T. Raspberry, Consulting Academician

Bad Data Management Dead Giveaways

Things Change, Remain the Same

True story – when I was preparing for my all-day Certified Cost Consultant (as it was then known) examination, I knew that of the four 2-hour tests, two of them would be open-book, and one of those would deal with risk management. Since both my wife and I were MBAs and had retained all of our textbooks, there were several tomes on statistics or quantitative analysis in business to choose from that would provide the distribution tables I needed. The night before the actual exam, though, it had snowed, and snowed a lot. I woke up in a frenzy to get on the road an hour prior to the time I had originally planned to leave, and simply grabbed one of the aforementioned textbooks from that part of a bookshelf as I blitzed out the door.

I felt good about the test as I came home that evening, and thanked my wife for letting me borrow her quantitative analysis in business textbook.

“That’s not mine. I thought it was yours.”

“It’s not mine,” I replied. I looked at the title page – it had been published prior to the outbreak of World War II. How it arrived on one of my bookshelves and came to be shelved among our other textbooks is beyond me; but, since I passed all four parts of the test on my first sitting, there was no harm done.

But it did get me to thinking – have the management sciences been so slow to truly advance that someone versed in 70-year-old techniques would not only fit in today’s management environment, but actually advance and thrive, as evidenced by acquired certifications?

Accounting based on the double-entry bookkeeping model has been around since 1494, when Henry VII was King of England, Columbus had just returned to Spain with news of the New World, and Niccolo Machiavelli was 25 years old. Its basic structure has changed little in the intervening 520 years, so I kind of get a kick out of contemplating what Cameron of the McGaughy Clan in 1494 would have received back from his tribe of contributing scribes for ProjectManagement.scroll had he sent them notices asking for their takes on the future of management science back then. “Are’st thou kidding?” I would have quilled. “Once one knowest the wisdom of the balance sheet and profit-and-loss statement, no other business knowledge will be requiredst! Ever!”

I think the future of the management sciences rests on two assertions: the technology will advance, but human nature will remain the same.

Most investment houses hire a team (if not an army) of data-savvy analysts (“quants”) who pore over vast amounts of information, repeatedly testing the limits of the old saw that “correlation is not causation.” Should they find an apparent link – between, say, jellyfish biology and human life expectancy – they will exploit it for as long as the link is perceived as valid. Like I said, the technology will advance, but human nature will remain the same. It’s why Shakespeare’s works live on as masterpieces while the medical writings of his time – calling for bloodletting and the application of leeches – are now considered an embarrassment to the profession.

As for my predictions, I do believe that, eventually, the overextension of the data streams emanating from the general ledger and from current risk management theory will be recognized; the GL we will have with us forever (or at least as long as governments need tax revenue, which is the same thing), but the risk managers are vulnerable to having the efficacy of their techniques challenged. Nassim Taleb, in his best-selling book The Black Swan: The Impact of the Highly Improbable (Random House, 2007), makes the case against the overuse of Gaussian curves in business analysis so strongly that I’m surprised risk management hasn’t seen a significant epistemological retreat since its publication.
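To put a number on the sort of thing Taleb is complaining about, here’s a minimal sketch – mine, not Taleb’s – comparing how a Gaussian model and an illustrative fat-tailed model (a Student-t distribution, rescaled to unit variance) rate a severe, five-sigma-style adverse outcome:

```python
# A rough illustration (not from Taleb's book) of why leaning on the Gaussian
# curve understates extreme outcomes: compare the probability of a result more
# than five standard deviations below the mean under a normal model versus a
# fat-tailed Student-t model (df=3, chosen only for illustration).
import math

from scipy import stats

threshold = -5.0                             # a "five-sigma" adverse outcome
t_df = 3
t_scale = 1 / math.sqrt(t_df / (t_df - 2))   # rescale t(3) to unit variance

p_gaussian = stats.norm.cdf(threshold)
p_fat_tail = stats.t.cdf(threshold, df=t_df, scale=t_scale)

print(f"Gaussian model:   P(outcome < -5 sigma) = {p_gaussian:.2e}")
print(f"Fat-tailed model: P(outcome < -5 sigma) = {p_fat_tail:.2e}")
print(f"The fat-tailed model rates the extreme outcome roughly "
      f"{p_fat_tail / p_gaussian:,.0f} times more likely.")
```

The point is not the particular distribution chosen; it is that the shape of the assumed curve, not the quality of the data, drives how seriously the improbable gets taken.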

But, until such a correction in the widely-accepted management sciences occurs, it remains a safe prediction that this blog will continue to poke fun at our friends, the accountants and risk managers!

Posted on: December 15, 2014 10:25 PM | Permalink | Comments (1)

“Oh, What I Could Tell You About The Future!”

When Emmett “Doc” Brown utters this line in Back to the Future Part III, he’s in a saloon in the old (American) West, circa 1885, talking to patrons who listen to his (correct) descriptions of the future with derision. Brown’s descriptions strike them as absurd.

And why wouldn’t they?  In 1885:

·  The germ theory of disease was just becoming widely accepted.

·  Georgia had more people than California.

·  The Statue of Liberty arrived in New York.

·  The Little Ice Age, which had started around 1300, had just ended.

·  All glass was hand-blown.

·  Teddy Roosevelt was 26 years old.

·  And the list of stuff we take for granted today that hadn’t even been conceived of is prohibitively long for a blog post.

Granted, Back to the Future Part III is fiction; but the notion that the way the real future unfolds would have struck the people of the past as absurd is, I think, spot-on. It then stands to reason that, should some time traveler come back from our future to the year 2014, his descriptions would strain credulity. That being the case, what should we make of predictions of the future that strike us as entirely reasonable?

During the aforementioned Little Ice Age, farmers in Northern Europe had quite the challenge. In order to maximize their crop yields, they had to time their ploughing and planting so as to keep their seedlings from freezing (should they plant too soon), while still leaving enough time for their crops to ripen before the first freeze of the following autumn. Correctly timing their crop cycles was literally a life-and-death decision for them and their loved ones. Despite being blessed with no access to climatologists, they still needed to base their decisions on something more than the cycles they had observed in the immediate past. Under the belief that wild animals would act differently once the last freeze of the season had passed, an entire structure of data collection based on such observations became the basis for these critical decisions. Sprinkle in a bit of formality, and a couple of hundred years, and you have the spectacle of men dressed in formal attire gathering on February 2 in Punxsutawney, Pennsylvania, to pull an unwilling member of Marmota monax from its den to “predict” either six more weeks of winter or an “early” spring.

Now, imagine if I had left out the description of the medieval Northern European farmers’ need to somehow divine the planting season, and also imagine you had never heard of the celebration of Groundhog Day. The question virtually asks itself: is this any way to predict the future?

I suppose it’s appropriate for the December ProjectManagement.com theme to be the Future of Project Management, since every other writer for every other publication uses this time of year to make predictions. But that’s my point – any attempt to predict the future is simply an extension of our own prejudices and recent past into an unknowable environment. I could predict the outlandish – that someone will come up with a profound breakthrough in the management sciences, only to be hit by falling orbital debris on her way to publish the findings. Or I could play it safe, and predict that the presentations at next year’s management seminars will be evenly split between those espousing the amazing success of their own particular projects – made possible, don’t you know, by their insightful embrace of current management techniques – and those that re-package the basics while retaining the eat-your-peas style that so pervades such events.

Or, I could be perfectly honest, and admit: I have no idea what the future of project management looks like.

And my readers should hold as suspect anyone who claims to the contrary.

Posted on: December 07, 2014 09:15 PM | Permalink | Comments (2)

Relevant Information and Portfolio Management

Not all information is created equal.

A stock broker or securities trader who could time-travel and retrieve next week’s Wall Street Journal would become (almost automatically) rich. Going into the future and returning with the New York Times’ horoscope section, however, is clearly a waste of time (travelling). Of course, most managers do not have access to time machines; rather, they depend on personnel both within and outside the project team to gather the necessary data, process it into information, and deliver that information in a way that is usable in the circumstances before them. Since these people’s time and energy are not limitless, the information streams to be set up and maintained must be carefully selected. And it is the very care that must be exercised in selecting such information streams that leads to a consulting environment filled with quacks. Here are a couple of examples.

Many accountants insist that project cost performance can be captured by comparing actual costs to budgets. When this folly is pointed out, they will often attempt to justify it by asserting that the analysis should be carried out at the line-item level of the basis of estimate (BoE). Yet the flaw in this “analysis” is quickly and clearly revealed when one considers the following scenario.

A project team begins work on a project that has been estimated at $25K (USD) for heavy equipment and $75K for labor. At project completion, however, they have actually spent $70K on equipment and $30K on labor. The budget-vs.-actual comparison (at the elemental level of the BoE, don’t you know) would have raised red flags over and over, never mind that the project came in exactly on-budget. Unwilling to admit that the generation of false poor-performance warnings ought to be an automatic disqualifier among competing management information systems, the charlatans pushing this analysis, in my experience, do the exact opposite: they insist on even more rigor, and finer levels of detail in the compared data sets. If they were turn-of-the-last-century snake oil salesmen, they would insist on the freshness of the snake as a determining factor in the efficacy of their products.
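To see just how noisy that line-item comparison gets, here is a minimal sketch that simply runs the numbers from the scenario above – the red flags are all at the line-item level, while the bottom line is exactly on budget:

```python
# A minimal sketch of why line-item budget-vs.-actual comparison raises false
# alarms. The figures come from the scenario above: $25K budgeted / $70K actual
# for heavy equipment, $75K budgeted / $30K actual for labor.

budget = {"heavy equipment": 25_000, "labor": 75_000}
actual = {"heavy equipment": 70_000, "labor": 30_000}

for item in budget:
    variance = budget[item] - actual[item]
    flag = "RED FLAG" if variance < 0 else "ok"
    print(f"{item:>15}: budget {budget[item]:>7,}  actual {actual[item]:>7,}  "
          f"variance {variance:>8,}  -> {flag}")

total_variance = sum(budget.values()) - sum(actual.values())
print(f"{'TOTAL':>15}: variance {total_variance:>8,}  -> the project is on budget")
```

The “analysis” generates an alarm on half the line items of a project that finished precisely at its estimate, which is the whole problem.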

I found myself in the final hours of a training session with a software company that published a critical path method (CPM) package and had recently branched out into portfolio management. The difference between their CPM software and this portfolio management package was – I swear I am not making this up – having the CPM platform incorporate some capabilities from the general ledger, like timesheets and travel expenses. I had an opportunity over lunch to talk about this new “capability” with some of the company’s principals.

“Look,” I started, “simply pilfering some capabilities from the general ledger is not what makes for a portfolio management system. You’re better off leaving asset management to the GL, expanding your project management capability to better accommodate earned value, and then creating a module that captures strategic information. Only after you have set up a structure that enables all three of these to interact in a coherent fashion can you make a serious case for actually performing portfolio management.” (Such a structure is examined at length in my most recent must-have book, Game Theory in Management.)

I happened to be sitting directly across the table from the company’s then-president, with his veeps on either side, slightly behind him. I could almost see the light bulbs going off above their heads, but the president was having none of it. He began to describe how much effort his company had poured into the software in the configuration in which it had been released, and said that he would not go back and attempt such a correction (our friends, the accountants, would readily recognize such a response as also being invalid, being taught, as they are, never to use the “sunk cost” argument when furthering a business decision).

The problem with competing information systems, some claiming to provide essential “portfolio management” info, is that such claims are next to impossible to verify. Only the manager familiar with both MIS architecture and basic epistemology can consistently navigate these waters.

Or else the readers of ProjectManagement.com’s blog entries.

Posted on: November 30, 2014 08:18 PM | Permalink | Comments (0)

Stanly T. Raspberry, Consulting Academician

I walked into the business college classroom full of young, serious faces, just as the professor announced “Class, I have a real treat for you today. My former student, Stanly T. Raspberry, who has made a living as a project management investigator since 1998, will be participating in our round-table discussion.”

As my old professor, Noah Tall, strode over to shake my hand, he continued, “Stan, these are my graduate students in project management, the ones who either are working as interns for a major company or will be in the near future. I’m sure you have a lot to tell them about the real-world implementation of the concepts we’ve been covering.”

While I took a seat at the table near the front of the room, I replied “Happy to be here, Noah, though, to be sure, some of the things I’ve learned since graduation might take some of your students by surprise.”

“Well, that’s why I wanted you here. Now, who has a question for Mr. Raspberry?”

One student in the second row held up his hand.

“Mr. Raspberry, what do you consider to be the more important analysis for deciding on which projects to bid, the risk analysis, or the return on investment?”

“Neither is truly relevant,” I replied.

Gasps erupted from the room.

“What do you mean, not ‘relevant’?” the original questioner persisted, his voice rising with agitation. “Both ROI and risk analysis are key components of portfolio management!”

“No, not really,” I deadpanned. “Return on investment might be useful when deciding whether or not to procure an asset, and risk analysis might tell you when to buy insurance, and how much. But as far as choosing which projects to bid, you have to know what the potential project represents to the market you are targeting, as well as what your competition is doing with respect to gaining or losing ground relative to you in that market. Neither ROI nor risk analysis can return anything useful for those decisions.”

Looks of bewilderment flooded the faces of the students. Another student held up her hand.

“Mr. Raspberry, is it safer to begin re-assigning project team members when the project is 85% spent, or is it better to wait until after it is 90% spent?”

“Percent spent isn’t relevant to that decision.”

Again, gasps and looks of incredulity spread through the room.

“Well, what would you base that decision on?” the student asked snarkily.

“You have to know the percent complete of the current project, as well as how much schedule delay the other projects in the portfolio – the ones the peeled-off team members will be assigned to – can absorb.”

“Percent spent, percent complete – what’s the diff?”

“The ‘diff,’ as you put it, is that one is a marginally relevant piece of accounting information, and the other is a vital information element in making informed project decisions, including, but certainly not limited to, when to reassign staff.”

Eye rolls and pshaaws replaced gasps and looks of incredulity. From the back of the room, another young man held up his hand.

“What would you consider the best way to prioritize a milestone list? Should it be sorted by earliest-to-latest, by which milestones are accomplished by key personnel, or by which ones are the most expensive?”

“Milestone lists aren’t relevant to scheduled activity priorities. You have to load activities and their durations into a Critical Path Method-capable software platform, and then add in the schedule logic. Have the computer perform a forward pass and a backward pass, and return the critical path and the levels of float in the non-critical activities. Then you can make an informed decision about task priority, but not until.” (A bare-bones sketch of that forward-and-backward-pass arithmetic appears at the end of this post.)

Mocking laughter replaced the eye rolls and pshaaws.

“Listen, people” I began. “There are a lot of analysis techniques out there, but not all of them generate relevant information. In fact, many of them are positively useless, even though they may have been sold to you as vital.”

“Professor Tall, did you say this was one of your students?” a student in the front row demanded.

“Well, yeah, but I didn’t say he was one of my better ones.”

I gave Noah a look, as if to say “Really?”

“Professor Tall,” I started, “you said these students were all interns for a major company. May I ask which company?”

“Why, Monolithic Corporation, of course,” Professor Tall said pompously, with the students beaming with confident pride.

“I should have known…”
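(As promised, here is a bare-bones sketch of the forward-and-backward-pass arithmetic Raspberry describes above. The four-activity network, its durations, and its logic are invented purely for illustration – this is my own toy example, not output from any particular CPM package.)

```python
# A toy Critical Path Method calculation: a forward pass for early start/finish,
# a backward pass for late start/finish, and total float = late start - early
# start. Activities with zero float form the critical path.

activities = {          # name: (duration, [predecessors]); listed in precedence order
    "A": (3, []),
    "B": (5, ["A"]),
    "C": (2, ["A"]),
    "D": (4, ["B", "C"]),
}

# Forward pass: earliest start and finish for each activity
early_start, early_finish = {}, {}
for name, (duration, preds) in activities.items():
    early_start[name] = max((early_finish[p] for p in preds), default=0)
    early_finish[name] = early_start[name] + duration

project_duration = max(early_finish.values())

# Backward pass: latest start and finish that do not delay the project
late_start, late_finish = {}, {}
for name in reversed(list(activities)):
    duration, _ = activities[name]
    successors = [s for s, (_, preds) in activities.items() if name in preds]
    late_finish[name] = min((late_start[s] for s in successors), default=project_duration)
    late_start[name] = late_finish[name] - duration

for name in activities:
    total_float = late_start[name] - early_start[name]
    note = "<-- critical path" if total_float == 0 else f"float = {total_float}"
    print(f"{name}: ES={early_start[name]} EF={early_finish[name]} "
          f"LS={late_start[name]} LF={late_finish[name]} {note}")
```

Run it, and the computer – not a milestone list – tells you which activities can slip, and by how much.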

Posted on: November 24, 2014 09:52 PM | Permalink | Comments (0)

Bad Data Management Dead Giveaways

In most professions there are clear indicators when those claiming expertise show themselves to be, in fact, rather inept. An accountant who doesn’t know how to close out a contra account, the project manager who doesn’t know what a cost variance is (no, it’s NOT the difference between budget and actual costs – if you thought so, are you sure this is the website you should be perusing?), or the strategic manager who isn’t aware of their organization’s market share must keep such ignorance secret, or else get educated immediately, lest they reveal themselves as doofuses.

The same is true of data managers or, if your organization has one, the Chief Information Officer. Much silliness can creep into notions of how large amounts of data ought to be managed, and if those in charge are not, in fact, up to the task, then the entire organization is left in the lurch. So, how does one determine – in a non-intrusive way – whether the keepers of the big data know what they’re doing? It’s easy: just listen to what those people have to say, and if they commit any of the following follies, be afraid, be very afraid.

The easiest such test might be whether the CIO knows the meaning of the word “epistemology.” The word rarely comes up in your normal project team meeting. It refers to the study of knowledge and its limits and, by extension, the nature of information efficacy. If your typical PM or strategic manager (never mind the accountants) doesn’t know the term, it’s probably no big deal. But if your CIO doesn’t know it, they are most likely deficient in the expertise their position requires.

The next most obvious indicator is if those in charge of managing, creating, maintaining, or modifying management information systems ever – ever – use the argument “why wouldn’t you want to know that?” Management information systems require time, energy, and resources to set up and operate. If there is no specific demand for a particular information set, then setting up the systems to deliver such information is a waste of those very resources, energies, and time. Even entry-level MIS practitioners ought to know this.

Another clear indicator of bad data management practices infiltrating your organization involves any push to consolidate information streams into a common software platform by invoking the efficiencies of scale argument. In those instances where a variety of home-grown systems have been developed, some of which perform overlapping functions, it’s almost always due to a diversity of needs among the consumers of the information. Unless there’s a clear and compelling reason to force all such information systems onto a single platform (e.g., the general ledger clearly shouldn’t be split), any attempt to do so will come across as heavy-handed and meddlesome. Invoking supposed savings in training costs while managers are in danger of losing the relevant info they need to manage is an unmistakable sign of CIO chicanery.

Also, if your Board of Directors is being informed of goings-on within the organization via an action item list, milestone list, or “performance item” list, then the person(s) supplying this so-called information don’t know what they are talking about. Such lists are essentially polls, and polls are not legitimate management information systems. A legitimate MIS has three distinct phases: (1) data is gathered based on a defined discipline or criteria, (2) that data is processed into information based on some methodology (e.g., earned value or critical path), and (3) the information is delivered to decision-makers in such a way that they can readily use it. A poll, on the other hand, is just what the last person who entered data into a central database thought of a particular issue. Polls are worthless, and a real CIO would know that.
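For contrast, here is a minimal sketch of what those three phases look like in practice – the task names and figures are invented, earned value serves as the processing methodology, and a plain print-out stands in for the delivery phase:

```python
# A minimal gather -> process -> deliver pipeline, with earned value as the
# processing methodology. All names and numbers are invented for illustration;
# this is not any particular MIS product.

def gather():
    """Phase 1: collect raw data according to a defined criterion (here, per task)."""
    return [  # task, budget at completion, percent complete, actual cost to date
        {"task": "Design", "bac": 40_000, "pct_complete": 1.00, "actual": 45_000},
        {"task": "Build",  "bac": 60_000, "pct_complete": 0.50, "actual": 20_000},
    ]

def process(raw_data):
    """Phase 2: turn the data into information via a methodology (earned value)."""
    info = []
    for row in raw_data:
        earned_value = row["bac"] * row["pct_complete"]
        info.append({
            "task": row["task"],
            "cost_variance": earned_value - row["actual"],   # CV = EV - AC
            "cpi": earned_value / row["actual"],             # cost performance index
        })
    return info

def deliver(info):
    """Phase 3: put the information where a decision-maker can readily use it."""
    for row in info:
        print(f"{row['task']:>7}: CV = {row['cost_variance']:>7,.0f}   CPI = {row['cpi']:.2f}")

deliver(process(gather()))
```

Note that the cost variance here is earned value minus actual costs – not budget minus actuals – which is rather the point of having a methodology in the processing phase at all.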

Finally, authentic big-data managers know that accountants and risk managers want to take over the management information universe, and know how to put such usurpers in their places.

Posted on: November 16, 2014 08:37 PM | Permalink | Comments (0)