Innovation and Enterprise at the University of Nottingham

The Haydn Green Institute (HGI) at Nottingham University Business School is among Europe’s leading centres for enterprise education and a focus for entrepreneurial skills development. Simon Mosey is Professor of Entrepreneurship & Innovation and Director of the Institute. He is editor of the Journal of Technology Transfer, and his research interests include technology entrepreneurship, entrepreneurship education and innovation management.

He is co-author of the popular books Ingenuity and Ingenuity in Practice, and has published his research in leading academic journals such as Entrepreneurship Theory & Practice, Research Policy, Journal of Management Studies and Academy of Management Learning & Education, as well as in periodicals such as the Washington Post and the Financial Times.

Simon has been working with Derbyshire Constabulary’s Innovation Network to help stimulate ideas and new ways to innovate. EMPAC is grateful to Simon and his colleague Paul Kirkham, Senior Researcher at HGI, for sending us an entertaining tour of leading-edge thinking about innovation – have a read!


The curious case of the hurricane and the Strawberry Pop-Tart


“Innovate! Don’t work harder – work smarter!” We’ve all heard this sort of demand, usually without any explanation of how to carry it out. It often comes with an undercurrent of “more for less” – a subtext that leaves us wondering whether innovation represents an opportunity to bring about improvement or an excuse to cut spending.

So the waters are already muddy. And the situation isn’t helped by the fact that there’s scant understanding of what innovation is, how it works, how it doesn’t work and why change can be so difficult. A little explanation is therefore in order.

We can divide innovation into two types. The first is incremental innovation, which is about doing things better – typically by cutting waste and enhancing efficiency. The second is radical innovation, which is about doing things differently – typically by breaking with established practice altogether.

The thing about radical innovation is that it usually comes from outside existing practice. This is why it’s so disruptive and sometimes – especially within large organisations – distrusted. By way of illustration, let’s consider the notion of predictive policing.

The prediction business

Every industry is under tremendous pressure to reduce costs, and one very effective method has been just-in-time logistics – delivering goods as and when needed and not a minute before. A desire to avoid being caught out by sudden demand has propelled the retail sector – particularly those elements that pile it high and sell it cheap – into the prediction business.

In the USA, supermarkets such as Wal-Mart have this down to a fine art. When a hurricane is predicted, for example, supplies of bottled water and duct tape will reach stores well before the storm.

So far so obvious, but these stores will also make sure that enough strawberry-flavoured Pop-Tarts are available. This precaution is based on past experience: the strawberry-flavoured Pop-Tart turns out to be the product of choice when it comes to laying in provisions. Exactly why doesn’t matter. The supply system knows demand will peak and acts accordingly.

The value of prescience was brought into sharp focus when Hurricane Katrina struck New Orleans in 2005. Amid the disastrous aftermath, the local authorities suffered intense criticism, much of it from neo-cons whose agenda of “private is good, public is bad” was well served by revelations that some police officers had bought ammunition from local supermarkets because official supply chains – and, indeed, the whole system – couldn’t cope.

It’s tempting to watch such events unfold from Britain and tell ourselves: “Well, that’s all a long way from home.” Yet the point has some validity everywhere: are state institutions as receptive to new ways of thinking as they could be?

Analysis versus experience

It’s directly out of the sort of technological analysis discussed above that predictive policing first arose. The argument behind such initiatives is that criminal behaviours can be analysed far more thoroughly by computer than by police officers relying on experience and intuition.

Once you understand that burglars act out the same behaviours as other “foragers”, say the idea’s proponents, you can start to get inside their heads. According to the LAPD, a notable early adopter, entering “the decision cycle of our adversaries – drug dealers, gang members, terrorists – affords unique opportunities for prevention, thwarting and information-based response, ideally preventing crime”.
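To make the foraging analogy concrete, here is a deliberately simplified sketch – not any force’s actual system – of the kind of “near-repeat” scoring such tools build on: every recent burglary raises the predicted risk of nearby locations, with the effect fading as the event recedes. All figures and parameters below are invented for illustration.

```python
from collections import defaultdict
from math import exp

def risk_scores(burglaries, today, decay_days=7.0):
    """burglaries: list of (x_cell, y_cell, day) tuples on a city grid."""
    scores = defaultdict(float)
    for x, y, day in burglaries:
        age = today - day
        if age < 0:
            continue  # ignore events recorded after 'today'
        weight = exp(-age / decay_days)  # older events count for less
        # Spread the raised risk over the cell itself and its eight neighbours.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                scores[(x + dx, y + dy)] += weight
    return scores

# Three recent break-ins clustered around grid cell (5, 5).
history = [(5, 5, 10), (6, 5, 12), (5, 6, 13)]
ranked = sorted(risk_scores(history, today=14).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked[:5])  # highest-risk cells: candidates for extra patrols
```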

There’s some evidence that this actually works in practice. A 2011 trial in the Trafford district of Manchester resulted in a significant fall in burglaries, with the Home Office calculating that potential victims had been saved more than £1 million.

Of course, this was back in the days before “big data” and “algorithms” entered common parlance – back before the major players on the internet finally settled on a business model of using mass surveillance to target advertising. Since then big data and algorithms have had mixed success, with some issues very much still to be resolved.

One example worth looking at is the prediction of flu epidemics. Here the idea is that anyone who begins to feel sick will search the internet for symptoms long before even heading to the pharmacy for some paracetamol.
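For a flavour of how such a system might work, here is a minimal sketch of the correlational idea behind search-based “nowcasting”: fit a simple line mapping weekly search volumes onto officially reported cases, then read off an estimate for a week whose official figures aren’t in yet. This is an illustration only – Google’s actual model was far more elaborate, and every number below is invented.

```python
# Invented figures: weekly counts of a flu-related search query and the
# officially reported flu cases for the same weeks.
searches = [120, 150, 200, 260, 310, 400]
cases = [30, 40, 55, 70, 85, 110]

# Ordinary least-squares fit of cases against searches, done by hand.
n = len(searches)
mean_x = sum(searches) / n
mean_y = sum(cases) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(searches, cases))
         / sum((x - mean_x) ** 2 for x in searches))
intercept = mean_y - slope * mean_x

# "Nowcast" this week's cases from this week's search volume alone.
this_week = 480
print(f"estimated cases this week: {intercept + slope * this_week:.0f}")
```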

It was with such a scenario in mind that Google Flu Trends was launched amid much fanfare in 2008 – only to be quietly put to sleep just a few years later. The trouble was that neither the data producers – the poorly surfers – nor the algorithms understood flu well enough to make the concept work. So should we dump data analysis and just trust the experts?

Aberrations and intuition

Perhaps we need a better understanding of statistics to answer this question. Speed cameras may prove helpful in this regard.

The standard procedure is to install a new camera at the site of a serious crash, the goal being to reduce the likelihood of further accidents. This seems to work – but what about the statistical phenomenon known as “regression to the mean”? What if the original accident was merely an aberration and the situation has returned to normal, as it would have done irrespective of a camera’s presence?
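Regression to the mean is easy to demonstrate. The short simulation below – purely illustrative – gives every site the same underlying accident rate, “selects” the sites that had an unusually bad first year (just as camera locations are often chosen), and then watches their counts fall the following year with no intervention whatsoever.

```python
import math
import random

random.seed(42)

def poisson(lam):
    """Draw a Poisson-distributed count (Knuth's method, so no numpy needed)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        p *= random.random()
        k += 1
    return k - 1

SITES, MEAN_RATE = 1000, 2.0  # every site has the SAME true accident rate

year1 = [poisson(MEAN_RATE) for _ in range(SITES)]
year2 = [poisson(MEAN_RATE) for _ in range(SITES)]

# "Install cameras" wherever year one looked alarmingly bad.
blackspots = [i for i, n in enumerate(year1) if n >= 5]
before = sum(year1[i] for i in blackspots) / len(blackspots)
after = sum(year2[i] for i in blackspots) / len(blackspots)
print(f"selected sites: {before:.2f} accidents, then {after:.2f} a year later")
# The apparent improvement needs no camera at all: extreme years simply
# drift back towards the long-run average.
```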

It’s puzzles like this that lead to understandable complaints that algorithms undermine hard-won professional expertise and intuition. But to assess the merits of such arguments we first need to appreciate what intuition actually is.

Let’s take the starting point that it’s not magic! After all, when asked for his “gut thinking”, the great American scientist Carl Sagan usually remarked that he preferred to think with his brain. Essentially, intuition is when someone knows something without being able to give chapter and verse about exactly why.

In his 2011 book, Thinking, Fast and Slow, Nobel Prize winner Daniel Kahneman described his work with another psychologist, Gary Klein. Kahneman took a comparatively sceptical view of intuition, having studied the frequently dismal performance of supposedly expert fund managers in financial markets. Klein had witnessed intuition in action, having worked with firefighters and grown to admire their decision-making skills in fast-moving situations.

In other words, one leaned more towards data and the other more towards experience. So were they able to find any common ground? And what might they be able to tell us about the tension between the two schools of thought and, by extension, the perceived conflict between the innovative and the established?

The best of both worlds

Despite their different perspectives, Kahneman and Klein were able to agree on some key points about intuition. They defined it as a capacity to make decisions rapidly by recognising past instances; and they suggested it can be trusted only when it has been gained in particular circumstances – specifically, those where the data set is large enough to be representative and the feedback loop is swift enough for lessons to be learned.

So let’s apply this thinking to predictive policing, which in the US has been found to perform well in connection with some crimes and not so well in connection with others. The algorithms have done a solid job in relation to incidents of drug dealing, assault and battery, gang violence and bike thefts but have proven less impressive in tackling crimes of passion and homicides.

Why might this be so? Kahneman and Klein’s conclusions suggest it’s because crimes such as burglary are so commonplace that the data set, while too vast for the average human to make sense of, is ideal grist to the algorithmic mill; by contrast, because they’re mercifully rare, crimes such as murder generate insufficient data for a meaningful algorithm but enough data to help shape a detective’s intuition and expertise.

So there must be a place for impartial, computer-processed analysis – but this ought to free up time for hard-won expertise to come into its own. To put it another way: it would be a mistake to reject innovation out of hand, and it would also be a mistake to expect it to completely replace human judgment.

The point is that intuition is acquired – and maintained – with practice but is not necessarily transferable. This is why the most radical and useful innovation of all might be to combine the best of the human with the best of the algorithm. Ultimately, expertise in one facet of a job is no guarantee of expertise in another; and, as Kahneman and Klein also agreed, real experts know their limits.


Further reading

https://www.newyorker.com/magazine/2006/01/09/deluged

https://leb.fbi.gov/articles/featured-articles/predictive-policing-using-technology-to-reduce-crime

http://time.com/23782/google-flu-trends-big-data-problems/
