Beyond deep learning and neural networks
August 21, 2015

In this exclusive interview, Prof. Dr. Michael Feindt, Blue Yonder’s founder and Chief Scientific Advisor, tells us more about the company’s predictive analytics technologies and how a new kind of decision-making automation could soon revolutionize the way science, as well as business, is done.

HPC Review: Tell us a little bit about how Blue Yonder was founded and what it provides.

Michael Feindt: Blue Yonder was founded in 2008 out of my earlier company Phi-T and the OTTO Group, the world’s second-largest distance-selling company after Amazon. In late 2014, the international growth private equity firm Warburg Pincus announced an investment of $75 million in Blue Yonder, the largest real tech investment in Europe in 2014.
    
Blue Yonder combines world-class data science with professional business software, deep business knowledge across many verticals, and a cloud service offering in order to help companies predict the near future (including uncertainty and risk calculation) and optimize and automate decisions.

Usually, this has deep consequences and is really disruptive.

HPC Review: What was the rationale for offering Blue Yonder’s predictive analytics as a cloud service rather than as an application to be run on in-house resources?

Feindt: Our experience shows that for many companies it is very difficult to tackle large data science projects, because they need expertise in many fields: their own business knowledge, but also mathematics, statistics, machine learning, software engineering, data handling, as well as hardware and operations. We feel it is far easier, faster, and more efficient if we cluster all this additional expertise at Blue Yonder, so that our customers can concentrate on their core business and outsource all the mathematical and technical complexity. Also, exceptional data scientists like to work in a larger group of excellent data scientists.

In contrast to typical large on-premise software projects, the initial investment is much lower, the time to market is drastically reduced, and the failure rate is zero.

HPC Review: The service the company offers is based on the NeuroBayes algorithm, which you developed.  Can you describe in layman’s terms what the algorithm does?

Feindt: The core of NeuroBayes is a second-generation neural network with Bayesian regularization that, beyond simple classification, is able to predict individualized, complete probability distributions for real-valued quantities, which are the basis for optimal decisions. However, over time more and more robust preprocessing steps were introduced, and many other efficient algorithms were developed at Blue Yonder. Generalizability, robustness, learning and prediction speed, and scalability are important design criteria for our algorithms. Recently, we have also focused on interpretability (understanding) and the reconstruction of causal effects from historical data. So the Blue Yonder algorithm library is much more than the original NeuroBayes algorithm.

In all cases, we analyze large, complex systems and automatically learn from past examples what observable quantities (of any kind) can say about another quantity in the near future, e.g. the number of Granny Smith apples sold in the XYZ supermarket on Baker Street tomorrow. The prediction is given in the form of a probability density, i.e. each possible future (the number of apples sold tomorrow in that shop) is assigned a probability. Of course, this distribution should be as narrow as possible, but not narrower: the real future must be described correctly by it. Thus, statistical statements are individualized. On this basis, mathematically optimal decisions can be taken, given that we know the cost function of deviations of the future realization from our decision.
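NeuroBayes itself is proprietary, but the general idea of predicting a conditional distribution rather than a single number can be sketched in a few lines. In the toy example below, gradient boosting with a quantile (pinball) loss stands in for the Bayesian-regularized network; the observables, data, and numbers are all invented.

```python
# Illustrative sketch, not Blue Yonder code: predict several quantiles of
# tomorrow's demand instead of a single point estimate. All data are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 2000
weekday = rng.integers(0, 7, size=n)           # hypothetical observable: day of week
promo = rng.integers(0, 2, size=n)             # hypothetical observable: promotion flag
X = np.column_stack([weekday, promo])
demand = rng.poisson(6 + 2 * (weekday >= 5) + 4 * promo)   # apples sold (synthetic)

# One model per quantile approximates the predicted conditional distribution.
quantiles = [0.1, 0.5, 0.9]
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, demand)
          for q in quantiles}

saturday_with_promo = np.array([[5, 1]])
for q, m in models.items():
    print(f"{int(100 * q)}% quantile of tomorrow's demand: "
          f"{m.predict(saturday_with_promo)[0]:.1f}")
```

The three predicted quantiles together describe how wide or narrow the forecast is for that particular shop and day, which is the individualization Feindt refers to.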

HPC Review: How is the technology different from other neural network schemes we’ve heard about – like the ones being used by search engine companies to classify images?

Feindt: As already stated, our development is going in the opposite direction: we do not try to mimic the human brain, but to build efficient, robust, and fast models. Bayesian statistics is a key ingredient. The “neural” character is not important in the recent development; the ability to predict conditional densities is.

Hierarchical models play a role, but we often find it more convenient (and much faster) to use our own neural networks to define the hierarchy instead of letting a deep network learn it.

HPC Review: What kinds of organizations are using the Blue Yonder offering?  Can you talk about some specific problems that are being addressed?

Feindt: NeuroBayes was originally developed for experimental elementary particle physics, and still is used very successfully at experiments at CERN (Geneva), Fermilab (USA) and KEK (Japan).

One of the latest developments was a completely automatic reconstruction chain for B factories: here, automation was more than twice as efficient at reconstructing B mesons as 400 physicists working together over 10 years. The other is the implementation of the NeuroBayes expert in hardware for the next-generation B-factory experiment Belle II: here, more than 10 billion decisions will be taken directly at the sensor array in order to determine which parts of the detector should be read out to the computers at all.

Large businesses in retail, travel and transport, and industry are the most important customers of Blue Yonder’s offerings. Blue Yonder performs demand predictions and even complete decision automation across the entire supply chain for many retailers and CPG companies. Dynamic pricing in online as well as brick-and-mortar shops is another hot topic with very large potential. In marketing optimization, we invented a new algorithm to predict whether a customer will change his behavior as a result of being sent a catalogue.
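Blue Yonder’s algorithm for this is not public; purely as an illustration of the general idea of uplift modelling, a simple “two-model” sketch on synthetic data might look like this (all feature names and numbers are invented):

```python
# Rough sketch of the generic two-model uplift approach, not Blue Yonder's
# proprietary algorithm: estimate the change in purchase probability caused
# by sending a catalogue. Features and data are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
X = rng.normal(size=(n, 3))                    # invented features, e.g. recency, frequency, spend
treated = rng.integers(0, 2, size=n)           # 1 = catalogue was sent
base = 1.0 / (1.0 + np.exp(-(X[:, 0] - 1.0)))  # baseline purchase probability
uplift = 0.10 * (X[:, 1] > 0)                  # the catalogue only helps some customers
bought = rng.random(n) < base + treated * uplift

model_treated = LogisticRegression().fit(X[treated == 1], bought[treated == 1])
model_control = LogisticRegression().fit(X[treated == 0], bought[treated == 0])

# Estimated uplift per customer: send catalogues only where the lift is largest.
est_uplift = model_treated.predict_proba(X)[:, 1] - model_control.predict_proba(X)[:, 1]
print("customers with estimated uplift above 5%:", int((est_uplift > 0.05).sum()))
```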

HPC Review: For businesses in general, how can the technology help improve their operation?  In particular, how much can predictive analytics automate business processes?

Feindt: One of the killer applications is the prediction of the demand for each single article in each single store on each day, and the computation of optimal orders. Especially for perishable food this is of great value: for one customer we could avoid food waste on the order of €25 million in one year while simultaneously having fewer out-of-stock situations. The secret is individualization: to optimize not only one strategic goal, but to break it down into the thousands or millions of operational decisions taken each day. No human can take so many factors on so many articles into account each day and get it right.
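To make the link between a predicted demand distribution and an order decision concrete, here is a minimal sketch (illustrative numbers only, not customer data or the production system): given asymmetric costs for wasted versus missed apples, the order that minimizes the expected cost can be read directly off the predicted distribution.

```python
# Illustrative sketch: choose the order quantity that minimizes expected cost
# under a predicted demand distribution. Distribution and per-unit costs are
# made-up numbers.
import numpy as np

demand = np.arange(0, 21)                      # possible numbers of apples sold tomorrow
probs = np.exp(-((demand - 8) ** 2) / 18.0)    # assumed predicted density, peaked near 8
probs /= probs.sum()

cost_waste = 0.30      # EUR lost per unsold apple (illustrative)
cost_stockout = 1.20   # EUR lost per missed sale (illustrative)

def expected_cost(order):
    waste = np.maximum(order - demand, 0)
    missed = np.maximum(demand - order, 0)
    return float(np.sum(probs * (cost_waste * waste + cost_stockout * missed)))

best = min(demand, key=expected_cost)
print(f"order {best} apples; expected cost {expected_cost(best):.2f} EUR")
```

Because a missed sale is assumed to cost more than a wasted apple, the optimal order lands above the median of the predicted distribution: exactly the waste-versus-out-of-stock trade-off described above, repeated for every article, store, and day.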

The term “prescriptive analytics” stands for not only delivering the prediction, but also optimizing the decision to be taken on that basis and handing it as a recipe to the human expert. The interesting observation is that cognitive biases still lead humans to decide wrongly in this setting: gut feeling often overrules the machine decision. The full potential, up to four times the effect of prescription alone, is only realized if nearly full automation (99.9%) is achieved and only exception handling is left to the human expert.
 
There are many other examples like this; they all go in the same direction. I am very confident that more and more operational decisions, and thus also white-collar work, will be automated.

HPC Review: Some business people will be apprehensive about relying on this level of automation for decisions that were traditionally under human control. What would you say to allay those fears?

Feindt: That’s right. Trust has to be built, and references obviously help. But Lenin already knew: to accept anything on trust, to preclude critical application and development, is a grievous sin. Thank god you can measure the improvement. In A/B tests you can prove that modern algorithms perform better than human decisions on many, even classically contradictory, KPIs. The usual approach is to start the test with small groups (shops, articles), prove that it really works there, and then roll it out fully in a few steps, with control at each stage. This way risk is minimized and trust is built.
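As a hedged sketch of how such a staged A/B measurement might be evaluated (the shop counts and waste figures below are invented), one could compare a KPI such as daily waste between test and control shops with a simple significance test:

```python
# Illustrative only: compare daily waste (EUR per shop) between test shops
# (automated ordering) and control shops using Welch's t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
waste_control = rng.normal(loc=120.0, scale=25.0, size=40)  # 40 control shops (synthetic)
waste_test = rng.normal(loc=95.0, scale=25.0, size=40)      # 40 test shops (synthetic)

t, p = stats.ttest_ind(waste_test, waste_control, equal_var=False)  # Welch's t-test
print(f"mean daily waste: test {waste_test.mean():.1f} EUR vs control {waste_control.mean():.1f} EUR")
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```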

But independent of all rational arguments, there is often quite some resistance against any innovation and change, especially in hierarchical systems. It needs C-level sponsors and good concepts for how to communicate and organize the change. The danger in not going for automation, or delaying it for too long, is the sharp competition: the advantage is so large that it could kill entire companies if only their competitors become more efficient.
