A machine learning system tells dramatic stories about the commercial real estate auctions it watches, encouraging customer reengagement and new customer acquisition.
I recently visited a customer whose product is a platform for online property auctions. As I sat in the “control center” watching an auction in real time, numbers flying across the dashboard, I began to sweat. Here was a shopping mall, market value twenty-one million dollars, ticking further and further up past the seller’s minimum price of twenty million.
Two bidders, John and Serena, were fiendishly battling it out: Serena waiting until the last second to place a high bid, John immediately responding with a determined bid of his own, consistently five hundred thousand higher. Again and again the two fought back and forth, driving the price up, up, up. The price was now over twenty-four million dollars! I gripped my laptop.
Five seconds remaining. Serena slammed in another big bid, extending the clock. Immediately, John, not to be outbid, submitted another bid, another five hundred thousand over Serena. Serena couldn’t stand it; she finally backed down. Time expired. John scored the perfect mall to complete his Indiana portfolio, and the seller collected a huge payday: four and a half million dollars more than she was willing to accept!
As I sat there, simultaneously adrenaline-charged and exhausted, I asked how the company would showcase this dramatic example of their product delivering a great win for both buyer and seller. “This happens every day,” they casually answered. “The data is stored in our data lake. Only the analysts will ever see it again.” What an opportunity! Let’s build a system to tell this story to future users of the platform, highlight the huge benefit of using the product, and drive customer engagement.
Building a narrative summary, as I’ve done in the opening of this article, is an emotional, complex process. We want to distill the key details of an extended procedure with hundreds of user interactions and tens of thousands of data points. Assuming there are too many auctions per week for human analysts to summarize them all (there are), how do we go about completing this task?
Narrative summarization by machine learning is a widely researched and (at the time of writing) newly applied suite of methodologies. I’ll walk through a common one as an analogy, then turn to the auction setting and data described above.
Automated image captioning systems are widely used in industry to build a narrative summary of the information contained in photographs. The system is composed of two pieces linked sequentially: a convolutional neural network (CNN) and a recurrent neural network (RNN). An image is fed into the CNN, which compresses it into a dense representation called a latent vector. This latent vector is a condensed array of floating-point numbers capturing the salient information in the image. Latent vectors live in an information space where similar concepts point in similar directions and dissimilar concepts point in very different, nearly “orthogonal,” directions.
An example: a latent vector representing a picture of a boy throwing a ball should point in a similar direction to one representing a boy throwing a frisbee, but in a very different direction than one representing a train pulling freight cars. This condensed latent space is what feeds the second half of the image captioning system.
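To make that similarity idea concrete, here’s a minimal sketch in Python. It uses a pretrained ResNet-18 as the encoder and compares latents by cosine similarity; the model choice and image filenames are illustrative stand-ins, not the captioning system itself.

```python
# A minimal sketch: extract a latent vector from an image with a
# pretrained CNN, then compare two latents by cosine similarity.
# Assumes PyTorch/torchvision are installed; filenames are hypothetical.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained ResNet-18 with the classification head removed, so the
# output is the pooled 512-dim feature (our "latent vector").
resnet = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder = torch.nn.Sequential(*list(resnet.children())[:-1]).eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def latent(path: str) -> torch.Tensor:
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return encoder(img).flatten()  # shape: (512,)

# Similar scenes should yield high cosine similarity; unrelated
# scenes should yield a much lower value.
v_ball = latent("boy_throwing_ball.jpg")       # hypothetical file
v_frisbee = latent("boy_throwing_frisbee.jpg") # hypothetical file
v_train = latent("train_pulling_freight.jpg")  # hypothetical file
cos = torch.nn.functional.cosine_similarity
print(cos(v_ball, v_frisbee, dim=0))  # expect: relatively high
print(cos(v_ball, v_train, dim=0))    # expect: relatively low
```

In a trained captioning system the encoder is learned alongside the decoder rather than borrowed off the shelf, but the intuition is the same: nearby vectors, similar scenes.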
The latent vector output by the CNN is the input to the RNN, which consumes it and generates a stream of integers, one at a time, each corresponding to a word in the English language. This encoded, or “tokenized,” output is translated back into English words by simple substitution, et voilà! You’ve got an English summary of what’s in your image. At a high level, the architecture is pretty minimal, right? An image goes into a CNN, which feeds an RNN, which feeds a reverse tokenizer. That’s it.
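Here’s a minimal sketch of that second half, assuming a toy vocabulary and an untrained GRU decoder. Every name and size below is hypothetical; a real system would learn these weights from image-caption pairs.

```python
# A minimal sketch of the decoding half: an RNN seeded with the CNN's
# latent vector greedily emits token ids, and a reverse tokenizer
# maps them back to words by simple lookup.
import torch
import torch.nn as nn

VOCAB = ["<start>", "<end>", "a", "boy", "throws", "ball"]  # toy vocab
word_to_id = {w: i for i, w in enumerate(VOCAB)}

class CaptionDecoder(nn.Module):
    def __init__(self, latent_dim=512, embed_dim=64, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(len(VOCAB), embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.to_vocab = nn.Linear(hidden_dim, len(VOCAB))

    @torch.no_grad()
    def caption(self, latent, max_len=20):
        # The image latent initializes the RNN's hidden state.
        h = self.to_hidden(latent).view(1, 1, -1)
        token = torch.tensor([[word_to_id["<start>"]]])
        words = []
        for _ in range(max_len):
            out, h = self.rnn(self.embed(token), h)
            token = self.to_vocab(out).argmax(dim=-1)  # greedy pick
            word = VOCAB[token.item()]
            if word == "<end>":
                break
            words.append(word)  # reverse tokenization by lookup
        return " ".join(words)

decoder = CaptionDecoder().eval()
print(decoder.caption(torch.randn(512)))  # untrained, so word salad
```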
In the case of John and Serena’s bidding war, there’s only one simple change to make: replace the CNN with an RNN (or, for the nerds out there, with a multi-head self-attention transformer model). The architecture then looks like this: auction data goes into an RNN, which feeds another RNN, which feeds a reverse tokenizer. Simple enough. The key is getting the auction data into the right shape. Since this isn’t a machine learning publication, I’ll spare you the gritty details and instead give you a picture of what the overall architecture looks like.
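For the curious, though, here’s a rough sketch of what “the right shape” could mean: each bid becomes a row of numeric features, and an encoder RNN rolls the whole bid sequence up into one latent vector. The fields, sizes, and numbers below are invented for illustration, not the customer’s actual schema.

```python
# A minimal sketch of the encoder half for auction data: each bid is a
# fixed-width feature vector, and an RNN reads the bid sequence and
# emits a single latent vector summarizing the auction.
import torch
import torch.nn as nn

# One row per bid: (seconds_elapsed, bid_usd_millions, bidder_id)
bids = torch.tensor([
    [  5.0, 20.5, 0],   # Serena opens just above the minimum
    [  6.0, 21.0, 1],   # John responds half a million higher
    [ 58.0, 23.5, 0],
    [ 59.0, 24.0, 1],   # John's final, winning bid
])

class AuctionEncoder(nn.Module):
    def __init__(self, n_features=3, latent_dim=512):
        super().__init__()
        self.rnn = nn.GRU(n_features, latent_dim, batch_first=True)

    def forward(self, events):
        # events: (seq_len, n_features); add a batch dimension.
        _, h = self.rnn(events.unsqueeze(0))
        return h.squeeze()  # the auction's latent vector, (latent_dim,)

encoder = AuctionEncoder().eval()
with torch.no_grad():
    latent = encoder(bids)
print(latent.shape)  # torch.Size([512]): feed this to the decoder RNN
```

That latent vector plays exactly the role the CNN’s output played in the captioning system: the decoder never sees the raw bids, only the condensed summary.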
The goal of a narrative summary is to identify the key details of an extended procedure with many data points and tell a story about them. Doing this with human analysts is costly and slow. An automated narrative summarization system drives customer engagement with narrative exposition rather than cold data: instead of exposing vast datasets and systems to users, machine learning turns it all into engaging stories.