AWS re:Invent 2022 - How Taco Bell is improving digital availability with ML forecasting (AIM333)
Accurate demand forecasting can help a company maximize revenue and minimize inventory costs. Traditional approaches to deploying ML-based forecasting required customers to click through user interfaces or learn how to use APIs and Jupyter notebooks. Both of these increase the barrier to ongoing, sustainable production operation at scale. In this session, learn how customers like Taco Bell are deploying no-code solutions that automate the Amazon Forecast pipeline using a number of AWS services, such as AWS Step Functions and AWS CloudFormation. This solution helps deploy and manage many decoupled and concurrent enterprise-grade workloads.
ABOUT AWS Amazon Web Services (AWS) hosts events, both online and in-person, bringing the cloud computing community together to connect, collaborate, and learn from AWS experts.
AWS is the world’s most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers—including the fastest-growing startups, largest enterprises, and leading government agencies—are using AWS to lower costs, become more agile, and innovate faster.
Content
0.33 -> - Good afternoon everyone,
1.44 -> and welcome to breakout session "AIM 333".
5.73 -> Today we are gonna be talking about
7.92 -> how Taco Bell is using
10.26 -> machine learning based forecasting
12.33 -> to improve digital availability
14.79 -> using Amazon Forecast.
17.28 -> My name is Brandon Nair,
18.84 -> and I am a senior product manager
20.76 -> on Amazon Forecast,
22.47 -> which is a fully managed
AWS machine learning service
26.52 -> that puts highly accurate demand forecasts
29.79 -> into the hands of
developers and enterprises
32.58 -> with few to no lines of code required.
35.97 -> With me, I have my
colleague Charles Laughlin,
38.34 -> who is a principal AI
ML specialist at AWS,
43.14 -> as well as Niraj Revankar,
44.91 -> who is a director of data
and analytics at Taco Bell.
48.75 -> And he's here to talk about his experience
51.06 -> of using Amazon Forecast
53.07 -> to derive business value at Taco Bell.
59.85 -> In terms of the agenda,
61.44 -> I will start us off
62.52 -> by providing an introduction
into Amazon Forecast
65.91 -> as a fully managed service.
68.31 -> I will talk through our core competencies,
70.59 -> as well as some of our marquee features.
73.71 -> Then I will hand over to Niraj,
75.3 -> who will go through his experience
77.58 -> of how his team used Amazon Forecast
80.67 -> to increase digital
availability at Taco Bell.
84.9 -> We'll then hand over to Charles,
86.82 -> who will walk us through
how Amazon Forecast
89.55 -> is purpose-built to reduce
your time to market.
93.54 -> And he will close off the presentation
95.22 -> by providing you some resources
96.9 -> on how to get started
98.49 -> on using Amazon Forecast service.
104.91 -> Amazon Forecast centers
itself around the science
108.21 -> of using previous historical data
111.24 -> in order to predict future values.
114.15 -> Now, generating demand forecasts
116.52 -> is a key business input
118.59 -> into a broad range of
industries and use cases
122.22 -> that span from inventory planning
124.98 -> to supply chain optimization
128.04 -> to financial planning to
energy demand planning,
131.43 -> to even forecasting workforce
planning requirements.
135.09 -> And that list goes on.
137.37 -> At Amazon Forecast,
139.11 -> we make it easy
140.19 -> to obtain accurate ML
based forecasting models
144.12 -> that are approximately 50% more accurate
147.27 -> than your traditional
statistical forecasting models.
150.63 -> And we make it easy
151.59 -> to deploy these models into production.
157.89 -> Amazon Forecast is built
on the AWS ML stack.
162.48 -> Now, you're able to build
machine learning models
165.96 -> at any level of the AWS ML stack.
169.71 -> However, the bottom two layers,
171.87 -> which are the ML services
173.82 -> and the infrastructure
and frameworks layer
177.96 -> tend to be geared toward those users
181.11 -> that are more akin
182.37 -> to technical expert machine
learning practitioners,
186.54 -> or those that want to invest the time
189.09 -> to build and deploy these models
191.73 -> from the ground up
192.81 -> and manage their own infrastructure.
196.71 -> Amazon Forecast sits at
the top of the ML stack.
200.22 -> What this means is that
204.09 -> we're able to bucketize
205.62 -> all the capabilities of the ML stack
208.05 -> into a service that allows users
210.78 -> to access accurate machine learning models
213.21 -> without the need to be an ML expert.
219.498 -> And as a fully managed service,
221.82 -> Amazon Forecast derives value
224.07 -> for our business customers.
225.54 -> In one, you don't have
to be a technical expert
229.38 -> in order to reap the benefits
231.33 -> of ML based forecasting.
233.79 -> And two, you don't have to invest the time
236.79 -> and the resources to
write out all the code
240.57 -> to train, build, test, deploy these models
244.56 -> by yourselves, and that
saves you time to market.
248.28 -> Later in our discussion,
249.48 -> Charles will take us through
250.56 -> exactly how Amazon Forecast is able
252.69 -> to save you much time to market.
256.62 -> Users of Amazon Forecast
are able to access it
259.41 -> either directly through API calls
262.23 -> or through the AWS management console.
265.74 -> And to use the service,
267.15 -> users will bring in data
268.71 -> in the form of historical data,
270.54 -> which represents their demand forecasts
272.61 -> that they are trying to predict for,
275.01 -> and optionally related data such as price,
278.7 -> and item metadata such as brand name,
281.55 -> which can be very important
283.26 -> in improving the accuracy
285.15 -> of a machine learning model.
287.85 -> Once the data has been imported,
290.79 -> Amazon Forecast takes
care of the processes
294.21 -> to build out these models
295.62 -> and deploy them at scale.
300.63 -> Let's dive a little bit deeper
302.4 -> into the Amazon Forecast service.
306.18 -> Amazon Forecast focuses
on driving business value
310.47 -> for our customers by
targeting three pillars.
314.94 -> The first is to simplify
and automate the process
319.23 -> to obtain accurate ML forecasting models.
323.61 -> The second is to provide capabilities
325.89 -> that allow you to deploy
327.42 -> to operation these models,
329.25 -> as well as manage them
when they're in operation.
332.73 -> And the third is to provide tools
334.83 -> that allow you to derive business insights
337.62 -> from your forecasting models,
339.24 -> so that you're actually able
340.77 -> to make business decisions.
343.32 -> I will go through each of
these pillars in detail.
347.61 -> So the very first pillar
349.02 -> is focused on simplifying that process
351.72 -> to build a machine learning model.
353.91 -> And that process really starts
355.74 -> with your source data.
358.53 -> Now I'm sure many of you in the audience
360.36 -> will empathize with this,
361.47 -> but source data needs to be cleaned,
365.28 -> and it needs to be shaped appropriately
367.59 -> for you to build an accurate
machine learning model.
370.38 -> And let's face it,
source data has pitfalls.
374.46 -> You can run into source data
375.57 -> that has missing values, for example.
378.9 -> Amazon Forecast provides a feature
380.76 -> that allows you to handle such a scenario,
384.06 -> where a user can define
an imputation strategy
387.63 -> of how to fill in and take care
389.58 -> of those missing values.
391.35 -> For example, you can fill
it in by using the mean
394.11 -> or the median of your data sets.
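The imputation strategy described here can be sketched in a few lines of Python. This is an illustrative stand-in for the service's missing-value feature, not its internal code; the function name and sample values are invented:

```python
from statistics import mean, median

def impute_missing(series, strategy="mean"):
    """Fill None gaps in a demand series using the mean or median
    of the observed (non-missing) values."""
    observed = [v for v in series if v is not None]
    if not observed:
        raise ValueError("series has no observed values to impute from")
    fill = mean(observed) if strategy == "mean" else median(observed)
    return [fill if v is None else v for v in series]

# Six days of demand with two missing observations.
demand = [10, None, 20, 12, None, 16]
print(impute_missing(demand, "mean"))    # gaps filled with 14.5
print(impute_missing(demand, "median"))  # gaps filled with 14.0
```

In Amazon Forecast itself, the equivalent choice is made declaratively (per dataset, when configuring the predictor) rather than in user code.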
398.19 -> We've also heard from customers
399.99 -> that their data sets or
their demand forecasts
404.174 -> are sometimes also impacted
by external factors,
408.75 -> where their source data does not
410.91 -> take that into account.
412.86 -> For example, the timing of holidays.
415.8 -> That's when companies go
through ebbs and flows
417.84 -> in their forecasts.
420.54 -> With Amazon Forecast,
421.673 -> you are able to enrich your source data
425.19 -> by adding in data from holidays
428.01 -> as well as weather to
create a training data set
431.55 -> to obtain a more accurate
machine learning model.
437.04 -> Once your data has been treated,
440.04 -> Amazon Forecast takes
care of the featurization,
443.79 -> and then it moves over
into the ML build process.
448.26 -> At this stage,
449.31 -> customers have the benefit
450.81 -> of using Amazon Forecast's AutoML capability,
454.44 -> called AutoPredictor.
456.84 -> AutoPredictor uses an ensembling strategy
460.38 -> in order to create a more accurate model.
464.85 -> When you call the AutoPredictor API
466.98 -> to train a model,
468.969 -> it will provision up to 20 clusters,
475.2 -> including neural network algorithms.
478.56 -> It conducts the hyperparameter optimization
480.93 -> automatically and it goes through
483.72 -> an ensemble routine in order to obtain
487.38 -> a blended ensemble model
489.72 -> across six different algorithms
492.51 -> in order to produce a model
494.28 -> that has the least error rate.
496.74 -> And using AutoPredictor,
498.51 -> users are able to obtain
a forecasting model
501.78 -> that is 40% more accurate.
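One common way to blend multiple trained models into a single ensemble is to weight each model by the inverse of its backtest error, so the more accurate algorithms contribute more. The sketch below illustrates that heuristic; Forecast's actual blending strategy is internal to the service, and the model names and error numbers here are invented:

```python
def blend_forecasts(model_preds, model_errors):
    """Combine per-model forecasts into one ensemble forecast,
    weighting each model by 1 / its backtest error."""
    weights = {m: 1.0 / e for m, e in model_errors.items()}
    total = sum(weights.values())
    horizon = len(next(iter(model_preds.values())))
    return [sum(weights[m] * model_preds[m][t] for m in model_preds) / total
            for t in range(horizon)]

preds = {"ets": [100, 110], "deepar": [120, 130]}
errors = {"ets": 0.2, "deepar": 0.1}  # lower error -> higher weight
print(blend_forecasts(preds, errors))  # pulled toward the deepar forecast
```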
507.21 -> Once your forecasting model is trained,
510.42 -> we also provide you with error metrics
513.03 -> and explainability data
515.1 -> so that you are able to better understand
517.5 -> exactly how your time series data
521.19 -> is influencing your forecast.
528.9 -> Amazon Forecast also provides you
531 -> with capabilities that allow you
532.86 -> to operationalize and maintain
534.81 -> your models in production.
538.2 -> With Amazon Forecast,
539.033 -> you are able to host your models,
541.92 -> you are able to compute
inferences on a schedule,
546.36 -> and we provide you with features
548.34 -> such as Model Monitor.
550.8 -> Model Monitor solves the problem
553.02 -> of allowing you to track the performance
556.29 -> of your model or the accuracy
of your model over time.
560.01 -> And the end goal of that exercise
561.78 -> would be to identify
when there's degradation
564.72 -> in the accuracy.
566.64 -> This can frequently occur
568.02 -> when the data that was used
570.33 -> to train your original model
573.27 -> is no longer representative
of your business
575.88 -> in its current state.
577.92 -> When you do identify
579.15 -> that there is such a degradation,
580.95 -> you are able to use Amazon Forecast
582.87 -> to retrain your model
584.34 -> so that you always have an accurate model
586.56 -> that is reflective of your business.
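The degradation check a model monitor performs can be approximated as: score each new evaluation window against actuals, and flag retraining when the error drifts well past the baseline measured at training time. A simplified sketch using MAPE, with an invented tolerance threshold; the service's Model Monitor tracks more metrics than this:

```python
def mape(actuals, forecasts):
    """Mean absolute percentage error over one evaluation window."""
    return sum(abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)) / len(actuals)

def needs_retraining(windows, baseline_mape, tolerance=0.5):
    """Flag degradation when any recent window's MAPE exceeds the
    baseline by more than `tolerance` (50% worse by default)."""
    return any(mape(a, f) > baseline_mape * (1 + tolerance)
               for a, f in windows)

baseline = 0.10
windows = [([100, 120], [95, 115]),  # MAPE ~0.046: healthy
           ([100, 120], [70, 90])]   # MAPE ~0.275: data has drifted
print(needs_retraining(windows, baseline))  # True -> retrain the model
```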
590.76 -> The output of Amazon
Forecast is an S3 file
595.71 -> that contains your forecasts,
597.72 -> along with the quantiles that you specify.
600.87 -> And with it being in CSV format,
603.3 -> the free-flowing nature allows you
605.34 -> to really plug into your
existing data pipelines
609.48 -> and your existing systems
611.34 -> with minimal changes
required to those systems.
615.21 -> Later in our discussion,
616.74 -> Niraj will talk us through
617.88 -> how his team was able
to use Amazon Forecast
621.18 -> and CloudFormation templates
623.31 -> in order to integrate into
their existing systems
625.95 -> at Taco Bell.
632.25 -> The third pillar that we focus on
633.78 -> is on deriving business insights,
635.88 -> and here we are primarily focusing on
638.1 -> a new feature called "what-if analysis".
641.73 -> We frequently hear from our customers
644.07 -> that they're looking for
ways to better understand
647.64 -> their business drivers,
649.17 -> and how that impacts forecast,
651.96 -> so that they're able to make decisions
653.73 -> to appropriately optimize
their businesses.
657.03 -> For example, in retail,
659.1 -> a typical scenario is when a retailer
662.07 -> is conducting promotion planning,
665.04 -> and a question a retailer may ask is,
668.19 -> across, say, a thousand SKUs,
671.976 -> 'what is the optimal price point
674.28 -> across these thousand SKUs
676.8 -> that maximizes the demand
of a particular store?'
681.247 -> "What-if analysis" is geared to answer
683.46 -> that very same question.
685.74 -> Using "what-if analysis"
with Amazon Forecast,
689.34 -> you're able to transform your future data.
692.25 -> For example, you're
able to reduce the price
694.65 -> by 15%, and with your already
trained ensemble model
700.29 -> that has learned the relationships
702.27 -> between your products,
703.77 -> as well as the characteristics
705.99 -> that drive your forecast trends,
708.63 -> you're able to predict a new scenario,
711.6 -> a new forecast under this new scenario
713.91 -> with a lower price point.
715.943 -> At that stage,
717.12 -> you are able to compare your forecast
719.91 -> between your base scenario
and your new scenario
723.15 -> and identify which is better
725.01 -> for your particular business scenario.
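The base-versus-scenario comparison can be illustrated with a toy demand model. Note the hedge: Forecast's what-if analysis re-predicts the scenario with your trained ensemble model, whereas this sketch substitutes a fixed constant-elasticity assumption purely so the comparison is concrete; the elasticity value is invented:

```python
def whatif_demand(base_forecast, price_change, elasticity=-1.5):
    """Project demand under a price change using a constant-elasticity
    assumption (illustrative stand-in for a model-driven re-prediction)."""
    factor = (1 + price_change) ** elasticity
    return [round(d * factor, 1) for d in base_forecast]

base = [100, 120, 90]
scenario = whatif_demand(base, price_change=-0.15)  # the 15% price cut
print(scenario)
print(f"incremental units vs. base: {sum(scenario) - sum(base):.1f}")
```

Comparing the two lists is exactly the base-scenario-versus-new-scenario step described above, just with a toy model behind it.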
731.49 -> And today, I'm happy to
announce a new feature
734.46 -> that we launched just two weeks ago.
737.34 -> This feature allows customers to forecast
741.42 -> for cold-start products
743.19 -> up to 45% more accurately than before.
748.14 -> Cold-start forecasting
is an age-old problem.
752.49 -> How do you create a forecast
754.44 -> when you don't have any historical data
757.32 -> to make that forecast?
759.9 -> This frequently happens in industries
762.45 -> where you see a constant product turnover,
765.9 -> such as in CPG or retail or manufacturing.
771.57 -> And it typically occurs
772.74 -> when companies are
bringing net new products
775.71 -> onto the market for the very first time
779.4 -> or bringing in brands and catalogs
783.21 -> that may exist in the markets,
785.07 -> but have never been sold by that business,
788.61 -> or even expanding product lines
790.98 -> into new territories.
793.89 -> In each of these cases,
795.383 -> there's no historical data
797.46 -> in order to establish your forecast.
800.07 -> And so when you look at traditional
802.41 -> statistical forecasting models
804.57 -> such as ARIMA and ETS, they fall short
808.68 -> because these models need historical data
811.5 -> in order to establish the parameters
813.75 -> that allow you to make a forecast.
817.32 -> And here's where neural networks
818.7 -> really play a key role.
821.13 -> At Amazon Forecast,
823.05 -> we are employing a novel technique
826.35 -> that looks at the item metadata
828.57 -> within your data set to
identify those products
832.77 -> that are most closely related
834.87 -> to the cold-start product
836.43 -> that you are trying to forecast for.
840.07 -> With the neural networks
already establishing
842.37 -> relationships between your products,
844.89 -> we're able to use those specific products
847.83 -> to impute a forecast
850.05 -> and ultimately create
your cold-start forecast.
853.86 -> And I'm happy to say that this product
855.51 -> is fully GA in all regions supported
859.02 -> by Amazon Forecast.
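The nearest-neighbor idea behind cold-start forecasting can be sketched with an explicit metadata similarity. This is illustrative only: the service learns item relationships with neural networks rather than a Jaccard lookup, and the catalog items, attribute sets, and `k` below are invented:

```python
def jaccard(a, b):
    """Similarity of two item-metadata attribute sets."""
    return len(a & b) / len(a | b)

def cold_start_forecast(new_item_meta, catalog, k=2):
    """Impute a forecast for an item with no history by averaging the
    forecasts of its k most metadata-similar catalog items."""
    ranked = sorted(catalog,
                    key=lambda it: jaccard(new_item_meta, it["meta"]),
                    reverse=True)
    neighbors = ranked[:k]
    horizon = len(neighbors[0]["forecast"])
    return [sum(n["forecast"][t] for n in neighbors) / k
            for t in range(horizon)]

catalog = [
    {"id": "crunchy-taco", "meta": {"taco", "beef", "crunchy"}, "forecast": [100, 110]},
    {"id": "soft-taco",    "meta": {"taco", "beef", "soft"},    "forecast": [80, 90]},
    {"id": "nacho-fries",  "meta": {"side", "potato"},          "forecast": [200, 210]},
]
# New product with no sales history: forecast borrowed from the two tacos.
print(cold_start_forecast({"taco", "chicken", "crunchy"}, catalog))
```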
864.3 -> Amazon Forecast already caters
866.01 -> to a wealth of customers,
867.63 -> and that spans both
industries and use cases,
871.26 -> be it retail use cases for example,
874.47 -> with More Retail,
876.21 -> or workforce planning with fornamics,
880.26 -> or travel and hospitality
with affordable tours.
884.61 -> We are able to create
forecasting solutions
887.13 -> that meet each of these
different use cases,
890.76 -> and I'm looking forward
to a discussion later
892.89 -> where I could learn a bit more
894.54 -> about your interest in forecast,
896.04 -> and the use cases
896.873 -> that you are looking to support.
898.89 -> But for now, I will hand it over to Niraj,
901.617 -> who will walk us through his experience
903.96 -> of using Amazon Forecast.
909.51 -> Green is go.
- Green.
913.02 -> Thanks, Brandon. It was a great overview
915.87 -> of the forecast product,
917.43 -> and what I took from there
919.89 -> is companies like us
921.48 -> that have gathered so much data,
924.33 -> the vertical focus for Amazon
927.63 -> and providing some of those tools
929.28 -> in helping companies like us bring our own data,
933.18 -> and then tie in these services
935.64 -> to get better time to
market is pretty exciting.
938.94 -> And I think data is core to what we do,
941.64 -> and I'll share a little bit,
942.96 -> but getting that quicker
945.45 -> into the hands of the customers,
946.56 -> and serving the business well
947.46 -> is what I'm most excited about.
949.62 -> Hey guys, thanks all
for joining the session.
951.57 -> My name is Niraj Revankar,
953.37 -> and I'm a director for data and analytics
955.35 -> at Taco Bell,
956.73 -> and I'm responsible for
the data engineering,
959.97 -> business intelligence, data science,
962.82 -> and reliability services,
965.1 -> and really have two main objectives.
968.28 -> One is make sure that we have
972.09 -> superior customer experiences
974.28 -> that's powered by data.
976.5 -> And the second one is making it easier
978.27 -> for our team members that
are in the restaurants,
980.85 -> and making it easier for them
981.99 -> to serve the customers.
984.63 -> A little bit about Taco Bell,
986.49 -> but just by show of hands,
987.66 -> how many of you have actually been
989.16 -> to a Taco Bell?
992.16 -> That's what I was expecting.
994.82 -> We got a funny story,
996.81 -> just the number of folks that have.
999.42 -> We looked at the population of the U.S.,
1001.49 -> and the folks that have actually been
1002.66 -> to Taco Bell once,
1004.91 -> it's a pretty large population.
1006.29 -> So, how many of you actually used
1009.23 -> an app or a delivery service
1011.66 -> to get your food, Taco Bell food?
1015.35 -> Sounds about right.
1016.609 -> What's interesting, when I looked at it,
1019.76 -> is delivery for QSR
1022.52 -> was an interesting concept for me,
1024.569 -> but what we are seeing is that yes,
1026.36 -> folks are really gravitating
1027.92 -> towards using digital,
1030.62 -> and it's an important one for us.
1032.99 -> I think it's an important channel.
1034.55 -> We do a billion plus orders per year,
1037.13 -> so that's an incredible number.
1038.69 -> So that's a lot of tacos.
1040.55 -> And one thing I heard,
1043.16 -> our average time from order
1045.89 -> to delivery is less than four minutes.
1048.38 -> So if you start to think about
1049.58 -> the number of restaurants,
1052.16 -> the number of tacos that we sell
1054.56 -> in a record amount of time,
1055.76 -> it's hugely efficient.
1059.69 -> So yeah, it's pretty exciting.
1062.57 -> And data is playing a great part,
1063.98 -> and talk a little bit about that.
1068.93 -> The tech at Taco Bell,
1070.28 -> we are undergoing a digital transformation
1073.49 -> for our technology, right?
1074.51 -> So if you think about a few years ago,
1076.52 -> you had a couple of channels.
1079.04 -> You went through the drive-thru,
1081.86 -> the front counter to order your products.
1084.47 -> But with the expansion
in digital channels,
1087.32 -> we are seeing app,
1088.46 -> we have the kiosk with that delivery.
1093.02 -> So, the touch points have increased.
1095.18 -> And as you can imagine,
1098.72 -> there is a level of complexity
1101.63 -> that comes with it,
1103.07 -> and how you integrate all those orders.
1104.9 -> You gotta make sure
1105.733 -> that the order's hitting the kitchen,
1107.81 -> that the food is prepared,
1108.95 -> and it's fresh for you guys to pick up.
1111.14 -> So with a lot of this,
1113.66 -> I think the key piece
that I'm excited about
1115.58 -> is the fact that a lot
of these are connected,
1118.79 -> part of our connected restaurant strategy.
1120.98 -> And what that means is that
1122.27 -> each of these systems
push the data to cloud.
1125.99 -> And that's where I come in,
1127.37 -> where I can then take all that data,
1129.2 -> collect it, connect it, curate it,
1131.69 -> and make it available for consumption.
1135.95 -> So as we started to think about
1139.31 -> what we want to do with it,
1140.45 -> I think as we get into the Amazon Forecast
1142.73 -> and all the AIML services,
1144.05 -> I think it's key for us to think about
1146.51 -> what was the data strategy, right?
1147.8 -> We said hey look,
1148.85 -> let me take a step back and say,
1150.02 -> okay, well there's a lot of stuff
1150.98 -> you can do with it,
1152.42 -> but we need to make sure
that our architecture,
1156.08 -> our data strategy is first
1157.49 -> fully bolted on, right?
1158.48 -> It's not something that's an afterthought.
1161.57 -> So, data is sitting at the table
1164.54 -> with the technology
leaders to determine, okay,
1166.73 -> what are we gonna do with all this data
1168.2 -> that's coming out?
1170.33 -> Amazon provides the tech stack
1172.25 -> and the range of services
1173.72 -> for us to be able to do that.
1174.92 -> So we said hey look,
1175.753 -> we need a strategy, a lakehouse strategy
1178.28 -> to make sure that we've got
1180.17 -> multiple integration patterns,
1183.05 -> whether it's batch based,
1184.07 -> whether it's event based,
1185.45 -> whether it's real time,
1187.25 -> ability to store it,
1189.59 -> ability to then connect all that data,
1191.6 -> create offline decision models,
1194.42 -> and then make it available for consumption.
1196.1 -> And we'll talk a little bit about that.
1199.25 -> The other piece is it's not
just the restaurant data
1202.37 -> that we're talking about.
1203.27 -> We also have all of the market
1204.85 -> or the customer data.
1206.96 -> As our digital footprint grows,
1209.87 -> the number of customers grow.
1212 -> Folks that want to interact,
1213.35 -> want us to interact with them
1214.91 -> and opt in to email communications,
1217.7 -> we want to be able to make sure
1219.02 -> that we give them superior
customer experiences.
1221.99 -> So you start to think about that,
1223.13 -> and they call us,
1223.963 -> they call us like, there's bad food.
1225.38 -> Hopefully you guys have had instances
1227.84 -> where your food wasn't as fresh
1229.49 -> as you thought it would be.
1232.07 -> Making sure that we
capture that information
1234.35 -> and then circle back with you
1236.87 -> with offers or recovery processes
1239.66 -> is another key piece.
1240.56 -> So, putting all that together
1243.11 -> into a data architecture was a key step.
1246.68 -> And we've been doing this
1248.12 -> for about a year and a half.
1249.77 -> We've collected a lot of data, right?
1252.53 -> And we've collected a lot of data
1256.76 -> and now we've got our data scientists
1258.53 -> that are working on this.
1260 -> But there's a lot of
questions that come up.
1262.22 -> And what happened this year was,
1263.78 -> as we were looking at digital channels,
1265.31 -> we saw that yes,
1266.143 -> our digital channel growth is high.
1269.63 -> During Covid, I think
that's when I started
1271.79 -> to actually install QSR apps,
1274.76 -> 'cause I wanted contactless pick-up.
1276.86 -> I thought after Covid
it was gonna go down.
1279.11 -> What we were finding is
1279.943 -> that behavior is here to stay.
1281.6 -> I think it's the convenience factor.
1284.33 -> So we wanna win in digital,
1286.16 -> and a key piece of this
is how do we make sure
1290.72 -> that our channels
1293.09 -> are online and available.
1295.1 -> And I'll talk a little bit about
1296.608 -> why that's important.
1301.46 -> So, we have our own channels,
1303.56 -> which is our e-commerce
and our kiosk, right?
1305.6 -> We consider them our own channels.
1308.12 -> When you place an order,
1308.953 -> we have IOT based alerting systems
1311.42 -> to tell us if things go down
1313.85 -> so we can take an action.
1315.08 -> But when you are working
1316.55 -> with a third-party delivery,
1318.56 -> when you place an order on DoorDash,
1321.71 -> it goes to the app,
1323.3 -> there's an API integration
1324.74 -> with our POS and a kitchen system.
1328.04 -> We need to be able to then say okay,
1329.36 -> when is the driver gonna be dispatched
1331.25 -> so that we can make the food fresh
1333.17 -> and hand it over to the driver?
1334.88 -> So there's a lot of complexity,
1336.23 -> there's a lot of integrations that happen.
1340.13 -> But with any technology,
1343.16 -> one of the things that kind of drives
1345.14 -> some of these is algorithms.
1347.15 -> Now if for some reason,
1348.77 -> there's an integration point
1350.96 -> where it breaks down,
1351.86 -> let's say it doesn't go to the kitchen
1353 -> or the POS and it could happen,
1356.06 -> how do you then recover from that?
1358.67 -> Because if you have
too many cancellations,
1361.07 -> if you're a DoorDash app,
1363.92 -> you're gonna say okay,
1364.753 -> well it's bad customer experience.
1366.83 -> So they're gonna shut it down.
1369.95 -> And they have algorithms
to say that, right?
1372.41 -> After a certain point,
1375.08 -> we are gonna shut down the store.
1376.82 -> Now the challenge is,
1377.72 -> that we have to manually enable
1379.7 -> that store for DoorDash.
1381.92 -> So if you went into the
app and you say hey look,
1383.78 -> why don't I see this restaurant on my app?
1385.85 -> The reason is it's got disabled,
1388.01 -> and nobody enabled it.
1390.86 -> And we are working on trying to make that
1392.48 -> an automated process as well,
1394.34 -> but it does require human identification
1396.29 -> and intervention to
bring the store online.
1399.89 -> So what came to our team was like hey,
1402.5 -> how do I make sure that we know
1405.56 -> whether a store is online or offline,
1408.11 -> and how do I use data science
1409.7 -> to then get ahead of that issue?
1414.44 -> So the objectives were improving
1418.01 -> the stakeholder visibility to that, right?
1420.28 -> So I have a franchisee,
1422.48 -> I have a hundred stores,
1423.8 -> I wanna make sure that I know
1424.94 -> whether a store is online or offline.
1428.39 -> We could do that with all of our data
1430.76 -> 'cause it was prior-day data.
1430.76 -> We could say hey look,
1431.593 -> you didn't have a transaction yesterday.
1433.52 -> Maybe there's a problem. Go fix it.
1435.95 -> And a lot of times,
1436.783 -> that improved it considerably.
1439.01 -> But we want to take it to the next level
1440.6 -> where we can do this in real time.
1443.15 -> And where I think this all comes together
1445.34 -> is now we've got our POS systems
1446.9 -> that are real-time transmitting
1448.04 -> the data to cloud.
1449.84 -> We've got an offline decision system
1451.64 -> that we work with the Amazon Forecast team
1454.34 -> to kind of build the model.
1456.14 -> Now we can start to
connect and say hey look,
1458.03 -> within a few hours,
1459.44 -> we can determine whether or not
1462.68 -> the DoorDash may be off,
1464.87 -> or the channel may be off.
1467.66 -> So I think the key aspect there
1471.53 -> is optimization, right?
1474.2 -> As we start to think about,
1475.7 -> there's things that we could do
1477.029 -> with data and analytics,
1478.82 -> but then how do we optimize it?
1481.34 -> And that was one of the key objectives,
1484.85 -> and keep in mind,
1485.683 -> so if you think about
a billion transactions,
1488.18 -> a big portion of that is digital.
1490.34 -> A high number of that is delivery.
1493.7 -> And even if we fix a certain amount,
1497.69 -> it equates to almost adding
a few hundred stores,
1501.14 -> virtual stores to the Taco Bell system,
1503.42 -> 'cause if you think about our size,
1505.76 -> which is why this was such
1507.29 -> an important business problem to solve.
1511.19 -> So what we did was now we had the data,
1516.68 -> as most of you in the organization know,
1519.08 -> you've got a plan,
1520.13 -> you've got data scientist teams
1522.44 -> that are working on key projects,
1525.41 -> in our case, cook schedules,
1527.18 -> making sure we order the right inventory,
1529.64 -> all of that stuff.
1530.6 -> That's important, right?
1531.59 -> And when this issue came up to my team,
1534.68 -> we were like okay well,
1536.03 -> how do I continue to optimize?
1539.78 -> This is where we partnered
with Charles and team,
1541.64 -> which is what I'm most excited about.
1544.31 -> I said hey look,
1545.143 -> we've got all this data,
1546.58 -> we want to turn around and look at,
1548.93 -> what does this mean from
a business standpoint?
1552.59 -> So we got the NextGen POS data,
1554.38 -> we got the offline decision model
1556.85 -> that Amazon built,
1558.56 -> and we used a lot of the AWS services
1560.93 -> then to say okay,
1561.763 -> well let me send you
an email within an hour
1566.33 -> where we see that there's a problem,
1567.56 -> 'cause there's a lot of false positives,
1568.85 -> 'cause keep in mind that a delivery order
1572.69 -> is kind of sparsely populated, right?
1576.5 -> You're not gonna have delivery orders
1578.12 -> throughout the day.
1578.99 -> There are day parts,
lunch or maybe late night,
1582.11 -> or evenings where you can
have more delivery orders
1584.66 -> than the other.
1585.493 -> So we wanted something
that's accurate enough
1587.69 -> at a store level
1589.52 -> that kind of tells us that information,
1593.27 -> which is what we did
with the Amazon Forecast.
1595.31 -> And we got that, we ran the POS,
1597.11 -> the real-time POS that was coming in.
1599.72 -> And then within a couple of hours,
1601.25 -> if there was something like we expected
1603.62 -> during the day part for
this particular store,
1605.6 -> high delivery store, for example,
1609.59 -> we started to send emails
to those stores, right?
1613.4 -> Internally we wanted to try
1614.84 -> to track what's happening.
1618.14 -> So previously,
1620.12 -> you had to wait a day
1622.37 -> to know whether or not
your store was offline.
1624.47 -> Now, within a couple of hours,
1626.66 -> the stores knew that there's some problem,
1629.84 -> and as we kind of improve this,
1633.89 -> the forecasting capabilities,
1635.54 -> we're getting real-time
store hours information
1638.51 -> to make sure the store
was not really closed
1640.28 -> at that point,
1641.66 -> and it was really truly
an aggregator issue.
1646.25 -> The ability to tweak that,
1648.926 -> an ability to bring that all together,
1650.57 -> send an email, is very powerful.
1653.6 -> So one of the things our community likes,
1656.15 -> a franchisee community is like,
1657.47 -> they don't wanna go to a report
1658.91 -> and find out if something's
an issue, right?
1662.48 -> It needs to be on your phone,
1663.47 -> it needs to be a text message,
1665.27 -> it needs to be something actionable.
1666.59 -> And if you look at this,
1667.663 -> there's a lot of things behind it.
1670.91 -> We were able to do that
1671.81 -> in I think a little over four weeks.
1676.43 -> And to be able to get to that speed
1678.44 -> and make an impact on the business,
1682.55 -> I think was, to my mind,
1684.38 -> one of the biggest outcomes.
1686.36 -> So I think Charles is
gonna go into details
1689.78 -> on how this whole thing works,
1691.404 -> but really excited,
1693.02 -> and thanks for your partnership, Charles.
1694.64 -> I think it was a fantastic effort
1696.59 -> of us bringing the data
1698.06 -> and you guys bringing your expertise.
1699.44 -> So I'll hand it over to you. Thank you.
1703.644 -> - Thank you, Niraj.
1704.477 -> Thank you for being here with us today.
1706.13 -> It was really a high honor
1707.18 -> to work with you and your team.
1709.7 -> When we had our first call
1711.2 -> with Niraj and his team at Taco Bell,
1713.6 -> they had a very clear mission
1714.74 -> on something that they
wanted to accomplish,
1716.45 -> the change that they wanted to instrument
1717.98 -> in their business,
1719.12 -> and they were also very dedicated
1720.77 -> in making that happen quickly.
1724.01 -> And we weren't racing,
1726.05 -> but as it turns out,
1726.92 -> in two and a half weeks
1727.76 -> we had two production
forecast operations working,
1730.64 -> all fully automatic,
1732.35 -> and then they bolted onto that later on,
1734.06 -> the messaging that Niraj described.
1737.12 -> So I need to go forward to the next slide.
1741.26 -> I was focused on the audio working.
1744.17 -> So, time to market is really important
1746.09 -> for companies that wanna optimize.
1748.55 -> You want to start creating
better business value,
1751.1 -> you wanna start serving
your customers better.
1753.5 -> And the thing is,
1754.333 -> you don't wanna wait
quarters for that to happen.
1756.14 -> You don't want to wait perhaps
even months or weeks
1758.21 -> for that to happen.
1759.043 -> So I think Niraj's story has shown
1761.3 -> evidence that it is possible
1762.86 -> to make a change quickly
1764.21 -> in an organization,
1766.16 -> even in a large
organization like Taco Bell.
1771.29 -> And the reasons that we're able
1772.91 -> to make things happen
with a quick time to market
1775.01 -> are based on two dimensions.
1776.6 -> The first dimension
1778.1 -> is the Amazon Forecast service itself.
1781.22 -> And Brandon articulated
1782.51 -> some of these details earlier,
1784.07 -> handling all the phases
1785.21 -> of the ML life cycle
1786.8 -> from data preparation in the beginning
1789.41 -> to splitting the data.
1791 -> But really one of the hallmarks here
1792.35 -> is in the science and the R&D
1793.79 -> that comes from having
1794.63 -> multiple machine learning models,
1797.15 -> because the reality is
that the demand patterns
1799.1 -> across time series differ.
1800.69 -> Some are volatile,
1802.07 -> some have more history than others,
1804.38 -> some have more seasonality.
1806.03 -> So it's very unlikely that one model
1808.55 -> is gonna fit best all the time series.
1812.3 -> And then finally the service
1813.83 -> is able to handle the
infrastructure management.
1819.38 -> So from a developer perspective,
1822.11 -> or from an expert ML perspective,
1824.54 -> no matter where you are on the spectrum
1826.16 -> in terms of persona,
1827.66 -> when you want to build a model,
1828.83 -> you simply make an API call,
1831.14 -> and we'll look at that here.
1833.03 -> And when you make the API call,
1835.1 -> behind the scenes,
1835.97 -> Amazon Forecast is already looking
1837.5 -> at descriptive statistics
1839.33 -> to understand the size of the data
1840.92 -> and the distribution of the data,
1842.57 -> and it spins up a cluster
that's properly sized
1845.54 -> to accommodate your data.
1846.95 -> So that decision point about,
1848.267 -> did I build a cluster too large
1849.86 -> and wasteful in resources,
1851.9 -> or was the cluster too small
1853.19 -> and run out of memory,
1854.36 -> that decisioning now is made automatic.
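[Editor's note: the single API call described above can be sketched as follows. The predictor name, dataset group ARN, and parameter values are hypothetical placeholders for illustration; with real AWS credentials, a payload like this would be passed to `boto3.client("forecast").create_auto_predictor(**request)`.]

```python
# A minimal sketch of the one API call that kicks off training.
# All names and values below are invented for illustration.
request = {
    "PredictorName": "store_demand_predictor",  # hypothetical name
    "ForecastHorizon": 14,                      # predict 14 steps ahead
    "ForecastFrequency": "H",                   # hourly granularity
    "ForecastTypes": ["0.50", "0.90"],          # quantiles to produce
    "DataConfig": {
        # hypothetical ARN pointing at the imported historical data
        "DatasetGroupArn": "arn:aws:forecast:us-east-1:123456789012:"
                           "dataset-group/store_demand"
    },
}

# No cluster sizing and no model selection appear here: the service
# infers all of that from the data once the call is made.
print(sorted(request))
```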
1857.75 -> And the first cluster,
1858.68 -> I think Brandon mentioned,
1859.58 -> we can launch as many as 20 of these
1861.77 -> to do experimentation,
1863.48 -> lots of kinds of models.
1864.59 -> But the first one here,
1865.79 -> maybe that's an ARIMA model
1867.05 -> or a statistical model.
1868.94 -> And then we'll have also
1869.9 -> convolutional neural network models
1871.61 -> with different
hyperparameter optimization
1873.86 -> and different back testing windows.
1875.63 -> But the thing is that
1876.463 -> Amazon Forecast service
orchestrates all of these,
1878.66 -> spinning up the clusters,
1880.16 -> waiting for them to complete.
1881.93 -> There's obviously back testing data
1883.43 -> in each of the clusters,
1884.84 -> so new accuracy metrics
1886.19 -> on every individual time series.
1890 -> And then there's an ensembling phase.
1891.86 -> So for every single time
series in the data set,
1894.41 -> in this hypothetical case,
1896.93 -> with item and location, for example,
1899.6 -> item number one might be
in a statistical model,
1902.99 -> item number two might benefit
1904.79 -> from a neural network model,
1906.59 -> and then item three might be
1907.79 -> a 50-50 blend of the two.
1910.82 -> But the idea is that after ensembling,
1912.89 -> there's a unique recipe created
1914.27 -> for every single time series.
1915.997 -> In the Taco Bell case,
1917.18 -> that would be by store,
by day part, by hour,
1920.51 -> and then by delivery
channel, just as an example.
1923.45 -> And then that model itself is saved
1925.13 -> as a graph artifact
1926.51 -> that can then be called for inference.
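[Editor's note: the per-series ensembling idea described above can be illustrated with a toy example. The weights here are invented; Amazon Forecast chooses its actual per-series recipes internally from backtest accuracy.]

```python
# Toy illustration of per-time-series ensembling. Forecast values
# and blend weights are made up for this example.
stat_forecast = {"item1": 120.0, "item2": 80.0, "item3": 100.0}
nn_forecast   = {"item1": 110.0, "item2": 95.0, "item3": 90.0}

# Hypothetical recipes: (statistical weight, neural-network weight)
recipes = {
    "item1": (1.0, 0.0),   # statistical model wins outright
    "item2": (0.0, 1.0),   # neural network wins outright
    "item3": (0.5, 0.5),   # 50-50 blend of the two
}

# Each series gets its own blended prediction from its own recipe.
ensembled = {
    item: w_stat * stat_forecast[item] + w_nn * nn_forecast[item]
    for item, (w_stat, w_nn) in recipes.items()
}
print(ensembled)
```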
1930.68 -> So normally, instead
of having to go through
1932.48 -> and spend weeks and weeks
1933.56 -> and iterate and do hyperparameter
1935.21 -> tuning and building,
1936.08 -> which is sometimes common,
1937.88 -> Amazon Forecast is making
all of this possible
1940.148 -> generally in about a day.
1941.75 -> So in the case with Taco Bell,
1943.79 -> we did complete that cycle
1945.35 -> in two and a half weeks,
1946.67 -> but they were able to do a POC
1948.26 -> within I think just a day,
1951.38 -> and the statistics were there,
1952.73 -> and then the team had to go back
1953.84 -> and they were able to look at it.
1955.76 -> But that does mean
1956.593 -> at the end of a couple days there,
1958.34 -> there was some stats
1959.6 -> without having to go through
1960.5 -> and build all the models
1962.06 -> and all the infrastructure management.
1966.74 -> So continuing on,
1967.79 -> in addition to Amazon Forecast,
1969.32 -> there are some other
complementary AWS services
1971.72 -> that make this possible.
1973.31 -> One of them is AWS CloudFormation.
1975.38 -> So think of CloudFormation
1976.79 -> as an infrastructure service that provides
1980.78 -> repeatable CloudFormation templates.
1983.42 -> And that way, there's consistency
1984.83 -> with the deployment.
1986.09 -> So I've got a QR code that
1987.26 -> we'll be sharing with you
1988.31 -> for a GitHub site.
1989.6 -> It provides a YAML file,
1990.8 -> and this is what Taco Bell used,
1992.39 -> to spin up all the cloud infrastructure
1994.1 -> to support the workflow.
1997.37 -> It'll do things like create S3 buckets
1999.35 -> and create reliable and
consistent permissions.
2002.29 -> And one of the next items
I'll be talking about
2004.75 -> is AWS Step Functions.
2005.98 -> And if you saw Dr. Vogels today,
2009.16 -> he talked about asynchronous calls
2011.17 -> and about orchestration
2012.91 -> with Step Functions and EventBridge.
2018.741 -> So here, we're looking at the importance
2021.04 -> of using Step Functions to orchestrate.
2023.2 -> So, CloudFormation did the deployment,
2025.09 -> and then the orchestration
of the execution
2027.25 -> happens with Step Functions.
2031.57 -> So for example,
2032.403 -> Step Functions are able to do things
2033.76 -> like have a process
2035.14 -> that goes from A to B, as shown here.
2037.09 -> So, importing data to Amazon Forecast,
2040.57 -> that's an API call.
2041.92 -> And what happens is underneath,
2043.51 -> there's a cluster provision to import data
2045.76 -> from S3 into Amazon Forecast.
2048.22 -> And that is not a synchronous call
2049.93 -> but an asynchronous call.
2051.46 -> So while that process is running,
2053.35 -> the Step Functions are then
2054.52 -> monitoring the process to see
2055.87 -> when the import is completed.
2058.9 -> And then only after the
import is completed,
2060.7 -> then it moves forward
to creating the model.
2062.65 -> And that's the slide
that we just looked at,
2064.33 -> where you see the clusters
2065.26 -> being built for training, and so forth.
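[Editor's note: the monitor-then-advance pattern described above can be sketched in a few lines. The `describe_status` stub below stands in for a real Amazon Forecast Describe* API call (for example, DescribeDatasetImportJob); function names and status strings are illustrative.]

```python
def make_describe_stub(statuses):
    """Return a callable that yields each status in turn, like repeated polls."""
    it = iter(statuses)
    return lambda: next(it)

def wait_until_active(describe_status, max_polls=10):
    """Poll until the resource reports ACTIVE, then let the next step run."""
    for polls in range(1, max_polls + 1):
        status = describe_status()
        if status == "ACTIVE":
            return polls  # import finished; move on to model creation
        if status.startswith("CREATE_FAILED"):
            raise RuntimeError("import failed")
        # In the real workflow, a Step Functions Wait state sleeps here
        # between polls instead of a busy loop.
    raise TimeoutError("gave up waiting")

# Simulated polling: pending, in progress twice, then active.
describe = make_describe_stub(
    ["CREATE_PENDING", "CREATE_IN_PROGRESS", "CREATE_IN_PROGRESS", "ACTIVE"]
)
print(wait_until_active(describe))  # completes on the fourth poll
```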
2071.38 -> So when the CloudFormation
template is deployed,
2073.48 -> one of the things it's doing
2074.77 -> is creating those Step Functions,
2076.48 -> and the Step Functions also have
2078.13 -> a workflow built for them.
2080.74 -> And we looked at the data lake;
2084.19 -> as Niraj mentioned with Taco Bell,
2085.93 -> their data resided in an S3 data lake.
2088.78 -> So in terms of orchestrating the workflow
2090.58 -> from end to end,
2091.9 -> and again, this is part
2093.04 -> of the quick time to market,
2094.93 -> the first part that is
already delivered there,
2097 -> and this is available in our GitHub site,
2099.01 -> is the Athena connector.
2101.41 -> So what that means is
you put in your metadata,
2103.12 -> you put in the SQL statement,
2104.74 -> and that'll go get your historical data
2106.51 -> where that resides at rest.
2108.31 -> And it can be in different
heterogeneous databases.
2112 -> And this is the thing,
2112.833 -> like the historical demand
2114.73 -> and item metadata and others.
2118 -> And when those queries complete,
2119.86 -> that lands this data set out on S3
2122.08 -> as different objects.
2123.61 -> So this is used for model training,
2125.38 -> but it's also used to
produce model inference.
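[Editor's note: a sketch of what the Athena connector step assembles, namely a SQL statement over the data lake plus an S3 landing spot for the results. The table name, bucket, and columns are invented for illustration; with credentials, a payload like this maps onto `boto3.client("athena").start_query_execution(...)`.]

```python
# Hypothetical query over historical demand data at rest in the lake.
sql = """
SELECT store_id, channel, order_hour, SUM(orders) AS demand
FROM digital_orders          -- hypothetical table in the S3 data lake
WHERE order_date >= date_add('day', -365, current_date)
GROUP BY store_id, channel, order_hour
""".strip()

query_request = {
    "QueryString": sql,
    "ResultConfiguration": {
        # Query results land on S3 as objects that the import step
        # then loads into Amazon Forecast (bucket name is invented).
        "OutputLocation": "s3://example-forecast-bucket/historical-demand/"
    },
}

print("demand" in query_request["QueryString"])
```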
2131.53 -> Next along this workflow
with the step functions
2134.74 -> is the import process.
2136.72 -> And so the way this works is
2138.43 -> as soon as the Athena connector completes,
2140.77 -> the Step Function starts the import
2143.41 -> without any delays.
2145.84 -> It's all serverless,
2146.86 -> and you don't have to worry about
2147.85 -> building servers that are gonna manage
2150.04 -> and sleep and wait and so forth.
2152.53 -> And that will start the process
2153.85 -> to import data into Amazon Forecast.
2158.44 -> And that process may
take minutes or longer
2160.84 -> depending on the size of the data,
2162.52 -> but when it's completed,
2163.84 -> the next part is it will
signal that it's completed
2166.72 -> and then start the process
2167.89 -> to build the predictor.
2170.59 -> So think about this as
that trained ML model.
2174.033 -> And the predictor itself,
2175.09 -> this is the part where it's gonna
2176.11 -> spin up the clusters,
as many as 20 of them.
2178.54 -> And the predictor training
2179.47 -> can take minutes to several hours
2181.78 -> depending on the size
and scale of the data.
2184.51 -> And that maybe, again,
2185.65 -> so think of that as a
long learning process.
2187.87 -> But what the Step Function is gonna do
2189.07 -> is monitor that for completion.
2191.11 -> And when it is completed,
2192.67 -> it's also gonna export the data
2194.17 -> from that predictor to S3.
2196.48 -> What that means is you'll have
2197.53 -> time series statistics available on S3
2200.86 -> and be able to look at
the spread of accuracy
2202.66 -> across different time series.
2205.84 -> That's also gonna produce a saved model.
2207.997 -> And that saved model is used next
2210.31 -> when Amazon Forecast
2211.75 -> hits the next Step Function,
2213.22 -> which is gonna produce the predictions.
2215.59 -> So it's at this point that the API call
2217.99 -> is gonna be made,
2219.04 -> and it's gonna take the
data that's been imported
2221.83 -> and it's gonna pull it
through the saved model,
2223.57 -> the named model,
2224.403 -> the existing graph,
2226.15 -> to produce future predictions.
2230.41 -> And then the future predictions
2231.67 -> are gonna land on S3.
2233.23 -> They're simple CSV files.
2234.85 -> They can also be Parquet files.
2236.89 -> So this is gonna be at the
item level by time step
2239.92 -> and also for the different
quantiles you choose
2241.72 -> as a business.
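[Editor's note: a toy parse of the kind of export file described above. The column layout is a plausible example, not the exact schema of a given export: one row per item, location, and time step, with one column per chosen quantile.]

```python
import csv
import io

# Invented sample rows standing in for a forecast export CSV on S3.
sample_export = """\
item_id,location,date,p50,p90
burrito,store_0042,2022-12-01T11:00:00Z,118.2,160.5
burrito,store_0042,2022-12-01T12:00:00Z,131.7,175.0
"""

rows = list(csv.DictReader(io.StringIO(sample_export)))

# Each row carries the chosen quantiles for one time step, so a business
# can plan to the median (p50) or to a safety level (p90).
p90_noon = float(rows[1]["p90"])
print(p90_noon)
```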
2243.97 -> And most of our customers
want to take that data
2245.89 -> and then orchestrate it and deliver it
2247.63 -> to many different target systems.
2250.03 -> These include different low
latency databases for serving.
2253.6 -> They may also include
different BI platforms.
2256.09 -> They can also be treated as events
2257.65 -> that get consumed and processed
2259.3 -> into different ERP systems,
2261.13 -> sharing with partners in
your supply chain network,
2264.19 -> and the list goes on.
2270.761 -> All right. I think in
terms of the takeaways
2274.33 -> that we would like just to highlight here
2275.77 -> would be that, as Brandon
said at the beginning,
2278.29 -> Amazon Forecast is a purpose-built
2280.103 -> time series forecasting service.
2283.12 -> It has all the state of the art
2284.56 -> time series models,
2286 -> a rich model library,
2287.59 -> and we continue to
iterate and build on this.
2289.66 -> In fact, cold-start support was just added
2291.76 -> in a model update released a couple weeks ago.
2295.42 -> And then Niraj talked about how Taco Bell
2297.52 -> was able to instrument
production workflows
2300.01 -> in a short time to market,
2301.39 -> and he also described the value prop
2303.01 -> that that's provided his business
2305.2 -> and his business stakeholders.
2307.27 -> And then finally,
2308.103 -> I showed you an introduction:
2310.6 -> first, CloudFormation
2313.27 -> as a way to deploy,
2314.44 -> and then secondly,
2315.273 -> Step Functions as a way to orchestrate.
2322.81 -> And then finally here, if you'd like,
2324.16 -> this is the QR code
2325.33 -> that will lead you to our GitHub site,
2327.4 -> that provides essentially
2329.02 -> a README with a set of directions.
2331.09 -> There's also a YAML file available
2333.19 -> that will deploy the CloudFormation stack,
2335.44 -> and you can customize that
2336.55 -> to your schema and your data.
2340.27 -> So you're welcome to take that.
2341.53 -> You could also do a web search
2342.7 -> for "AWS forecast samples"
2345.354 -> on GitHub as well.
2347.89 -> So you would actually do
this at your own pace,
2351.07 -> but if you have any questions about this
2352.69 -> or need any support,
2354.22 -> please do reach out to
your AWS account manager
2357.04 -> or your solution architects
2358.21 -> that you may be working with,
2359.62 -> and we're happy to help,
2362.83 -> and continue the conversation with you.
2365.47 -> And that's all I had.
2366.303 -> We've got some time,
2367.136 -> but I'd like to invite
my colleagues up here,