Thank you.

For this upcoming stochastic optimizer, could you describe the input data and parameters required for the optimization to run?

Thank you. Your previous answer is also relevant - it helped me understand more about the actionrwd function as a whole.

Are you able to provide a coding example of how to do this (or of how Lokad typically overcomes this limitation in real-world situations), for both the order multiplier and the MOQ? Not every product has an order multiplier or MOQ requirement.

And, similarly, if actionrwd is not the solution for this type of situation, how do you overcome it in general? Coding examples would be very helpful here as well.

Thank you.

Thank you. I apologize if my earlier question was not clear to begin with.

To clarify, I was asking from the perspective of the MOQ on my customers' side, i.e. when the customers have an MOQ or order multiples when they purchase this item from me (due to logistics constraints or economies of scale on transportation cost). In other words, I do not impose an MOQ or order multiple on my customers; they have instituted this requirement themselves because of transportation efficiency and constraints.

How do I take these factors into account when generating a forecast? Currently, the forecasts are generated using the actionrwd.demand function. However, there are no parameters to account for the MOQ or the order multiples, and the smoothed demand and the forecast always end up badly under-fitted for products with these requirements.
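To make the question concrete, one naive normalization I have been considering on my side (purely my own idea, not something Lokad has suggested; Items.OrderMultiple is a hypothetical field) would be to forecast in lots rather than units:

```envision
// Sketch only: Items.OrderMultiple is an assumed field holding the customer's
// order multiple. Rescale the observed weekly demand into "lots" before the
// smoothing / actionrwd.demand step, then scale the resulting baseline back.
ItemsWeek.DemandLots = ItemsWeek.DemandQty / Items.OrderMultiple
// ... fit the smoothed demand and baseline on ItemsWeek.DemandLots ...
// ItemsWeek.Baseline = ItemsWeek.BaselineLots * Items.OrderMultiple
```

I am not sure this is sound, which is why I am asking how Lokad usually handles it.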

I hope this is clear.

s40racer 9 months | on: Implement Forecast at Monthly level

I could also use some guidance, from a coding perspective, on how to change from a weekly to a daily implementation.

s40racer 9 months | on: Implement Forecast at Monthly level

Thank you.
Would you mind elaborating a bit more on the day-of-week multiplier concept you mentioned above?
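In case it helps, here is how I currently picture the concept, as a rough sketch on my side (the Dow table, DayOfWeek, and Multiplier names are my own assumptions; Sales.DeliveryQty comes from my existing scripts):

```envision
// Measure the historical share of a week's demand that falls on each weekday.
Sales.DayOfWeek = Sales.Date - monday(Sales.Date) // 0 = Monday .. 6 = Sunday
table Dow[DayOfWeek] = by Sales.DayOfWeek
Dow.Qty = sum(Sales.DeliveryQty)
totalQty = sum(Sales.DeliveryQty)
Dow.Multiplier = Dow.Qty / totalQty
// My understanding: the daily forecast would be the weekly baseline multiplied
// by Dow.Multiplier for the matching day of the week.
```

Is that roughly the idea, or am I missing something?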

s40racer 12 months | on: Forecast Analysis - Forecast Quality

Now I am running into another issue. The code below follows on from what I posted initially.
```envision
// ///Export

quantileLow1 = 0.3
quantileLow2 = 0.05
quantileHigh1 = 0.7
quantileHigh2 = 0.95

ItemsWeek.One = dirac(1)
ItemsWeek.Demand = dirac(0)

where ItemsWeek.FutureWeekRank > 0
  ItemsWeek.Demand = actionrwd.segment(
    TimeIndex: ItemsWeek.FutureWeekRank
    BaseLine: ItemsWeek.Baseline
    Dispersion: Items.Dispersion
    Alpha: 0.05
    Start: dirac(ItemsWeek.FutureWeekRank - 1)
    Duration: ItemsWeek.One
    Samples: 1500)

// ////BackTest Demand

keep where min(ItemsWeek.Baseline) when (ItemsWeek.Baseline > 0) by Items.Sku >= 1

ItemsWeek.One = dirac(1)
ItemsWeek.BacktestForecastWeekRank = 0
where ItemsWeek.IsPast
  ItemsWeek.BacktestForecastWeekRank = rank() by Items.Sku scan -ItemsWeek.Monday

keep where ItemsWeek.BacktestForecastWeekRank > 0 and ItemsWeek.BacktestForecastWeekRank < 371

where ItemsWeek.BacktestForecastWeekRank > 0
  ItemsWeek.BackTestDemand = actionrwd.segment(
    TimeIndex: ItemsWeek.BacktestForecastWeekRank
    BaseLine: ItemsWeek.Baseline
    Dispersion: Items.Dispersion
    Alpha: 0.05
    Start: dirac(ItemsWeek.BacktestForecastWeekRank - 1)
    Duration: ItemsWeek.One
    Samples: 1500)
```
I have no issue with the forward-looking forecast. I do, however, have an issue with the backward-looking backtest, specifically with BacktestForecastWeekRank: it grows to 790 days, which is greater than what actionrwd can allow (365 days). The data set I have goes back to 2018. Would this be the cause?
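If the 365-day limit indeed applies to the time index passed to actionrwd, one thing I could try on my side (just a guess, assuming roughly 52 weekly ranks keep the horizon within a year) is tightening the keep filter:

```envision
// Assumed workaround: cap the backtest at roughly one year of weekly ranks.
keep where ItemsWeek.BacktestForecastWeekRank > 0 and ItemsWeek.BacktestForecastWeekRank <= 52
```

Would that be the right way to stay within the limit, or is there a better approach?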

s40racer 13 months | on: Forecast Analysis - Forecast Quality

Thank you. I resolved the issue above by using
```envision
keep where Items.Sku in ForecastProduit.Sku
```
to make sure the SKUs in the Items table match the SKUs in the ForecastProduit table.

s40racer 13 months | on: Forecast Analysis - Forecast Quality

I created an output table to see the values in the Items and ItemsWeek tables:

```envision
today = max(Sales.Date)
todayForecast = monday(today) + 7

Items.Amount365 = sum(Sales.LokadNetAmount) when (Date >= today - 365)
Items.Q365 = sum(Sales.DeliveryQty) when (Date >= today - 365)
Items.DisplayRank = rank() scan Items.Q365

table ItemsWeek = cross(Items, Week)
ItemsWeek.Monday = monday(ItemsWeek.Week)
ItemsWeek.IsPast = single(ForecastProduit.IsPast) by [ForecastProduit.Sku,ForecastProduit.Date] at [Items.Sku,ItemsWeek.Monday]
ItemsWeek.Baseline = single(ForecastProduit.Baseline) by [ForecastProduit.Sku,ForecastProduit.Date] at [Items.Sku,ItemsWeek.Monday]
ItemsWeek.DemandQty = single(ForecastProduit.DemandQty) by [ForecastProduit.Sku,ForecastProduit.Date] at [Items.Sku,ItemsWeek.Monday]
ItemsWeek.SmoothedDemandQty = single(ForecastProduit.SmoothedDemandQty) by [ForecastProduit.Sku,ForecastProduit.Date] at [Items.Sku,ItemsWeek.Monday]
ItemsWeek.FutureWeekRank = single(ForecastProduit.FutureWeekRank) by [ForecastProduit.Sku,ForecastProduit.Date] at [Items.Sku,ItemsWeek.Monday]
Items.Dispersion = same(ForecastProduit.Dispersion)

show table "items" with
  today
  todayForecast
  Items.Amount365 
  Items.Q365 
  Items.DisplayRank 
  ItemsWeek.Monday 
  ItemsWeek.IsPast 
  ItemsWeek.Baseline 
  ItemsWeek.DemandQty 
  ItemsWeek.SmoothedDemandQty 
  ItemsWeek.FutureWeekRank 
  Items.Dispersion 

show table "forecastproductit" with
  ForecastProduit.date
  ForecastProduit.Sku
  ForecastProduit.DemandQty
  ForecastProduit.Baseline
  ForecastProduit.Dispersion
```

and confirmed that quite a few rows have a dispersion value of 0, even though this is not the case in the ForecastProduit table (as verified from the output above). Any suggestions on what may cause the dispersion value to become 0?
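One check I thought of running (just my guess at the cause: I assume that a SKU present in Items but absent from ForecastProduit would fall back to the default value of 0 when the dispersion is broadcast) would be:

```envision
// Diagnostic sketch: list SKUs that exist in Items but have no ForecastProduit rows.
Items.HasForecast = Items.Sku in ForecastProduit.Sku
where not Items.HasForecast
  show table "Items without ForecastProduit rows" with Items.Sku
```

Does that sound like a plausible explanation?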

Continuing from the previous comment: for the same SKU, the values for SeasonalityModel, Profile1, and Level changed between two runs on different days. I am unsure what caused the change in these values; the input data remained the same.

Code before dispersion:

```envision
ItemsWeek.ItemLife = 1
ItemsWeek.CumSumMinusOneExt = 0
where ItemsWeek.Monday >= firstDate
  ItemsWeek.CumSumMinusOneExt = (sum(ItemsWeek.ItemLife) by ItemsWeek.Sku scan ItemsWeek.Week) - 1
ItemsWeek.CumSumMinusOneExtMonth = ceiling(ItemsWeek.CumSumMinusOneExt / 8)
ItemsWeek.WeekNum = rank() by Items.Sku scan -monday(Week)
nbWeeks = same(ItemsWeek.WeekNum) when(ItemsWeek.Week == week(today()))
ItemsWeek.ItemLifeWeight = 0.3 + 1.2*(ItemsWeek.WeekNum/nbWeeks)^(1/3) 
ItemsWeek.IsCache = ItemsWeek.Monday >= firstDate and ItemsWeek.Monday < today
ItemsWeek.Cache = if ItemsWeek.IsCache then 1 else 0
expect table Items max 30000
expect table ItemsWeek max 5m
table YearWeek[YearWeek] = by ((Week.Week - week(firstDate)) mod 52)
Items.SeasonalityGroup = Items.Category
table Groups[SeasonalityGroup] = by Items.SeasonalityGroup
table SeasonYW max 1m = cross(Groups, YearWeek)
Items.Level = avg(ItemsWeek.DemandQty) when(ItemsWeek.Monday >= today - 365 and ItemsWeek.Monday < today)
Items.Level = if Items.Level == 0 then -10 else
              if log(Items.Level) < -10 then - 10 else
              if log(Items.Level) > 10 then 10 else
              log(Items.Level)
maxEpochs = 1000
autodiff Items epochs:maxEpochs learningRate:0.01 with
  params Items.Affinity1 in [0..] auto(0.5, 0.166)
  params Items.Affinity2 in [0..] auto(0.5, 0.166)
  params Items.Level in [-10..10]
  params Items.LevelShift in [-0.5..0.5] auto(0, 0)
  params SeasonYW.Profile1  in [0..1] auto(0.5, 0.1)
  params SeasonYW.Profile2  in [0..1] auto(0.5, 0.1)
  SumAffinity =
    Items.Affinity1 +Items.Affinity2
  YearWeek.SeasonalityModel = SeasonYW.Profile1  * Items.Affinity1 +SeasonYW.Profile2  * Items.Affinity2
  Week.LinearTrend = ItemsWeek.Cache + (ItemsWeek.CumSumMinusOneExtMonth * ItemsWeek.Cache * Items.LevelShift / 10)
  Week.Baseline = exp(Items.Level) * YearWeek.SeasonalityModel * ItemsWeek.Cache *  Week.LinearTrend
  Week.Coeff =  ItemsWeek.ItemLifeWeight
  Week.DeltaSquare = (Week.Baseline - ItemsWeek.SmoothedDemandQty) ^ 2

  Sum = sum(Week.Coeff * Week.DeltaSquare) / 10000
  SumPowAffinity = (Items.Affinity1^2 + Items.Affinity2^2) / (SumAffinity^2)
  // Core loss function
  return ((1 + Sum) / SumPowAffinity)
table ItemsYW = cross(Items, YearWeek)
ItemsYW.SeasonalityGroup = Items.SeasonalityGroup
ItemsWeek.YearWeek = Week.YearWeek
ItemsYW.Profile1 = SeasonYW.Profile1
ItemsYW.Profile2 = SeasonYW.Profile2
ItemsYW.SeasonalityModel = ItemsYW.Profile1  * Items.Affinity1 +
                            ItemsYW.Profile2  * Items.Affinity2
ItemsWeek.LinearTrend =  max(0, 1 + (ItemsWeek.CumSumMinusOneExtMonth * Items.LevelShift/ 10))
ItemsWeek.Baseline = exp(Items.Level) * ItemsYW.SeasonalityModel * ItemsWeek.LinearTrend
```
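For my own clarity, my reading of the model fitted above is: Baseline = exp(Level) * (Affinity1 * Profile1 + Affinity2 * Profile2) * LinearTrend, with the two profiles shared per seasonality group (SeasonYW) and the level, affinities, and trend shift learned per item. Please correct me if that interpretation is wrong.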

Thank you ToLok.

Are you able to modify the code, or give a more explicit example of how to implement it at the SKU level? From a data standpoint, I assume the following fields need to exist in the Items, PO, and Vendors tables in order to implement the SKU-level code: item #, destination location, and supplier ID?

Currently, the partner data has not been updated to such a structure. Only the Items table (Item Master) has the item #, supplier ID, and destination location. If the data structure noted above is needed to implement the SKU-level code, I can make sure this is done.
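In case it helps make my assumption concrete, this is roughly the structure I have in mind (a sketch on my side only; the path and field names are illustrative, not our actual files):

```envision
// Illustrative only: assumed minimal fields for the SKU-level setup.
read "/clean/tmp/Items.ion" as Items with
  Item : text      // item number
  Location : text  // destination location
  Supplier : text  // supplier ID
```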

Thank you.

Thank you for the guidance. I am asking more from the code standpoint. The data provides lead times at the item-location level. I am thinking the easiest approach is to bring that data from the Items table into the Vendors table so as to reuse the existing code.
With the existing code, I assume I need to add a location variable to the file, so that it looks something like:

Original:

```envision
read "/clean/tmp/Suppliers.ion" as Suppliers with
  Supplier : text
  Leadtime : number
```

Updated:

```envision
read "/clean/tmp/Suppliers.ion" as Suppliers with
  Supplier : text
  Location : text
  Leadtime : number
```

Then, in any subsequent joins or filters, I will need to add a location filter. How would I update the following code to account for the location-specific lead time?

```envision
/// Possible SLT layers, depending on how many data points can be found in the dataset
Items.SLT_ItemLevel = ranvar(PO.DeliveryDelay) when PO.IsClosed
Items.SLT_SupplierAndCategoryLevel = ranvar(PO.DeliveryDelay) by [Items.Supplier, Items.Category] when PO.IsClosed
Items.SLT_SupplierLevel = ranvar(PO.DeliveryDelay) by [Items.Supplier] when PO.IsClosed
Items.AnnouncedSLTValue = same(Suppliers.Leadtime) by Suppliers.Supplier at Items.Supplier
```

Taking the last line as an example, would it look something like this?

```envision
Items.AnnouncedSLTValue = same(Suppliers.Leadtime) by [Suppliers.Supplier, Suppliers.Location] at [Items.Supplier, Items.Location]
```
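And, extending the same idea to the ranvar layers (my assumption; SLT_SupplierAndLocationLevel is a name I am making up here), I suppose the location would simply be added to the grouping keys:

```envision
// Assumed extension: group the delivery delays by supplier and destination location.
Items.SLT_SupplierAndLocationLevel = ranvar(PO.DeliveryDelay) by [Items.Supplier, Items.Location] when PO.IsClosed
```

Is that the right direction?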