Hello,
I have a variation of a forecast analysis where a demand forecast at the SKU level has been generated and stored in a file called ForecastProduit. The data in ForecastProduit is read into this implementation to check the quality of the forecast. ForecastProduit contains columns such as SKU, dispersion, alpha, seasonality group, demand quantity, etc.
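For context, ForecastProduit is read with something along these lines (simplified here; the file path and the column types are only indicative, not the actual read block):

```envision
// Indicative read block for the ForecastProduit file (path and types are approximate).
read "/ForecastProduit.csv" as ForecastProduit with
  Sku : text
  Date : date
  IsPast : boolean
  Baseline : number
  DemandQty : number
  SmoothedDemandQty : number
  FutureWeekRank : number
  Dispersion : number
  Alpha : number
  SeasonalityGroup : text
```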
I verified that the minimum dispersion value in ForecastProduit is 1. However, when running the code below, I get an error stating that the dispersion value is 0 in the actionrwd.segment function.
```envision
today = max(Sales.Date)
todayForecast = monday(today) + 7

Items.Amount365 = sum(Sales.LokadNetAmount) when (Date >= today - 365)
Items.Q365 = sum(Sales.DeliveryQty) when (Date >= today - 365)
Items.DisplayRank = rank() scan Items.Q365

table ItemsWeek = cross(Items, Week)
ItemsWeek.Monday = monday(ItemsWeek.Week)

ItemsWeek.IsPast = single(ForecastProduit.IsPast) by [ForecastProduit.Sku, ForecastProduit.Date] at [Items.Sku, ItemsWeek.Monday]
ItemsWeek.Baseline = single(ForecastProduit.Baseline) by [ForecastProduit.Sku, ForecastProduit.Date] at [Items.Sku, ItemsWeek.Monday]
ItemsWeek.DemandQty = single(ForecastProduit.DemandQty) by [ForecastProduit.Sku, ForecastProduit.Date] at [Items.Sku, ItemsWeek.Monday]
ItemsWeek.SmoothedDemandQty = single(ForecastProduit.SmoothedDemandQty) by [ForecastProduit.Sku, ForecastProduit.Date] at [Items.Sku, ItemsWeek.Monday]
ItemsWeek.FutureWeekRank = single(ForecastProduit.FutureWeekRank) by [ForecastProduit.Sku, ForecastProduit.Date] at [Items.Sku, ItemsWeek.Monday]

Items.Dispersion = same(ForecastProduit.Dispersion)

quantileLow1 = 0.3
quantileLow2 = 0.05
quantileHigh1 = 0.7
quantileHigh2 = 0.95

ItemsWeek.One = dirac(1)
ItemsWeek.Demand = dirac(0)
where ItemsWeek.FutureWeekRank > 0
  ItemsWeek.Demand = actionrwd.segment(
    TimeIndex: ItemsWeek.FutureWeekRank
    BaseLine: ItemsWeek.Baseline
    Dispersion: Items.Dispersion
    Alpha: 0.05
    Start: dirac(ItemsWeek.FutureWeekRank - 1)
    Duration: ItemsWeek.One
    Samples: 1500)
```
I output a table to inspect the values in the Items and ItemsWeek tables, and confirmed that quite a few rows have a dispersion value of 0, even though this is not the case in the ForecastProduit table (as verified from the output above). Any suggestions on what may cause the dispersion value to become 0?
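The check I ran was roughly along these lines (tile position is arbitrary):

```envision
// List the items whose joined dispersion ends up at 0.
where Items.Dispersion == 0
  show table "Items with zero dispersion" a1c6 with
    Items.Sku
    Items.Dispersion
    Items.Q365
```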
I suspect it is the behavior of the `same` aggregator when facing an empty set, which defaults to zero; see the snippet below. Try it at https://try.lokad.com/s/same-defaults-to-zero
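In case the link goes stale, here is a minimal sketch of the behavior (table and column names are made up for illustration):

```envision
// 'same' over an empty group falls back to its default value, which is 0 for numbers.
table Products = with
  [| as Sku |]
  [| "A" |]
  [| "B" |]

table Forecast = with
  [| as Sku, as Dispersion |]
  [| "A", 1.3 |]

// "B" has no matching row in Forecast, so 'same' yields 0 for it.
Products.Dispersion = same(Forecast.Dispersion) by Forecast.Sku at Products.Sku
show table "Dispersion per Sku" a1b3 with Products.Sku, Products.Dispersion
```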
Hope it helps.
Thank you. I resolved the issue above by using
```envision
keep where Items.Sku in ForecastProduit.Sku
```
to make sure the SKUs in the Items table match the SKUs in the ForecastProduit table.
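As a quick sanity check before the keep, something like this lists the SKUs that would be dropped (tile position is arbitrary):

```envision
// Items SKUs that have no counterpart row in ForecastProduit.
where not (Items.Sku in ForecastProduit.Sku)
  show table "SKUs absent from ForecastProduit" a1b4 with Items.Sku
```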
Now I encounter another issue. The code below follows what I posted initially.
```envision
// Export
quantileLow1 = 0.3
quantileLow2 = 0.05
quantileHigh1 = 0.7
quantileHigh2 = 0.95

ItemsWeek.One = dirac(1)
ItemsWeek.Demand = dirac(0)
where ItemsWeek.FutureWeekRank > 0
  ItemsWeek.Demand = actionrwd.segment(
    TimeIndex: ItemsWeek.FutureWeekRank
    BaseLine: ItemsWeek.Baseline
    Dispersion: Items.Dispersion
    Alpha: 0.05
    Start: dirac(ItemsWeek.FutureWeekRank - 1)
    Duration: ItemsWeek.One
    Samples: 1500)

// Backtest demand
keep where min(ItemsWeek.Baseline) when (ItemsWeek.Baseline > 0) by Items.Sku >= 1
ItemsWeek.One = dirac(1)
ItemsWeek.BacktestForecastWeekRank = 0
where ItemsWeek.IsPast
  ItemsWeek.BacktestForecastWeekRank = rank() by Items.Sku scan - ItemsWeek.Monday
keep where ItemsWeek.BacktestForecastWeekRank > 0 and ItemsWeek.BacktestForecastWeekRank < 371
where ItemsWeek.BacktestForecastWeekRank > 0
  ItemsWeek.BackTestDemand = actionrwd.segment(
    TimeIndex: ItemsWeek.BacktestForecastWeekRank
    BaseLine: ItemsWeek.Baseline
    Dispersion: Items.Dispersion
    Alpha: 0.05
    Start: dirac(ItemsWeek.BacktestForecastWeekRank - 1)
    Duration: ItemsWeek.One
    Samples: 1500)
```
I have no issue with the forward-looking forecast. I do, however, have an issue with the backward forecast test, specifically with BacktestForecastWeekRank: it grows to 790 days, which is greater than what actionrwd can allow (365 days). The data set I have goes back to 2018. Would this be the cause?
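For reference, a quick way to see how deep the ranks go per SKU, run before the keep filters above, is something like this (tile position is arbitrary):

```envision
// Deepest backtest rank reached for each SKU.
Items.MaxBacktestRank = max(ItemsWeek.BacktestForecastWeekRank)
show table "Deepest backtest rank per SKU" a1c4 with Items.Sku, Items.MaxBacktestRank order by Items.MaxBacktestRank desc
```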