Motivation:
Even the most stable manufacturing processes vary in the quantity of items produced per unit of time. Wouldn't it be interesting to analyze your throughput over time by storing that data outside your ERP in a data warehouse? The SQL below shows you how to get there from an SAP BI point of view. Please note that there is no universally correct and complete solution for this request, since every SAP ERP installation is company-specific. But it should definitely point you in the right direction.
Solution:
If you are looking for throughput times, AFKO is the central SAP table of interest. AFKO (short for the German "Auftragskopf") stands for Order Header Data PP Orders. The order item details are stored in AFPO, and that's where we get the order item number (posnr), the planned order start date (strmp), the planned order delivery date (ltrmp), the material (matnr) and the plant (werks) from:
drop table [etl].[Throughput]
select distinct a.aufnr, a.gstri, a.getri, a.dispo, a.fevor, a.gsuzi, a.geuzi, b.posnr, b.strmp, b.ltrmp, b.matnr, b.werks, a.gltri
into [etl].[Throughput]
from [AFKO] as a
inner join [AFPO] as b
on a.aufnr = b.aufnr
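Since SELECT ... INTO only creates the columns taken from AFKO and AFPO, the derived columns used in the update statements below have to be added first. A minimal sketch, with data types that are assumptions and may need adjusting to your system:
alter table [etl].[Throughput] add
    ActThroughputNet int,                          -- net actual throughput in working days
    PlanThroughput int,                            -- planned throughput in working days
    gsuzitime varchar(8),                          -- actual start time formatted as HH:MM:SS
    geuzitime varchar(8),                          -- confirmed finish time formatted as HH:MM:SS
    ThroughputNetTimeMinutes int,                  -- absolute time-of-day difference in minutes
    ActThroughputNetMeasureHours decimal(10, 2),   -- net actual throughput in hours
    JestStatActive varchar(50),                    -- mapped production order status
    Bereich varchar(50)                            -- department text from T024F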
To calculate the net actual throughput, we measure the distance between the actual start date (gstri) and the confirmed order finish date (getri). Below you'll find an example that only takes working days from Monday through Friday into consideration:
update [etl].[Throughput]
set ActThroughputNet = datediff(dd, gstri, getri) - (datediff(wk, gstri, getri) * 2)
    - case when datepart(dw, gstri) = 1 then 1 else 0 end   -- datepart(dw, ...) = 1 assumes the default DATEFIRST setting (Sunday = 1)
    + case when datepart(dw, getri) = 1 then 1 else 0 end
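If you want to convince yourself that the weekday logic behaves as expected, here is a quick check with literal dates (the dates are only an example, and the weekday check assumes the default DATEFIRST setting):
declare @start date = '2024-01-05', @end date = '2024-01-08'   -- a Friday and the following Monday
select datediff(dd, @start, @end) - (datediff(wk, @start, @end) * 2)
    - case when datepart(dw, @start) = 1 then 1 else 0 end
    + case when datepart(dw, @end) = 1 then 1 else 0 end as WorkingDays   -- returns 1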
As a benchmark we use the same logic again, but this time with strmp (planned start) and ltrmp (planned delivery date). That way we can compare actual vs. plan later on:
update [etl].[Throughput]
set PlanThroughput = datediff(dd, strmp, ltrmp) - (datediff(wk, strmp, ltrmp) * 2)
    - case when datepart(dw, strmp) = 1 then 1 else 0 end
    + case when datepart(dw, ltrmp) = 1 then 1 else 0 end
Until now we have only calculated the date difference in days (excluding Saturday and Sunday). We'll now take care of the time difference as well. The field gsuzi holds the actual start time; the confirmed order finish time is stored in field geuzi. If we add that time difference to the date difference we have already computed (ActThroughputNet), we're done. The rest is just the usual time formatting, e.g. showing the difference in minutes or hours:
update [etl].[Throughput]
set gsuzitime = left(gsuzi, 2) + ':' + substring(gsuzi, 3, 2) + ':' + right(gsuzi, 2),
    geuzitime = left(geuzi, 2) + ':' + substring(geuzi, 3, 2) + ':' + right(geuzi, 2)
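The SAP time fields arrive as six-character HHMMSS strings, so the conversion above only re-inserts the colons. With an illustrative value:
select left('143059', 2) + ':' + substring('143059', 3, 2) + ':' + right('143059', 2) as geuzitime   -- returns '14:30:59'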
update [etl].[Throughput]
set ThroughputNetTimeMinutes =
    case
        when gsuzitime > geuzitime then datediff(mi, geuzitime, gsuzitime)   -- finish time of day lies before the start time of day
        else datediff(mi, gsuzitime, geuzitime)
    end
update [etl].[Throughput]
set ActThroughputNetMeasureHours =
    case
        when gsuzitime > geuzitime then ActThroughputNet * 24 - ThroughputNetTimeMinutes / 60.0   -- subtract the time-of-day portion, converted from minutes to hours
        else ActThroughputNet * 24 + ThroughputNetTimeMinutes / 60.0
    end
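A quick worked example with illustrative numbers: an order that actually starts Monday at 23:00 and is confirmed on Tuesday at 01:00 yields ActThroughputNet = 1 working day and an absolute time-of-day gap of 1320 minutes with gsuzitime > geuzitime, so:
select 1 * 24 - 1320 / 60.0 as ActThroughputNetMeasureHours   -- 2.0 hours net throughput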
In addition we can now map the production order status. Since we do not want to take cancelled production orders into consideration, we only take those rows from table JEST that are active (inact <> 'X'):
update a
set JestStatActive = 'WhatEverYourCompany'
from [etl].[Throughput] as a
inner join [JEST] as b
on concat('OR', a.aufnr) = b.OBJNR   -- OBJNR of a production order is 'OR' plus the order number including leading zeros
where b.inact <> 'X' and b.stat = 'WhatEverYourCompany'
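The status code itself ('WhatEverYourCompany' above) depends on what you want to flag; the texts behind the system status codes can be looked up in table TJ02T (user statuses live in TJ30T). A hypothetical example that labels released orders, assuming the standard system status I0002 (REL) is the one you are after:
update a
set JestStatActive = 'Released'
from [etl].[Throughput] as a
inner join [JEST] as b
on concat('OR', a.aufnr) = b.OBJNR
where b.inact <> 'X' and b.stat = 'I0002'   -- I0002 = REL in the SAP standard; verify against TJ02T in your system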
The production supervisor (fevor), together with the plant, is the key for joining the department text (fevtxt) from T024F:
update a
set Bereich = b.FEVTXT
from [etl].[Throughput] as a
inner join [T024F] as b
on a.fevor = b.fevor and a.werks = b.werks   -- T024F is keyed by plant and production supervisor
Finally we can run whatever throughput calculations we like, e.g. the average duration per material:
drop table [etl].[ThroughputTop]
select matnr, avg(ActThroughputNetMeasureHours) / 24 as AvgDurationPerItem
into [etl].[ThroughputTop]
from [etl].[Throughput] as a
group by matnr
order by AvgDurationPerItem
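And because we stored PlanThroughput alongside the actual figures, comparing actual against plan per material is just one more aggregation away. A minimal sketch, using only the columns created above:
select matnr,
    avg(cast(ActThroughputNet as decimal(10, 2))) as AvgActualDays,
    avg(cast(PlanThroughput as decimal(10, 2))) as AvgPlannedDays,
    avg(cast(ActThroughputNet as decimal(10, 2)) - cast(PlanThroughput as decimal(10, 2))) as AvgDeviationDays
from [etl].[Throughput]
group by matnr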
Congratulations:
We have just calculated throughput (or lead time) from our SAP ERP system. Many thanks for reading, I hope this was helpful! If you have any questions, please let me know. You can connect with me on LinkedIn or Twitter.
Originally published on my website DAR-Analytics.