Calculate actual downtime, ignoring overlapping date/time ranges
I am trying to figure out how to calculate the actual downtime for various applications from the data I store in a table.
At the moment I'm just calculating the difference between DowntimeStart and DowntimeEnd, which is shown in DowntimeMinutes.
The problem is that when downtime periods overlap, the overlap should only be counted once: the full merged duration should be attributed to the first record, and the records it overlaps should count as zero, as shown in the Expected column.
Any ideas on how the query can be combined to achieve this?
Application DowntimeStart DowntimeEnd DowntimeMinutes Expected
Application Demo 2014-11-20 17:31:01.467 2014-11-20 18:01:01.243 30 30
Application Demo 2014-11-28 17:59:00.987 2014-11-28 18:09:02.167 10 26
Application Demo 2014-11-28 18:00:01.403 2014-11-28 18:25:01.443 25 0
Application Demo 2014-11-29 19:13:08.580 2014-11-30 05:30:01.763 617 617
Application Demo 2014-11-30 01:55:01.953 2014-11-30 03:54:01.730 119 0
I looked into and researched these options, but they didn't achieve the above:
Find Total Minutes Ignore Overlap (CTE conversion of a cursor-based answer)
SQL to find time elapsed with multiple overlap intervals
http://www.experts-exchange.com/Database/MS-SQL-Server/SQL_Server_2008/Q_28169653.html
http://thehobt.blogspot.com.au/2009/04/calculating-elapsed-time-based-upon.html
UPDATED WITH NEW TEST CASES
Here is one method that identifies distinct outages, flattens each group of overlapping records into a single outage attributed to the earliest record, and thereby produces Actual values that match the Expected values.
DECLARE @Downtime TABLE (
ID INT PRIMARY KEY NOT NULL IDENTITY(1,1),
Application VARCHAR(25),
DowntimeStart DATETIME,
DowntimeEnd DATETIME,
Expected INT
)
INSERT @Downtime (Application, DowntimeStart, DowntimeEnd, Expected) VALUES -- Act/Exp
('Application Demo', '2014-11-20 17:31:01.467', '2014-11-20 18:01:01.243', 30) -- 30/30
,('Application Demo', '2014-11-28 17:59:00.987', '2014-11-28 18:09:02.167', 26) -- 10/26
,('Application Demo', '2014-11-28 18:00:01.403', '2014-11-28 18:25:01.443', 0) -- 25/0
,('Application Demo', '2014-11-29 19:13:08.580', '2014-11-30 05:30:01.763', 617) -- 617/617
,('Application Demo', '2014-11-30 01:55:01.953', '2014-11-30 03:54:01.730', 0)
,('Application Demo 2', '2014-12-19 23:09:01.303', '2014-12-22 09:43:01.397', 3514)
,('Application Demo 2', '2014-12-19 23:09:01.303', '2014-12-22 09:43:01.397', 0)
,('Application Demo 2', '2014-12-19 23:09:01.303', '2014-12-22 09:43:01.397', 0)
,('Application Demo 2', '2014-12-19 23:09:01.303', '2014-12-22 09:43:01.397', 0)
,('Application Demo 2', '2014-12-19 23:09:01.303', '2014-12-22 09:43:01.397', 0)
SELECT
Downtimes.Application,
Downtimes.DowntimeStart,
Downtimes.DowntimeEnd,
Downtimes.Expected,
COALESCE(Actual, 0) AS Actual
FROM @Downtime Downtimes
LEFT OUTER JOIN (
SELECT DISTINCT
D1.Application,
MIN(CASE WHEN D1.DowntimeStart < D2.DowntimeStart THEN D1.ID ELSE D2.ID END) AS [ID],
MIN(CASE WHEN D1.DowntimeStart < D2.DowntimeStart THEN D1.DowntimeStart ELSE D2.DowntimeStart END) AS [DowntimeStart],
MAX(CASE WHEN D1.DowntimeEnd > D2.DowntimeEnd THEN D1.DowntimeEnd ELSE D2.DowntimeEnd END) AS [DowntimeEnd],
DATEDIFF(MINUTE,
MIN(CASE WHEN D1.DowntimeStart < D2.DowntimeStart THEN D1.DowntimeStart ELSE D2.DowntimeStart END),
MAX(CASE WHEN D1.DowntimeEnd > D2.DowntimeEnd THEN D1.DowntimeEnd ELSE D2.DowntimeEnd END)) AS Actual
FROM @Downtime D1
INNER JOIN @Downtime D2
ON D1.Application = D2.Application
AND (D1.DowntimeStart BETWEEN D2.DowntimeStart AND D2.DowntimeEnd
OR D2.DowntimeStart BETWEEN D1.DowntimeStart AND D1.DowntimeEnd)
GROUP BY
D1.Application,
D1.DowntimeStart
) Outages
ON Outages.ID = Downtimes.ID
And this gives the desired output:
Application DowntimeStart DowntimeEnd Expected Actual
------------------------- ----------------------- ----------------------- ----------- -----------
Application Demo 2014-11-20 17:31:01.467 2014-11-20 18:01:01.243 30 30
Application Demo 2014-11-28 17:59:00.987 2014-11-28 18:09:02.167 26 26
Application Demo 2014-11-28 18:00:01.403 2014-11-28 18:25:01.443 0 0
Application Demo 2014-11-29 19:13:08.580 2014-11-30 05:30:01.763 617 617
Application Demo 2014-11-30 01:55:01.953 2014-11-30 03:54:01.730 0 0
Application Demo 2 2014-12-19 23:09:01.303 2014-12-22 09:43:01.397 3514 3514
Application Demo 2 2014-12-19 23:09:01.303 2014-12-22 09:43:01.397 0 0
Application Demo 2 2014-12-19 23:09:01.303 2014-12-22 09:43:01.397 0 0
Application Demo 2 2014-12-19 23:09:01.303 2014-12-22 09:43:01.397 0 0
Application Demo 2 2014-12-19 23:09:01.303 2014-12-22 09:43:01.397 0 0
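The logic of the query above can be sketched outside SQL. This is a minimal, illustrative Python translation (not part of the original answer): sort the records, merge overlapping intervals per application, and attribute each merged interval's full duration to its earliest record, using DATEDIFF-style minute-boundary counting.

```python
from datetime import datetime

def parse(s):
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")

def datediff_min(s, e):
    # SQL's DATEDIFF(MINUTE, ...) counts minute-boundary crossings, not
    # elapsed 60-second blocks, so truncate both ends to the minute first.
    return int((e.replace(second=0, microsecond=0)
                - s.replace(second=0, microsecond=0)).total_seconds() // 60)

# Sample data from the question: (application, start, end)
rows = [
    ("Application Demo", "2014-11-20 17:31:01.467", "2014-11-20 18:01:01.243"),
    ("Application Demo", "2014-11-28 17:59:00.987", "2014-11-28 18:09:02.167"),
    ("Application Demo", "2014-11-28 18:00:01.403", "2014-11-28 18:25:01.443"),
    ("Application Demo", "2014-11-29 19:13:08.580", "2014-11-30 05:30:01.763"),
    ("Application Demo", "2014-11-30 01:55:01.953", "2014-11-30 03:54:01.730"),
]

def actual_minutes(rows):
    # Walk the records in (application, start) order, growing the current
    # merged interval while the next record overlaps it; when an interval
    # closes, its full duration goes to the earliest record in it.
    order = sorted(range(len(rows)), key=lambda i: (rows[i][0], rows[i][1]))
    actual = [0] * len(rows)
    cur_app = cur_start = cur_end = first = None
    for i in order:
        app, s, e = rows[i][0], parse(rows[i][1]), parse(rows[i][2])
        if app == cur_app and s <= cur_end:
            cur_end = max(cur_end, e)  # overlap: extend the merged interval
        else:
            if first is not None:
                actual[first] = datediff_min(cur_start, cur_end)
            cur_app, cur_start, cur_end, first = app, s, e, i
    if first is not None:
        actual[first] = datediff_min(cur_start, cur_end)
    return actual

print(actual_minutes(rows))  # [30, 26, 0, 617, 0]
```

Sorting on the raw timestamp strings works here because the ISO-style format sorts lexicographically in date order.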
I had a similar problem and got an answer to my question How to consolidate blocks of time?
In your case, this is accomplished with a TOP 1 OUTER APPLY against the same table to find the next overlapping record, then using either that overlap's start time or, when it is NULL, the record's own end time as the effective end time.
CREATE TABLE Downtime (
Application VARCHAR(25),
DowntimeStart DATETIME,
DowntimeEnd DATETIME,
Expected INT
)
INSERT Downtime (Application, DowntimeStart, DowntimeEnd, Expected) VALUES -- Act/Exp
('Application Demo', '2014-11-20 17:31:01.467', '2014-11-20 18:01:01.243', 30) -- 30/30
,('Application Demo', '2014-11-28 17:59:00.987', '2014-11-28 18:09:02.167', 26) -- 10/26
,('Application Demo', '2014-11-28 18:00:01.403', '2014-11-28 18:25:01.443', 0) -- 25/0
,('Application Demo', '2014-11-29 19:13:08.580', '2014-11-30 05:30:01.763', 617) -- 617/617
,('Application Demo', '2014-11-30 01:55:01.953', '2014-11-30 03:54:01.730', 0)
SELECT
Records.Application, Records.DowntimeStart, Records.DowntimeEnd, Records.Expected
, DATEDIFF(minute, Records.DowntimeStart, COALESCE(Overlap.DowntimeStart, Records.DowntimeEnd)) AS Actual
-- , Overlap.Application, Overlap.DowntimeStart, Overlap.DowntimeEnd -- For Verification Purposes
FROM Downtime Records
OUTER APPLY (
SELECT TOP 1 Overlap.Application, Overlap.DowntimeStart, Overlap.DowntimeEnd
FROM Downtime Overlap
WHERE Records.Application = Overlap.Application
AND Overlap.DowntimeStart > Records.DowntimeStart
AND Overlap.DowntimeStart BETWEEN Records.DowntimeStart AND Records.DowntimeEnd
ORDER BY Overlap.DowntimeStart
) Overlap
Here's a SQLFiddle with a solution.
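For illustration, here is the same truncation logic sketched in Python (not the author's code): each record counts minutes from its own start up to the start of the next overlapping record, falling back to its own end when nothing overlaps it.

```python
from datetime import datetime

def parse(s):
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")

def datediff_min(s, e):
    # Mirror SQL's DATEDIFF(MINUTE, ...): count minute-boundary crossings.
    return int((e.replace(second=0, microsecond=0)
                - s.replace(second=0, microsecond=0)).total_seconds() // 60)

rows = [  # (application, start, end) from the question's sample data
    ("Application Demo", "2014-11-20 17:31:01.467", "2014-11-20 18:01:01.243"),
    ("Application Demo", "2014-11-28 17:59:00.987", "2014-11-28 18:09:02.167"),
    ("Application Demo", "2014-11-28 18:00:01.403", "2014-11-28 18:25:01.443"),
    ("Application Demo", "2014-11-29 19:13:08.580", "2014-11-30 05:30:01.763"),
    ("Application Demo", "2014-11-30 01:55:01.953", "2014-11-30 03:54:01.730"),
]

def truncated_minutes(rows):
    out = []
    for app, s, e in ((a, parse(b), parse(c)) for a, b, c in rows):
        # Earliest later-starting record that begins inside this one, if any,
        # caps this record's counted duration (the OUTER APPLY's TOP 1).
        later = sorted(parse(b) for a2, b, _ in rows
                       if a2 == app and s < parse(b) <= e)
        out.append(datediff_min(s, later[0] if later else e))
    return out

print(truncated_minutes(rows))  # [30, 1, 25, 402, 119]
```

Note that this approach splits a merged outage across its overlapping rows (the 26 minutes become 1 + 25) rather than attributing it all to the first record, so its per-row values differ from the Expected column above even where the per-application totals agree.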