T-SQL Grouping Set Showcase

Greetings all,

In the interim, whilst I juggle my time to get more in-depth posts rolled out (like the one surrounding Alexa skills), I’ve back-tracked once again to look at overlooked features from the distant past of SQL Server 2008. I’ve chosen GROUPING SETS as the topic of this post; it boils down to a really nice way to return multiple levels of aggregation in a single result set.

A quick run through then – the following example illustrates a sample database with four tables: Staff, Perk, HolidayRequest and SalaryPayment. The HolidayRequest table has a computed column (based on ‘DateFrom’ and ‘DateTo’) that specifies how many days of holiday have been requested (per request), and the SalaryPayment table has an ‘Amount’ column (salary per month). These represent our key aggregation fields. I haven’t actually referenced all of the columns in the samples provided; I just wanted to give the examples a little bit of extra padding (so they feel more complete).

The following T-SQL shows aggregations targeting total holiday days requested, grouped by staff name, by department, and by department and job title (see the GetHolidayRequestDaysSumInfo stored procedure in the full listing for further details):

-- Holiday request days (summed), grouped by staff name (two daves are cobbled together here)
SELECT 
s.FirstName + ' ' + s.LastName AS [StaffName]
, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
FROM dbo.Staff s
	INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
GROUP BY s.FirstName + ' ' + s.LastName;

-- Holiday request days (summed), grouped by department
SELECT 
s.Department
, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
FROM dbo.Staff s
	INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
GROUP BY s.Department;

-- Holiday request days (summed), grouped by department and job title
SELECT
s.Department
, s.JobTitle
, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
FROM dbo.Staff s
	INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
GROUP BY s.Department, s.JobTitle;

The results look like this:

Holiday Request Group Queries.

With traditional GROUP BY statements you have to split the aggregations up, as you can only perform one grouping per result set. This is where GROUPING SETS come in handy; here is a single statement that returns the results of all three statements above in a single result set (again, see the GetHolidayRequestDaysSumInfoUsingGroupingSets procedure in the full listing for more information):

-- Holiday request days (summed), grouping sets included for staff name, department, and department/job title
SELECT 
s.FirstName + ' ' + s.LastName AS [StaffName]
, s.Department
, s.JobTitle
, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
FROM dbo.Staff s
	INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
GROUP BY GROUPING SETS
(
	(s.FirstName + ' ' + s.LastName)
	, (s.Department)
	, (s.Department, s.JobTitle)
);

Again, here are the super fabulous results from this particular query:

Holiday Request Grouping Set Queries.
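
One point worth being aware of (not something the procedures above demonstrate): in the combined result set, a NULL in a column simply means that the column wasn’t part of the grouping set for that row. If your underlying data can contain genuine NULLs, the built-in GROUPING function can tell the two apart. A minimal sketch against the same tables, purely for illustration:

-- GROUPING returns 1 when the column is NULL because it was aggregated away, 0 otherwise
SELECT
s.Department
, s.JobTitle
, GROUPING(s.Department) AS [DepartmentAggregated]
, GROUPING(s.JobTitle) AS [JobTitleAggregated]
, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
FROM dbo.Staff s
	INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
GROUP BY GROUPING SETS
(
	(s.Department)
	, (s.Department, s.JobTitle)
);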

Pretty neat! Just to nail the point home, here are a few extra examples focusing on the summing of salary payments, grouped by various facets (individual statements, for starters, as shown in the GetSalarySumInfo procedure):

-- Salary sum info grouped by staff name (two daves are, of course, grouped by this)
SELECT 
s.FirstName + ' ' + s.LastName AS [StaffName]
, SUM(sp.Amount) AS [TotalPay]
FROM dbo.Staff s
	INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
GROUP BY s.FirstName + ' ' + s.LastName;

-- Salary sum info grouped by job title
SELECT 
s.JobTitle
, SUM(sp.Amount) AS [TotalPay]
FROM dbo.Staff s
	INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
GROUP BY s.JobTitle;

-- Salary sum info grouped by department
SELECT 
s.Department
, SUM(sp.Amount) AS [TotalPay]
FROM dbo.Staff s
	INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
GROUP BY s.Department;

-- Salary sum info grouped by perk name
SELECT 
p.PerkName
, SUM(sp.Amount) AS [TotalPay]
FROM dbo.Staff s
	INNER JOIN dbo.Perk p ON s.Perk = p.Id
	INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
GROUP BY p.PerkName;

-- Salary sum info grouped by department and job title
SELECT
s.Department
, s.JobTitle
, SUM(sp.Amount) AS [TotalPay]
FROM dbo.Staff s
	INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
GROUP BY s.Department, s.JobTitle;

Results from our individual statements look a little like this:

Salary Group Queries.

Lastly, here’s the equivalent, single, query using GROUPING SETS (see GetSalarySumInfoUsingGroupingSets in the full listing):

-- Salary sum info grouped by staff name, department, job title, perk name and department/job title, in a single result set
SELECT 
s.FirstName + ' ' + s.LastName AS [StaffName]
, s.Department
, s.JobTitle
, p.PerkName
, SUM(sp.Amount) AS [TotalPay]
FROM dbo.Staff s
	INNER JOIN dbo.Perk p ON s.Perk = p.Id
	INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
GROUP BY GROUPING SETS
(
	(s.FirstName + ' ' + s.LastName)
	, (s.JobTitle)
	, (s.Department)
	, (p.PerkName)
	, (s.Department, s.JobTitle)
);

Salary Grouping Set Queries.
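
As an aside (this variant isn’t in the full listing), GROUPING SETS will also happily accept the empty set, (), which appends a grand total row to the same result set:

-- Sketch only: the empty grouping set () yields an overall total row
SELECT
s.Department
, SUM(sp.Amount) AS [TotalPay]
FROM dbo.Staff s
	INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
GROUP BY GROUPING SETS
(
	(s.Department)
	, ()
);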

A great little language feature that I have largely overlooked in the past. I do have one or two reservations over performance, but a first inspection of the query plan for the ‘salary’ based example suggests that some marginal gains are on the table with the stock setup I have (note the table spool, too):

Grouping Sets Performance.

Feels to me like something that deserves further investigation.
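
If you fancy poking at this yourself, a quick (if crude) way to compare the split GROUP BY procedure against the GROUPING SETS variant is to flip on the standard session statistics and eyeball the reads and timings; nothing scientific, just a starting point:

-- Rough-and-ready comparison; run in SSMS and inspect the Messages tab
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

EXEC dbo.GetSalarySumInfo;
EXEC dbo.GetSalarySumInfoUsingGroupingSets;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;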

Here is the complete listing, if you wish to create the tables with the sample data referenced in this post (procedures, and all, bundled in). Thanks for tuning in!

/* 

	Lewis Grint - 18/03/2017

	GROUPING SETS WORKING SAMPLE 
	-------------------------------------------------
	1) This example uses four mock database tables (created internally by the script, no work required :op). Run this on a test database of your choice.
		- Staff.
		- Perk.
		- HolidayRequest.
		- SalaryPayment.
	2) Utility mock data is also inserted by this script (table drop statements, if required, can be found just below this header).
	3) The sample data is then aggregated using GROUP BY via two traditionally structured stored procedures (GetHolidayRequestDaysSumInfo and GetSalarySumInfo). Data is returned (for each grouping) in a separate result set.
	4) Stored procedure variants are then presented that use GROUPING SETS - all GROUP BY variations are returned in a single result set (for illustration purposes only; I’m not saying my grouping selections are actually that helpful in the real world!).

*/

/*
-- Drop table statements, as and when required
DROP TABLE dbo.HolidayRequest;
DROP TABLE dbo.SalaryPayment;
DROP TABLE dbo.Staff;
DROP TABLE dbo.Perk;

*/

-- Habitual setting of preferred QUOTED_IDENTIFIER/ANSI_NULL configuration (even if it's not important for this example)
SET QUOTED_IDENTIFIER ON;
GO
SET ANSI_NULLS ON;
GO

-- Utility variables for gathering up the correct StaffId, when required (and payment dates)
DECLARE
@StaffId							INT
, @FebPayDate						DATE = '2017-02-28'
, @JanPayDate						DATE = '2017-01-31'
, @DecPayDate						DATE = '2016-12-31';

-- 1) Create sample tables to illustrate how GROUPING SETS work

-- Perk table added for nothing more than extra kicks. A staff member can select a company perk (referenced from the Staff table)
IF (OBJECT_ID('dbo.Perk', 'U') IS NULL)
BEGIN

	CREATE TABLE dbo.Perk
	(
		Id							INT				PRIMARY KEY		IDENTITY(1, 1)	NOT NULL
		, PerkName					NVARCHAR(30)									NOT NULL
		, INDEX IX_Perk_PerkName	NONCLUSTERED (PerkName)					
	);

END;

-- NOTE: I won't be including more indexes from this point forward, to try and keep the example light(er)

-- Staff table, which for the sake of our example contains first name, last name, job title, department and a fk to 'perks'
IF (OBJECT_ID('dbo.Staff', 'U') IS NULL)
BEGIN

	CREATE TABLE dbo.Staff
	(
		Id							INT				PRIMARY KEY		IDENTITY(1, 1)	NOT NULL
		, FirstName					NVARCHAR(50)									NOT NULL
		, LastName					NVARCHAR(50)									NOT NULL
		, JobTitle					NVARCHAR(50)									NOT NULL
		, Department				NVARCHAR(50)									NOT NULL
		, Perk						INT												NOT NULL
		, CONSTRAINT FK_Staff_Perk	FOREIGN KEY (Perk) REFERENCES dbo.Perk (Id)
	);

END;

-- SalaryPayment table, which contains multiple rows for each staff member (one for each salary payment)
IF (OBJECT_ID('dbo.SalaryPayment', 'U') IS NULL)
BEGIN

	CREATE TABLE dbo.SalaryPayment
	(
		Id							INT				PRIMARY KEY		IDENTITY(1, 1)	NOT NULL
		, StaffId					INT												NOT NULL
		, PaymentDate				DATE											NOT NULL
		, Amount					MONEY
		, CONSTRAINT FK_SalaryPayment_Staff			FOREIGN KEY (StaffId) REFERENCES dbo.Staff (Id)
	);

END;

-- HolidayRequest table, which contains multiple rows for each staff member (one for each holiday request)
IF (OBJECT_ID('dbo.HolidayRequest', 'U') IS NULL)
BEGIN

	CREATE TABLE dbo.HolidayRequest
	(
		Id							INT				PRIMARY KEY		IDENTITY(1, 1)	NOT NULL
		, StaffId					INT												NOT NULL
		, DateFrom					DATE											NOT NULL
		, DateTo					DATE											NOT NULL
		-- Final column is computed based on the difference between the start and end date (requested), in days. Simple, but it fits the purposes of the example (obviously, it doesn't take into account weekends, bank holidays, etc.)
		, NumberOfDaysRequested		AS (DATEDIFF(DAY, DateFrom, DateTo) + 1)
		, CONSTRAINT FK_HolidayRequest_Staff		FOREIGN KEY (StaffId) REFERENCES dbo.Staff (Id)
	);

END;

--2) Insert test data into each table

-- Perk table test data
IF NOT EXISTS
(
	SELECT *
	FROM dbo.Perk p
)
BEGIN

	INSERT INTO dbo.Perk
	(
		PerkName
	)
	VALUES 
	(
		'Free ice cream'
	)
	,
	(
		'Free parking'
	)
	,
	(
		'Beer on arrival'
	)
	,
	(
		'Pat on the back'
	);

END;

-- Staff table test data (including perk info)
IF NOT EXISTS
(
	SELECT *
	FROM dbo.Staff s
)
BEGIN

	INSERT INTO dbo.Staff
	(
		FirstName
		, LastName
		, JobTitle
		, Department
		, Perk	
	)
	VALUES
	(
		'Steve'
		, 'Stevenson'
		, 'Head Honcho'
		, 'Ivory Tower'
		, 
		(
			SELECT 
			Id 
			FROM dbo.Perk p 
			WHERE p.PerkName = 'Free ice cream'
		)
	)
	,
	(
		'Marie'
		, 'Pritchard'
		, 'Team Manager'
		, 'Ivory Tower'
		, 
		(
			SELECT 
			Id 
			FROM dbo.Perk p 
			WHERE p.PerkName = 'Free parking'
		)
	)
	,
	(
		'Judy'
		, 'Dench'
		, 'Team Manager'
		, 'Island Retreat'
		, 
		(
			SELECT 
			Id 
			FROM dbo.Perk p 
			WHERE p.PerkName = 'Free ice cream'
		)
	)
	,
	(
		'Dave'
		, 'Dodger'
		, 'Chief Work Dodger'
		, 'Store Cupboard'
		, 
		(
			SELECT 
			Id 
			FROM dbo.Perk p 
			WHERE p.PerkName = 'Beer on arrival'
		)
	)
	-- There's another Dave Dodger at the company, but he is actually a pretty useful chap
	,
	(
		'Dave'
		, 'Dodger'
		, 'Hard Worker'
		, 'Store Cupboard'
		, 
		(
			SELECT 
			Id 
			FROM dbo.Perk p 
			WHERE p.PerkName = 'Pat on the back'
		)
	)
	,
	(
		'Bob'
		, 'Boots'
		, 'Handle Cranker'
		, 'Main Office'
		,
		(
			SELECT 
			Id 
			FROM dbo.Perk p 
			WHERE p.PerkName = 'Pat on the back'
		)
	)
	,
	(
		'Janet'
		, 'Timms'
		, 'Handle Cranker'
		, 'Main Office'
		, 
		(
			SELECT 
			Id 
			FROM dbo.Perk p 
			WHERE p.PerkName = 'Pat on the back'
		)
	);

END;

-- SalaryPayment table test data
IF NOT EXISTS
(
	SELECT *
	FROM dbo.SalaryPayment sp
)
BEGIN
	
	-- Steve Stevenson | Head Honcho
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Steve'
		AND s.LastName = 'Stevenson'
		AND s.JobTitle = 'Head Honcho'
	);

	INSERT INTO dbo.SalaryPayment
	(
		StaffId
		, PaymentDate
		, Amount
	)
	VALUES
	(
		@StaffId
		, @DecPayDate
		, 5580.50
	)
	,
	(
	
		@StaffId
		, @JanPayDate
		, 5240.50
	)
	,
	(
	
		@StaffId
		, @FebPayDate
		, 5580.50
	);

	-- Marie Pritchard | Team Manager
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Marie'
		AND s.LastName = 'Pritchard'
		AND s.JobTitle = 'Team Manager'
	);

	INSERT INTO dbo.SalaryPayment
	(
		StaffId
		, PaymentDate
		, Amount
	)
	VALUES
	(
		@StaffId
		, @DecPayDate
		, 2500.75
	)
	,
	(
	
		@StaffId
		, @JanPayDate
		, 2425.15
	)
	,
	(
	
		@StaffId
		, @FebPayDate
		, 2425.15
	);

	-- Judy Dench | Team Manager
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Judy'
		AND s.LastName = 'Dench'
		AND s.JobTitle = 'Team Manager'
	);

	INSERT INTO dbo.SalaryPayment
	(
		StaffId
		, PaymentDate
		, Amount
	)
	VALUES
	(
		@StaffId
		, @DecPayDate
		, 2495.75
	)
	,
	(
	
		@StaffId
		, @JanPayDate
		, 2400.15
	)
	,
	(
	
		@StaffId
		, @FebPayDate
		, 2400.15
	);

	-- Dave Dodger | Chief Work Dodger
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Dave'
		AND s.LastName = 'Dodger'
		AND s.JobTitle = 'Chief Work Dodger'
	);

	INSERT INTO dbo.SalaryPayment
	(
		StaffId
		, PaymentDate
		, Amount
	)
	VALUES
	(
		@StaffId
		, @DecPayDate
		, 2122.90
	)
	,
	(
	
		@StaffId
		, @JanPayDate
		, 2105.20
	)
	,
	(
	
		@StaffId
		, @FebPayDate
		, 2105.20
	);

	-- Dave Dodger | Hard Worker
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Dave'
		AND s.LastName = 'Dodger'
		AND s.JobTitle = 'Hard Worker'
	);

	INSERT INTO dbo.SalaryPayment
	(
		StaffId
		, PaymentDate
		, Amount
	)
	VALUES
	(
		@StaffId
		, @DecPayDate
		, 2115.50
	)
	,
	(
	
		@StaffId
		, @JanPayDate
		, 2100.50
	)
	,
	(
	
		@StaffId
		, @FebPayDate
		, 2100.50
	);

	-- Bob Boots | Handle Cranker
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Bob'
		AND s.LastName = 'Boots'
		AND s.JobTitle = 'Handle Cranker'
	);

	INSERT INTO dbo.SalaryPayment
	(
		StaffId
		, PaymentDate
		, Amount
	)
	VALUES
	(
		@StaffId
		, @DecPayDate
		, 2100.00
	)
	,
	(
	
		@StaffId
		, @JanPayDate
		, 2039.50
	)
	,
	(
	
		@StaffId
		, @FebPayDate
		, 2039.50
	);

	-- Janet Timms | Handle Cranker
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Janet'
		AND s.LastName = 'Timms'
		AND s.JobTitle = 'Handle Cranker'
	);

	INSERT INTO dbo.SalaryPayment
	(
		StaffId
		, PaymentDate
		, Amount
	)
	VALUES
	(
		@StaffId
		, @DecPayDate
		, 2100.00
	)
	,
	(
	
		@StaffId
		, @JanPayDate
		, 2039.50
	)
	,
	(
	
		@StaffId
		, @FebPayDate
		, 2039.50
	);
	
END;

-- HolidayRequest table test data
IF NOT EXISTS
(
	SELECT *
	FROM dbo.HolidayRequest hr
)
BEGIN

	-- For ease, everyone has three holiday requests
	
	-- Steve Stevenson | Head Honcho
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Steve'
		AND s.LastName = 'Stevenson'
		AND s.JobTitle = 'Head Honcho'
	);

	INSERT INTO dbo.HolidayRequest
	(
		StaffId
		, DateFrom
		, DateTo
	)
	VALUES
	(
		@StaffId
		, '2017-01-30'
		, '2017-02-03'
	)
	,
	(
		@StaffId
		, '2017-05-22'
		, '2017-05-26'
	)
	,
	(
		@StaffId
		, '2017-07-24'
		, '2017-07-28'
	);

	-- Marie Pritchard | Team Manager
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Marie'
		AND s.LastName = 'Pritchard'
		AND s.JobTitle = 'Team Manager'
	);

	INSERT INTO dbo.HolidayRequest
	(
		StaffId
		, DateFrom
		, DateTo
	)
	VALUES
	(
		@StaffId
		, '2017-03-23'
		, '2017-03-24'
	)
	,
	(
		@StaffId
		, '2017-06-12'
		, '2017-06-16'
	)
	,
	(
		@StaffId
		, '2017-08-14'
		, '2017-08-14'
	);

	-- Judy Dench | Team Manager
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Judy'
		AND s.LastName = 'Dench'
		AND s.JobTitle = 'Team Manager'
	);

	INSERT INTO dbo.HolidayRequest
	(
		StaffId
		, DateFrom
		, DateTo
	)
	VALUES
	(
		@StaffId
		, '2017-03-24'
		, '2017-03-25'
	)
	,
	(
		@StaffId
		, '2017-06-12'
		, '2017-06-15'
	)
	,
	(
		@StaffId
		, '2017-08-15'
		, '2017-08-16'
	);

	-- Dave Dodger | Chief Work Dodger
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Dave'
		AND s.LastName = 'Dodger'
		AND s.JobTitle = 'Chief Work Dodger'
	);

	INSERT INTO dbo.HolidayRequest
	(
		StaffId
		, DateFrom
		, DateTo
	)
	VALUES
	(
		@StaffId
		, '2017-03-20'
		, '2017-03-24'
	)
	,
	(
		@StaffId
		, '2017-03-27'
		, '2017-03-31'
	)
	,
	(
		@StaffId
		, '2017-04-03'
		, '2017-04-07'
	);

	-- Dave Dodger | Hard Worker
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Dave'
		AND s.LastName = 'Dodger'
		AND s.JobTitle = 'Hard Worker'
	);

	INSERT INTO dbo.HolidayRequest
	(
		StaffId
		, DateFrom
		, DateTo
	)
	VALUES
	(
		@StaffId
		, '2017-03-31'
		, '2017-03-31'
	)
	,
	(
		@StaffId
		, '2017-07-31'
		, '2017-07-31'
	)
	,
	(
		@StaffId
		, '2017-09-25'
		, '2017-09-25'
	);

	-- Bob Boots | Handle Cranker
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Bob'
		AND s.LastName = 'Boots'
		AND s.JobTitle = 'Handle Cranker'
	);

	INSERT INTO dbo.HolidayRequest
	(
		StaffId
		, DateFrom
		, DateTo
	)
	VALUES
	(
		@StaffId
		, '2017-05-15'
		, '2017-05-19'
	)
	,
	(
		@StaffId
		, '2017-07-10'
		, '2017-07-12'
	)
	,
	(
		@StaffId
		, '2017-10-11'
		, '2017-10-13'
	);

	-- Janet Timms | Handle Cranker
	SELECT
	@StaffId = 
	s.Id
	FROM dbo.Staff s
	WHERE 
	(
		s.FirstName = 'Janet'
		AND s.LastName = 'Timms'
		AND s.JobTitle = 'Handle Cranker'
	);

	INSERT INTO dbo.HolidayRequest
	(
		StaffId
		, DateFrom
		, DateTo
	)
	VALUES
	(
		@StaffId
		, '2017-04-10'
		, '2017-04-11'
	)
	,
	(
		@StaffId
		, '2017-06-21'
		, '2017-06-23'
	)
	,
	(
		@StaffId
		, '2017-11-13'
		, '2017-11-17'
	);

END;
GO

-- Create utility stored procedures, just because :op

-- Basic example using the HolidayRequest table
IF (OBJECT_ID('dbo.GetHolidayRequestDaysSumInfo', 'P') IS NOT NULL)
BEGIN

	DROP PROCEDURE dbo.GetHolidayRequestDaysSumInfo;

END;
GO

CREATE PROCEDURE dbo.GetHolidayRequestDaysSumInfo
AS
BEGIN

	-- Holiday request days (summed), grouped by staff name (two daves are cobbled together here)
	SELECT 
	s.FirstName + ' ' + s.LastName AS [StaffName]
	, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
	FROM dbo.Staff s
		INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
	GROUP BY s.FirstName + ' ' + s.LastName;

	-- Holiday request days (summed), grouped by department
	SELECT 
	s.Department
	, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
	FROM dbo.Staff s
		INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
	GROUP BY s.Department;

	-- Holiday request days (summed), grouped by department and job title
	SELECT
	s.Department
	, s.JobTitle
	, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
	FROM dbo.Staff s
		INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
	GROUP BY s.Department, s.JobTitle;

END;
GO

IF (OBJECT_ID('dbo.GetHolidayRequestDaysSumInfoUsingGroupingSets', 'P') IS NOT NULL)
BEGIN

	DROP PROCEDURE dbo.GetHolidayRequestDaysSumInfoUsingGroupingSets;

END;
GO

CREATE PROCEDURE dbo.GetHolidayRequestDaysSumInfoUsingGroupingSets
AS
BEGIN

	-- Holiday request days (summed), grouping sets included for staff name, department, and department/job title
	SELECT 
	s.FirstName + ' ' + s.LastName AS [StaffName]
	, s.Department
	, s.JobTitle
	, SUM(hr.NumberOfDaysRequested) AS [TotalDaysRequested]
	FROM dbo.Staff s
		INNER JOIN dbo.HolidayRequest hr ON s.Id = hr.StaffId
	GROUP BY GROUPING SETS
	(
		(s.FirstName + ' ' + s.LastName)
		, (s.Department)
		, (s.Department, s.JobTitle)
	);

END;
GO

-- A couple more stored procedures, focusing on the SalaryPayment table this time
IF (OBJECT_ID('dbo.GetSalarySumInfo', 'P') IS NOT NULL)
BEGIN

	DROP PROCEDURE dbo.GetSalarySumInfo;

END;
GO

CREATE PROCEDURE dbo.GetSalarySumInfo
AS
BEGIN

	-- Salary sum info grouped by staff name (two daves are, of course, grouped by this)
	SELECT 
	s.FirstName + ' ' + s.LastName AS [StaffName]
	, SUM(sp.Amount) AS [TotalPay]
	FROM dbo.Staff s
		INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
	GROUP BY s.FirstName + ' ' + s.LastName;

	-- Salary sum info grouped by job title
	SELECT 
	s.JobTitle
	, SUM(sp.Amount) AS [TotalPay]
	FROM dbo.Staff s
		INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
	GROUP BY s.JobTitle;

	-- Salary sum info grouped by department
	SELECT 
	s.Department
	, SUM(sp.Amount) AS [TotalPay]
	FROM dbo.Staff s
		INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
	GROUP BY s.Department;

	-- Salary sum info grouped by perk name
	SELECT 
	p.PerkName
	, SUM(sp.Amount) AS [TotalPay]
	FROM dbo.Staff s
		INNER JOIN dbo.Perk p ON s.Perk = p.Id
		INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
	GROUP BY p.PerkName;

	-- Salary sum info grouped by department and job title
	SELECT
	s.Department
	, s.JobTitle
	, SUM(sp.Amount) AS [TotalPay]
	FROM dbo.Staff s
		INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
	GROUP BY s.Department, s.JobTitle;

END;
GO

IF (OBJECT_ID('dbo.GetSalarySumInfoUsingGroupingSets', 'P') IS NOT NULL)
BEGIN

	DROP PROCEDURE dbo.GetSalarySumInfoUsingGroupingSets;

END;
GO

CREATE PROCEDURE dbo.GetSalarySumInfoUsingGroupingSets
AS
BEGIN

	-- Salary sum info grouped by staff name, department, job title, perk name and department/job title, in a single result set
	SELECT 
	s.FirstName + ' ' + s.LastName AS [StaffName]
	, s.Department
	, s.JobTitle
	, p.PerkName
	, SUM(sp.Amount) AS [TotalPay]
	FROM dbo.Staff s
		INNER JOIN dbo.Perk p ON s.Perk = p.Id
		INNER JOIN dbo.SalaryPayment sp ON s.Id = sp.StaffId
	GROUP BY GROUPING SETS
	(
		(s.FirstName + ' ' + s.LastName)
		, (s.JobTitle)
		, (s.Department)
		, (p.PerkName)
		, (s.Department, s.JobTitle)
	);

END;
GO

-- Execute stored procedures to inspect the results

-- Holiday aggregation results (grouping sets stored procedure variant returns all data in a single result set)
EXEC dbo.GetHolidayRequestDaysSumInfo;
EXEC dbo.GetHolidayRequestDaysSumInfoUsingGroupingSets;

-- Salary aggregation results (grouping sets stored procedure variant returns all data in a single result set)
EXEC dbo.GetSalarySumInfo;
EXEC dbo.GetSalarySumInfoUsingGroupingSets;

OpenCover UI – Unit Test Code Coverage

A little sideline post to tide everyone over (as I’m still working on the Alexa piece, which I want to do proper justice to when it’s released). I’ve been messing around with a few rough and ready projects and wanted to get an idea of how to dig into code coverage, with respect to Unit Tests.

I’m currently using Visual Studio 2015 Community Edition and, from what I gather, no built-in code coverage support exists for non-Enterprise editions at the moment. The first hit I found was for the OpenCover UI extension, so I thought I’d take it for a spin to see what it’s made of:

Stack Overflow OpenCover UI Mention

Just so that you can get a feel for where I am at, here is an image outlining a home-brew project that shows some Unit Tests in play:

Unit Test Structure.

Nothing too miraculous here, I’m just using the standard Unit Testing framework and a little Moq for kicks. To follow this up, I then grabbed hold of the OpenCover UI (.vsix extension) from here and installed it:

OpenCover UI VS Marketplace Link

Let’s roll on from here with some ‘off the cuff’ observations, rather than an in-depth review of features, etc. This serves simply as my first impressions and, ultimately, an insight into whether we can get the coverage metrics I am after quickly and easily. For starters, you’ll notice a new context menu for ‘OpenCover’ when Visual Studio boots up:

OpenCover Menu.

I have quickly shoved the inbuilt Test Explorer window next to the OpenCover variant; they appear to offer a similar ‘look and feel’, in addition to functional grouping options (the default Test Explorer window appears to have a few more options, in fact). The OpenCover Test Explorer oddly doesn’t have ‘Run’ or ‘Run All’ buttons, on the face of it anyway (or debugging options). Right clicking a test gives a ‘Cover with OpenCover’ context menu option…guess I’ll see what that does now!

Cover with OpenCover Context Menu Option.

At this point I hit the following, immediate, explosion:

OpenCover EXE Error.

You then get prompted to hunt down the relevant .exe file. As I was fishing around for this I decided to go back to the trusty Stack Overflow, to see what wisdom could be uncovered. This was the first hit, which outlined that a configuration file, with set content, needed to be stuffed in with the solution content:

Further Stack Overflow Wisdom

Further comment sniffing did highlight, under Tools > Options, that additional configuration should be performed (i.e. the .exe path should be specified):

OpenCover VS Options.

I decided that hunting on NuGet might be the best way to expose an .exe file here (i.e. getting something dropped into a packages directory, which I can easily pick up). So, I followed the hunch by adding this package (just to the ‘tests’ project, for starters, as I wasn’t sure which projects needed targeting):

OpenCover NuGet Package.

I don’t feel as if we’ve fallen into a rabbit hole just yet, but at this point I’ve started to wonder if ‘storms’ are on the horizon! Hopefully, we won’t have to tread too much further to get this machine churning. Installing the NuGet package had the desired effect: I now have the .exe I was looking for lingering in a ‘Tools’ directory, under the OpenCover folder within packages, which I’ve set up in the Visual Studio Options section:

OpenCover EXE Path Configured.

This shouldn’t be marked down as ‘ideal’ configuration, of course; we’re more leaning towards a ‘just get it working’ stance.

The moment of truth…right clicking and selecting ‘Cover with OpenCover’ now…success! Well, good things appear to have happened anyway; let’s have a quick review to see if we can make sense of it (code with incomplete XML comments is about to be on show, so apologies for that!). I only ran the one test, by the way:

Unit Test Code Coverage.

First observation: it did seem to take a good few seconds before all of the lines covered (green dots) and not covered (red dots) were highlighted correctly; nothing too catastrophic in this, however. As far as the unit testing specific code goes, you can clearly see which tests I ran in this instance; the UI pointers are very self-explanatory. One additional observation: it looks like it could be a touch tricky to pick out breakpoints amongst the code coverage markers, but I don’t see this as a big issue at the moment (I’ll have to see how I feel after extended use). In fact, the OpenCover Results window has an option for enabling/disabling these markers, so we’re all good.

You’ll notice that the unit test method denoted here is placing the ‘AddItemCleansingMappingElementToConfiguration’ method under test, so I am keen to see what lines we hit (or ‘covered’) within the targeted method:

Code Actually Under Test Covered.

The idea here is that the XML configuration passed to this method is, in fact, malformed, so the statement where the ‘addSuccessful’ variable is set to true is not hit (an exception is triggered, and caught, by the preceding line of code), which mirrors the indicator provided by OpenCover, nice! I call this a success!
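
For context, the method under test presumably has a shape along these lines (a reconstruction from the description above; the helper name and body are purely illustrative, not the actual project code):

// Illustrative reconstruction only - the parse step throws on malformed XML,
// so the 'addSuccessful = true' line is never hit (and never covered)
public bool AddItemCleansingMappingElementToConfiguration(string configXml)
{
    bool addSuccessful = false;

    try
    {
        ParseAndApplyConfiguration(configXml); // hypothetical helper; throws on bad XML
        addSuccessful = true;
    }
    catch (Exception)
    {
        // Exception swallowed; addSuccessful remains false
    }

    return addSuccessful;
}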

I’m now going to run this across the board and see what floats to the surface.

Ah, look at this! For starters, OpenCover has highlighted a problem with one of my unit tests in a very solid, visual way (I was wondering why no lines of this test were covered, until I realised I omitted the ‘Test Method’ attribute!):

Missing Test Method Attribute.

A couple of quirks that deserve to be noted. Firstly, I did have to run the ‘Cover with OpenCover’ command twice to register coverage on all tests (some seemed to be omitted from the process, but were then included on the second run through). Also, tests that are geared to expect exceptions to be thrown are always marked with their closing brace as ‘not covered’ (I’m assuming that an exception being thrown legitimately causes the final line to never be hit, and therefore not covered, which in my head is expected behaviour – it would be good if there was a way to disregard these instances):

Unit Test Missing Coverage.

So what about the actual code ‘under test’ and the metrics provided to show how much of it has been covered? In instances whereby code had been highlighted as not covered (spot checks only, of course), I have to say it appears accurate and has been useful in flagging areas I should really have tested.

As for the actual report metrics, it is exactly what I was after when I started on my way down this road. You get to see the percentage of code coverage at project, class and member level (along with ‘Sequence Points’ visited against the total count of possible points):

OpenCover Metrics Report.

Sequence points don’t tie directly to ‘lines’, as outlined here. You’ll notice that this is a link detailing a ‘Report Generator’, which uses XML extracted using OpenCover directly. To finish up, I’ll follow the steps outlined on Stack Overflow again (got to love it, especially if you need information on the double!):

Using the Report Generator

The Report Generator can be downloaded by using NuGet again:

OpenCover Report Generator on NuGet.

The Report Generator source code itself can be downloaded from this link.

It looks like you can create a custom report via C#, by implementing an interface, etc. For now, I’m going to do a simple run through using the command line interface. This is the command (after a bit of trial and error) that got me the XML report, for starters:

"C:\Source\Utility Applications\DesktopManager\packages\OpenCover.4.6.519\tools\OpenCover.Console.exe" -register:user -target:"C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\mstest.exe" -targetargs:"/noisolation /testcontainer:\"C:\Source\Utility Applications\DesktopManager\DesktopManager.Tests\bin\Debug\DesktopManager.Tests.dll\" /resultsfile:C:\Reports\MSTest\.trx" -mergebyhash -output:C:\Reports\MSTest\projectCoverageReport.xml

This was just a case of specifying locations for the OpenCover.Console.exe, the mstest.exe and the ‘Tests’ dll for my specific application. XML file in hand, I triggered this command to generate the final report resources:

"C:\Source\Utility Applications\DesktopManager\packages\ReportGenerator.2.5.2\tools\ReportGenerator.exe" -reports:"C:\Reports\MSTest\projectCoverageReport.xml" -targetdir:"C:\Reports\CodeCoverage"

In a few quick steps you’ll have a set of HTML ‘reports’, as you can see here:

HTML Reports Generated.

Let’s finish up with a couple of examples illustrating the outputs:

Index Report.

Configuration Helper Report Details.

Configuration Helper Report Line Coverage.

I think that brings us to a close. This seems like pretty powerful stuff, but I think I’ll need more time to go through some of the outputs and try this with a larger project. I hope this has been fun and/or useful.

Thanks all!

SQL Server Management Studio…Customising those themes!

Howdy all,

I am a bit of a customisation nut; pretty much with any software package I pick up I have the uncontrollable urge to dig straight into the options and get on the hunt for how to ‘theme it up’!

The one that probably upsets me the most is poor old SQL Server Management Studio (SSMS). SSMS has long lagged behind on the inclusion of a dark theme, considering it shares the Visual Studio shell. I ended up going on a scrummage around, as it seems pretty incredible that this isn’t available yet. This is what I managed to dig up:

SQL Server Management Studio Dark/Black Theme

Back in April of 2016, this was the news at the time:

“We currently have the dark theme disabled as we need to do a lot more UI enhancements across all the SSMS dialogs and wizards to make this work properly. We will be working on this and true high DPI support in the coming months.”

The options on offer at the moment, by default, are a snooze fest at best:

SSMS Standard Colour Themes.

By way of example, here is the light theme doing its thing:

SSMS Light Theme.

Alas, it sounds like we might be waiting a while longer for true dark theme support but…if you simply want to customise the code window you can (as in Visual Studio) work some magic in the interim, as follows (yes, the other dockable portions of the UI do not change, but it’s a start).

As a first step, you need to go in search of themes or, more specifically, custom made ‘.vssettings’ files. This is the resource I have used in the past and in this example:

Studio Styles

There are a tonne of pretty decent themes on this site. Here are some that immediately interested me, on first glance (as I’m a dark theme psycho):

  1. Son of Obsidian
  2. Smooth
  3. IR Black
  4. Visual Blend
  5. EaseEye
  6. Electric Energy
  7. Sublime Text 2 – New Edition
  8. Colorful Darkness

If anyone has a good recommendation for other places to go and seek out themes please leave a comment below.

Once you have downloaded any .vssettings files you are interested in (for SQL Server 2016 I have been using the VS 2015 files, without issues thus far), place them in a location that is easily accessible. Now, you need to fly on over to SSMS where the process of importing a .vssettings file is incredibly easy.

Start by navigating to Tools > Import and Export Settings, as below:

Import and Export Settings.

Next up, check the option entitled ‘Import selected environment settings’ from the following dialog, and click Next:

Import Selected Environment Settings.

Next up, you can save your current settings to a .vssettings file, if you choose. This is a pretty solid bet if you have imported a new theme and have made customisations to it yourself, as you will want an easy way to rollback should your eyes be offended by whatever you import next 😉 ! If you like living on the edge there is always the ‘No, just import new settings, overwriting my current settings’ option. Pick the option that makes you feel good, then click Next:

Save Current Theme.

Here comes the meat in the customisation sandwich; browse to the .vssettings file location (where you have stashed one or more settings files) and pick one of your choosing, then click Next:

Select New Settings File.

Check the dialog that pops up next for any warnings and, if you are happy, proceed by clicking Finish.

Confirm and Finish.

Finally, you should get a confirmation that the import process has completed successfully (or not, but I haven’t seen an outright failure yet!):

SSMS Theme Import Complete.

You should, all being well, see an immediate update to the theming in the code window, as shown:

SSMS Smooth Theme.

That’s all there is to it! Again, if anyone has a good resource for themes please let me know, I’m always on the lookout for the cool and unusual.

As an added bonus, here are a few ‘live action’ shots from the bear cave as I was constructing this post:

Thanks for reading, catch you next time!

C# Interactive Initialisation Trick

Hi there,

A little aside this one; something very, very cool that I’ve noticed with the advent of Visual Studio Update 2. Without question, this rates as a MUST CHECK OUT Visual Studio feature.

The first complaint (and a valid one, no question about it) regarding the C# Interactive Window was that running your own code was a little bit of a painstaking process, requiring you to load dlls and supporting references manually, using the #r command. Now, with Update 2, initialising C# Interactive with your application’s context is a cinch.
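
For anyone who never suffered the old workflow, it went something along these lines in the C# Interactive Window (the dll path and namespace here are invented, purely for illustration):

> #r "C:\Source\MyProject\bin\Debug\MyProject.dll"  // manually reference the compiled dll (hypothetical path)
> using MyProject.Widgets;                          // hypothetical namespace
> var widget = new WidgetFactory().Create();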

For starters, right-click on the project you are interested in and select the Initialize Interactive with Project option, within the Solution Explorer:

Initialise Interactive with Project Option.

This, as you’ll see, does all of the ‘scaffolding’ work for you within the C# Interactive Window, and injects a using statement for the default namespace to get you started. You should see something like the following:

C# Interactive Window Initialised with Project.

From here on in, you are free to start adding using statements, as required, and begin writing code against the types in the target project; in a completely pain-free fashion. A massive thumbs up…you can now go wild, like this for example:

Using Project Code from C# Interactive.

I did try this on a larger, more complex and tightly-coupled code base and I can comfortably say you’ll have issues, depending on how things are structured. Smaller, more loosely-coupled types, where SOLID principles have been adhered to, are going to allow you to use C# Interactive in this way with more ease.

It is also possible to highlight portions of code, in any file, and right-click the Execute in Interactive option to run with, and execute, smaller chunks of functionality.

Posts on the way to cover some elements of the ASP.NET Web API and Ninject Dependency Injection (hopefully, at some point over the weekend), so watch this space.

Have a good weekend all and I’ll be back with you in a jiffy…

Coding Snippets in Sublime Text 3

Hello,

This is my coup de grâce into coding snippets, which completes this mini-series (for now, at least). I wanted to do a tiny bit more Python, so began investigating which IDE to look into next:

Stackoverflow – Which Python IDE
Best Python IDE

The Eric IDE project caught my eye initially, and definitely looks like an incredibly interesting prospect. However, I find myself inexorably drawn to try out Sublime Text 3, only because I seem to have crossed paths with a fair few people using it. If anyone is interested in checking out the Eric IDE I’ve included a set of links below for further investigation:

Prerequisites

Eric IDE Prerequisites Listed
Python
PyQt

Downloads

Eric IDE Sourceforge Download

For now, let’s focus our efforts on Sublime Text 3…

Sublime Text 3

To get started, you can navigate to the Sublime Text 3 downloads page and grab the appropriate package for your OS. After reading up, there appears to be a project accessible on GitHub called Anaconda which ‘turns Sublime Text 3 into a full featured Python development IDE’. In order to utilise this functionality, you’ll first need (well, this is the easiest way from the options listed) Package Control, which is essentially a Package Manager for Sublime.

Here are a few basic notes for getting Anaconda setup with Sublime Text 3:

Notes on setting up Anaconda.

In addition, this blog post looks like a strong starting point for investigation into setting up Sublime Text 3 for Python development:

Sublime Text as Python IDE

Just to prove the point, here is Anaconda showing its presence via a context menu in the IDE:

Anaconda in use.

I’ve taken a few additional steps here. For starters, I wanted to run my code directly within Sublime, which requires configuring the correct Build System, as follows:

Setting Python Build System.

After performing the Build System configuration you can decide on whether you want to use syntax-only style checks or physically run your code on build (‘Tools > Build With‘):

Sublime Build With.

So, as far as setup goes, that covers it. On to snippets…

Snippets in Sublime

To the main event, how do you create snippets in Sublime Text 3?

Creating the snippet itself couldn’t be simpler, just navigate to ‘Tools > Developer > New Snippet…’ and you’ll be presented with the following:

Snippet Default Setup.

The idea behind creating a snippet is conceptually similar to how you would create them for use within Visual Studio. Within the content element (again, we are dealing with an XML format) you’ll find a CDATA definition; snippet code goes between the square brackets. Placeholders can be defined within the code snippet using the $ symbol followed by an incremental number, starting at one (depending on how many placeholders you have, of course). The tabTrigger element defines the shortcut that needs to be typed into the IDE code window to bring the snippet into scope. Additional metadata can also be provided to specify the scope of a certain code snippet (e.g. just Python) and a friendly, textual description (stored in the aptly named description element). It sticks pretty close to other snippet creation conventions I’ve seen elsewhere, and it’s certainly a cinch to pick up. Here is my first stab at a completed snippet, with comments left inline for reference:

<snippet>
	<content><![CDATA[
def saySomething(name):
	print("$1", name)
]]></content>
	<!-- Optional: Set a tabTrigger to define how to trigger the snippet -->
	<tabTrigger>ss</tabTrigger>
	<!-- Optional: Set a scope to limit where the snippet will trigger -->
	<scope>source.python</scope>
    <description>Say Something Snippet</description>
</snippet>

Snippet files are saved using the .sublime-snippet extension and, for the purposes of a quick demonstration, I’ve saved mine in the default location of ‘C drive > Users > MY_USER > AppData > Roaming > Sublime Text 3 > Packages > User’. This would require a bit more thought if you were creating more snippets (if you have organisational OCD such as I do). Initial reading does suggest that they should be saved under the ‘User’ directory regardless.

Within Sublime you will then need to save a file with a .py extension, ready to actually construct some Python code! The next two images show the newly created ‘say-something.sublime-snippet’ being brought into scope by keying in ‘ss’ and then hitting tab. As before, placeholders can be tabbed through for fast snippet completion:

Snippet before use.

Snippet after use.

Placeholder values, within snippets, can accept defaults by using the following convention:

<snippet>
	<content><![CDATA[
def saySomething(name):
	print("${1:Hello}", name)
	print (${2:5} + ${3:5})
]]></content>
	<!-- Optional: Set a tabTrigger to define how to trigger the snippet -->
	<tabTrigger>ss</tabTrigger>
	<!-- Optional: Set a scope to limit where the snippet will trigger -->
	<scope>source.python</scope>
    <description>Say Something Snippet</description>
</snippet>

That’s it, mini-series completed! As mentioned before I have a gargantuan mound of topics to cover, some just bubbling over in my mind and others sitting as draft posts ready for completion, so there is plenty of content on its way. Until the next time, have a great weekend!

Coding Snippets (Live Templates) in IntelliJ IDEA (OMG Java!)

Good evening folks!

This is somewhat of an experimental post. I’ve written this purely ‘in the heat of battle’, whilst I was trying to illustrate how to do this. A little bit of a departure from my usual posts, whereby I research a subject and then give you a condensed and refined product. Why did I do this? Just an interesting experiment, in all honesty. On the subject of the word honest, this style of post is a chance to be authentic and more in the moment, even if I do trip up here and there and expose myself as the occasional fool; it’s a chance to see my internal monologue as I go about trying to learn something on the fly. I hope you enjoy…

I have an amazing, now completely out of control and rather feral, list of topics in my ‘to blog about’ list. Finding myself in the mood to be a ‘completionist’ however, I really should make something of this coding snippets mini-series and pack a few more of these in before I call it quits.

Today I’m looking at IntelliJ IDEA, a JetBrains IDE for coding in Java (and more than just that, if the tag line is anything to go by). This IDE has been selected out of the magic hat just because I wanted to go for something that I haven’t heard too much about before, rather than a name more frequently bandied about, such as the NetBeans IDE. It received a solid mention in Robert C. Martin’s ‘The Clean Coder’, which I’ve just finished digesting, so it seems most worthy of an intellectual punt!

Getting your hands on IntelliJ IDEA

To get mitts on this IDE is simple enough. I started by going here:

Choose your Edition

There you’ll find links to a Community and Ultimate edition (with Ultimate having a free, thirty-day trial). As I’m not doing a review per se on this software I won’t dig into the particulars of each edition too heavily. For anyone wondering what the different offerings entail you can navigate to the Compare Edition page and have a gander. What I will say is that language support beyond Java is most certainly a thing (.Net languages are out, as you may expect) and if you’re using Git you appear to be golden in relation to the Community edition; only Ultimate supports TFS.

Let’s get this install done and figure out a) how to write some Java and b) encapsulate something into a snippet.

Installing IntelliJ IDEA

Installing the product is, as you would expect, a next…next…next affair. I opted for the desktop shortcut (I have a messy desktop) and rigged it so that anything with a .java file extension would be associated with this IDE. On running the application you get prompted to import previous, custom settings; I’ve got none, so an easy choice for me. After reading those pesky ‘policies’ (that we all read in minute detail, of course!) you get, which is a nice touch, an opportunity to pick a theme (for those who know me, it’s dark themes all the way) and customise/install plugins. Again, I’m not really doing a review, but I really liked the ease of the setup process here; being accustomed to Visual Studio for the main part, I can often see times when (despite all of the functionality it does offer, I’m not bashing on it) its heavyweight nature does show through, not always for the best.

So, we’re ready to rock and roll; being presented with the following screen:

IntelliJ Start Screen.

Making that Magic Coding Snippet!

Ok, so far, so good. I’m hoping that (although I’ve gazed at a fair bit of Java through my time) not having written a single line of Java is not going to be a ‘blocker’ to success ;-). The remit of this blog post is just to show how to create a code snippet, no matter how simple that is. Besides, I have read and digested the following books, so I’m feeling ready:

Googling the Error Message.

Copying and Pasting from Stack Overflow.

Create New Project seems like the way to go! I proceeded using the following tutorials by the way, should you wish to follow along:

IntelliJ IDEA Documentation

My first issue appeared to be the fact that I didn’t have the Software Development Kit installed (JDK in this instance), so I navigated here to grab one, for 64 bit Windows in my case:

JDK Downloads

Time for a cup of tea at this point; find a suitable beverage for yourself as you continue to read through, you’ve earned it by hanging on this long :-). On a side note, whoever bought me this mug needs to take a trip to the opticians. I do wonder if I could rock a twirled moustache too:

Someone needs an Optician.

I downloaded and installed this package without any bother. Going back to create a new project again, I was able to (yes, I Stack Overflowed my way to glory here – I rather idiotically selected the ‘jre’ folder) hit New and navigate to the root JDK folder under my default Java install directory, as follows:

JDK Directory Selection in IntelliJ IDEA.

This brings me to a promising point on hitting next…I can make a command line application which, right now for me, is grade-A territory; that will do nicely, thank you very much (I know where I stand with a Main() method, after all):

Create a Command Line Application in IntelliJ.

After being nosey about the Java rules of Package Naming, and naming my actual project, I was presented with this amazing sight:

IntelliJ Simple Command Line Application.

As you can probably see the project name is ‘FizzBuzz’ related, so I’m going to write code to solve this classic problem and get a code snippet in along the way.

Ten minutes later I have this, which, apart from questionable string handling, does the job I’m after (I just needed to figure out how to print content to the output window and convert integers to strings; I refrained from googling here and just went on an IntelliSense rampage, so apologies if any of this is incorrect). The output, without any unit tests to back it up, gives me the expected ‘Fizz’ for multiples of three, ‘Buzz’ for multiples of five, ‘FizzBuzz’ for multiples of three and five, or just the number for everything else.

package lg.snippet.testing;

public class Main {

    public static void main(String[] args) {
        fizzBuzzerTest();
    }

    private static void fizzBuzzerTest() {
        for (int j = 1; j <= 100; j++) {
            System.out.println(evaluateInput(j));
        }
    }

    private static String evaluateInput(int fizzBuzzInput) {
        String fizzBuzzString = "";

        if (fizzBuzzInput % 3 == 0) {
            fizzBuzzString = "Fizz";
        }

        if (fizzBuzzInput % 5 == 0) {
            fizzBuzzString += "Buzz";
        }

        if (fizzBuzzString.isEmpty()) { // compare string content, not references
            fizzBuzzString = Integer.toString(fizzBuzzInput);
        }

        return fizzBuzzString;
    }
}

FizzBuzz Command Line Output.

That’s the best piece of Java I will fashion today, happy for it to be ripped to shreds by all the experts out there.

So, although this has been somewhat of a ramble, we are now ready to fashion a code snippet. I’m going to make the snippet encapsulate the fizzBuzzerTest method and supporting evaluateInput method, as a complete package that can be dropped anywhere. The way this is achieved, after a little research, is via Live Templates.

To get started, navigate to File > Settings and select the Live Templates section under Editor. In this next screenshot, I’m creating a new custom Template Group to act as a repository for my own weird and wonderful snippets:

Live Template Options.

After creating the custom template group you can repeat the process (with the new group highlighted) to add a new Live Template. The template itself has a few configuration options that need to have valid settings before you can proceed. Firstly, define what ‘contexts’ this Live Template can operate within; for the purposes of this little example only Java is applicable:

Defining the Live Template Context.

I gave my Live Template an abbreviation (allowing easy access from the editor), a Description (the one you’ll see in this screenshot I amended before saving, as it was blooming terrible!) as well as the actual code that defines the template. I’ve placed the variable COUNT into my template, which allows the template user to switch out the value when the template is brought into scope (with a default value as shown here; the dialog for managing this is accessible via the Edit Variables button). Variables must be wrapped in ‘$’ symbols, and default literals must be wrapped in double quotes in order for this to work (another google-fest here!):

Completing the Live Template Definition.
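
For reference, the Template text I ended up with looked something like the following (a sketch from memory, so treat the exact body as illustrative rather than gospel):

// Live Template body; $COUNT$ is the editable variable (defaulted to "100"
// via the Edit Variables dialog)
private static void fizzBuzzerTest() {
    for (int j = 1; j <= $COUNT$; j++) {
        System.out.println(evaluateInput(j));
    }
}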

Remit basically complete…we’ve done snippets (well, Live Templates) in Java via IntelliJ IDEA! The next two screenshots show the template in use, achieved by typing the abbreviation and hitting tab to generate the snippet. We are also illustrating the placeholder for the COUNT variable being editable. The variables can be tabbed through, as you would expect, for fast template completion:

Before Live Template Usage.

After Live Template Usage.

There’s our random walkabout of IntelliJ IDEA and Live Templates done and dusted, mission complete. Cheers all, until the next time happy coding!

Future Decoded 2015 Play-by-play

Hello beautiful people!

It’s a fantastic, gorgeous Saturday morning (it’ll be Monday by the time I hit the publish button, such is the enormity of the post!); the birds are chirping, the sun is shining through the balcony windows (and there is a bloody wasp outside, STILL!!!) and my wife has left me…………to go on a girly weekend (that probably sounded more alarming than intended; hmmm, oh well, it stays!). Whilst she is away fighting the good fight, this gives me the opportunity to go over my thoughts on the recent Future Decoded 2015 event that took place at ExCeL in London.

The links to outline this event have been posted before on my blog, but just in case, here are the goods again:

Future Decoded 2015
Future Decoded 2015: Technical Day Highlights

Before we begin, it’s worth pointing out that I attended this event a couple of weeks ago, so apologies if any inaccuracies pop up. I’ll do my best to stick to the facts of what I can remember and specific points that interested me; other commitments ended up preventing me from getting to this particular post sooner. You’ll all let me off, being the super gracious, awesome folks you are, I’m sure :-).

So, FIGHT!!!!!

Sorry, I had a dream about Mortal Kombat last night and upper-cutting people into the pit – What a great stage that was! Ah, the memories….Let’s begin/start/get on with it then.

Morning Key Notes

The morning Key Notes were varied and expansive in nature. I won’t discuss all of them here, only the takeaway points from the talks that struck a chord with me.

1) Scott Guthrie. EVP Cloud and Enterprise, Microsoft (Azure).

I was particularly looking forward to this talk, being a keen follower of Scott Guthrie (and Scott Hanselman), and I normally try to catch up with Channel 9 features and Azure Fridays whenever possible (I’ve linked both, although I’m sure most of you, if not all, have come across Channel 9 before or heard of Azure Fridays).

The talk did have primer elements, as you would expect, i.e. here’s the Azure Portal and what you can expect to find (in relation to resources, templates you can access for applications, services, Content Delivery Networks (CDNs), etc.). The next bit really caught me cold; who was expecting a giant image slide of a cow! I certainly wasn’t…

Estrus in Cows

What followed was a full example of real-time data recording and assessment surrounding the monitoring of cows in Asia. I’ve provided a link below that sums up the concept of Estrus (being in heat) nicely enough, but in layman’s terms it relates to cows ‘being in the mooooooood’ (my wife insisted I added that joke). Obviously, a farmer’s ability to accurately detect this, urm, state of being in a cow is an incredibly important factor in the ability to produce calves.

It turns out that a cow tends to move more when in the Estrus state; something that can certainly be measured. So, with pedometers attached to cows to measure steps taken and an Azure based service receiving and providing feedback in real-time, the farmer in question was able to take action to maximise calf production. Further to this, analysis of the data gathered was able to identify trends against how long cows have been in the Estrus state, and the gender of offspring. Crazy stuff, but all very interesting. Feel free to read further to your hearts content:

Cow Estrus Detection

The Internet of Things (IoT) was briefly touched on and another brief, live coding example ensued.

Scott produced a small, bog-standard heat sensor (apparently just a few pounds; I was impressed he didn’t say dollars!) and proceeded to demonstrate a basic WinForms application passing a JSON payload to Azure in real-time (measurements taken a few times a second). This strikes me as exciting territory, and I have friends who already develop applications working in tandem with sensors, backed up by technologies such as the Raspberry Pi and Arduino, for example. The talk closed with the conceptual idea that the majority of data in the world today is still largely unmeasured, and the hope that Azure would be an important platform in unlocking developers’ potential to measure previously untapped data.

2) Kevin Ashton. Inventor of the “Internet of Things”.

Kevin coined the term the Internet of Things (IoT), and gave a very good talk on what this means, as well as identifying certain ‘predictions’ for the future. For instance, that we, as a species, would survive climate change for one. He quickly noted that calling ‘BS’ on this particular one would be tricky should we suffer a doomsday style event at the hands of climate change (I don’t imagine the last thoughts of humanity to be, ‘oh, Kevin Ashton was so bloody wrong!’). Another interesting prediction; we would all own a self-driving car by 2030. Prototype examples already exist, such as Googles (and Apples) efforts, and the Tesla:

Google/Apple (Titan) Self Driving Cars
The Tesla

Self-driving cars being one of the cases in point, the IoT relates to how a whole new host of devices will now become ‘connected’. Besides cars rigged up to the internet, we are all aware of the hooking up of internal systems in our homes (heating, etc) and utility devices (the washing machine), as to always be online and accessible at a moments notice. This world isn’t coming per say, it’s essentially already here.

Pushing past this initial definition, Kevin was keen to stress that the IoT was not limited in its definition to just ‘the connecting of hardware to the internet’ only. Wiki sums this up quite nicely on this occasion, but software (services and analytics) that moves forward with hardware changes will ultimately change the way we live, work, shop and go about our daily lives. Whether this be data relayed from the fridge to google glasses (yes, you are out of milk!), or perhaps a self-driving car ordering ‘click and collect’ shopping and driving you to the collection point after work (not to mention triggering the heating x miles from home!). Software, and the analysis of the new kinds of data we can record from interconnected elements, will be a huge driving force in how our world changes:

Internet of Things (IoT)

Lastly, before I forget and move on, a key phrase voiced several times (although I cannot remember the exact speaker, so apologies for that, it was probably David Chappell) was to reset your defaults. Standard client/server architecture was discussed, and for those of us that are part of long running businesses this is what we are exclusively, or at least partially, dealing with on a daily basis still. However, the change to the use of mobile devices, tablets, etc, as clients and the cloud as the underpinning location for the services these clients communicate with is becoming the norm. For start-ups today, mobile first development and the cloud (Azure or Amazon Web Services (AWS)) are probably the initial go-to.

For some of us (speaking from a personal standpoint only), a major factor in our success as developers could simply be determined by understanding the cloud and getting the necessary experience to make the transition (for those who are not actively taking part in this world of course).

So, now we have the IoT, let’s talk security…

3) Graham Cluley. Security Analyst, grahamcluley.com.

Graham delivered a funny and insightful talk surrounding everyones’, ‘Oh my God, the horror, please kill me’ subject, the wonderful world of security.

In a nutshell, he argues (and certainly proves his point as you’ll read next) that the IoT will bring wonders to our world, but not without issues. We now have a scenario whereby a breadth of new devices have suddenly become internet connected. However, are the driving forces behind these changes the people who are used to dealing with the murky world of malware, viruses and hacking attempts (such as OS developers)? Probably not, is the initial answer. This is, of course, just a cultural divide between those used to trans-versing the security world and protecting devices from such attacks, and those tasked with bringing new devices to the interconnected world.

The hacking of self-driving cars (big topic it would seem) was discussed:

Fiat Chrysler Recalls

Also, the potential of hacking pacemakers was covered (bluetooth/wifi enabled), famously featured in the TV series Homeland and which actually lead to Vice President Dick Cheney’s cardiologist disabling the wireless functionality of his device:

Pacemaker Hacking
Could Pacemakers Be Hacked?

Although funny, the talk did indeed bring up a very serious issue. The ramifications could be catastrophic, depending on the types of devices that ultimately end up being exposed to the masses via the web. Essentially, as the IoT age develops, extra care must be taken to ensure that security is right on up there, in the hierarchy of priorities, when developing software for these devices.

4) Chris Bishop. Scientist and Lab Director, Microsoft Research.

The last talk I would personally like to discuss briefly was by Chris Bishop; there were a few great nuggets here that are well worth covering.

The idea of Machine Learning (not a topic I was overly familiar with for starters), Neural Networks and Pattern Recognition laid the foundation for a talk looking at the possibility of producing machines with human-level, or even super-human, intelligence.

The Microsoft Kinect was used to demonstrate hand-tracking software that, I have to admit, had an incredible amount of fidelity in recognising hand positions and shapes.

Lastly, a facial recognition demonstration that could estimate, with good accuracy, the emotional state of a person was kicked off for us all to see. Very, very impressive. There was most certainly an underlying feeling here (and as much was hinted at) that his kind of technology has many hurdles to jump. For instance, building something that can consume an image and accurately describe what is in that image is still a flaky concept, at best (and the difficulties of producing something capable of this are relatively vast).

Still, a greatly enjoyable talk! A book was touted, and I believe (please don’t shout at me if I’m wrong) this is the one:

Pattern Recognition and Machine Learning

After the morning Key Notes, a series of smaller talks and break-out sessions were available to us. Here’s how I spent my time…

Unity3D Grok Talk

Josh Taylor. Unity Technologies.

It’s my sincere hope that, on discovering this, my employer won’t decide to sack me! This was over lunch and was a self-indulgent decision I’m afraid! You’ll know from some of my historical posts that I have a keen interest in Unity3D (and have spent time making the odd modest prototype game here and there), and I was interested to see how Unity 5 was progressing, especially as a greater cohesive experience with Visual Studio had been promised.

In this short, 20 minute talk, we experienced how Visual Studio (finally) integrates nicely into the Unity3D content creation pipeline. Unity3D now defaults to using Visual Studio as the editor of choice, with Monodevelop being pushed aside. Apologies to anyone who likes Monodevelop, but I’ve never been able to get behind it. With wacky intellisense and with what I can only describe as a crash-tastic experience in past use, I haven’t seen anything yet to sway me from using Visual Studio. In fact, it was demonstrated that you can even use Visual Studio Code if you wish and, as it’s cross-platform, even Mac and Linux users can switch to this if they wish. More reasons to leave Monodevelop in the dust? It’s not for me to say really, go ahead and do what you’ve got to do at the end of the day!

In order to debug Unity projects in Visual Studio in the past a paid for plugin was required. This particular plugin has been purchased by Microsoft and is now available to all. Being able to easily debug code doesn’t sound like much, but trust me it’s like having a basic human right re-established – such good news!!!

The new licensing model was also commented on, a massive plus for everyone. The previous Free/Pro divide is no more; now everyone gets access to the lions share of the core features. You only need to start spending money as you make it (fair for Unity to ask for a piece of the pie if you start rolling in profit/expanding a team to meet the new demand). For me, this means I actually get to use the Unity Pro water effects, hoorah ;-).

Following this, I spent a bit of time last weekend watching the Unite 2015 Key Notes, discussing 2D game development enhancements, cloud based builds and Oculus support. Well worth a look if and when time allows:

Unite 2015 Key Notes

Plus, if Oculus technology interests you, then it’s definitely worth watching John Carmacks (formerly of ID Software, the mind behind Wolfenstein and Doom) Key Note from the Oculus Connect 2 event:

John Carmack Oculus Keynote

Very exciting times ahead for Unity3D I believe. Self-indulgence over, moving forward then…

Journey to the Intelligent Cloud

Corey Sanders. Director of Program Management, Azure.

Following the Unity3D talk, I made my way back to the ICC Auditorium (I missed a small section of this particular talk, but caught the bulk of it) to catch up on some basic examples of how the new Azure Portal can be used. This took the form of a brief overview of what’s available via the portal, essentially a primer session.

In my recent, personal work with Azure I’ve used the publishing capability within Visual Studio to great affect; it was very transparent and seamless to use by all accounts. A sample was provided within this particular session which demonstrated live coding changes, made in GitHub, being published back to a site hosted on Azure.

Going into a tangent….

Very much a personal opinion here, but I did find (and I wasn’t the only one) that a good portion of the content I wanted to see was a) on at the same time (the 1:15pm slot) and b) was during the core lunch period where everyone was ravenous, I’m a ‘hanger’ sufferer I’m afraid. C# with Mads Torgerson, ASP.NET 5, Nano Servers and Windows 10 (UWP) sessions all occupied this slot, which drove me a little nuts :-(. This felt like a scheduling issue if I’m honest. I’d be interested to hear from anyone who did (or didn’t) feel the same.

I was so disappointed to miss Mads Torgerson, I very much enjoyed the recent C# language features overview and would have loved to have made this breakout session! I did walk past him later in the day, and I hope he never reads this, but he seemed ridiculously tall (perhaps Godly C# skills made him appear several inches taller, who knows!). It doesn’t help that I’m on the shorter side either, I just wanted to be 5′ 11″, that’s all I ever wanted (break out the rack, I need to get stretching!). I should have said hello, but wimped out!

F# Language Breakout Session

Don Syme. Principal Researcher, Microsoft Research.

This was easily the part of the event that resonated the most with me, and strongly influenced the foray into F# that I undertook recently. Don Syme, the designer and architect of the F# language, took us through a quality primer of the syntax and how F# can be used (and scaled) for the cloud.

All of this aside, the most impressive part of the talk was a live demonstration of F# Type Providers. Again, this is fully covered in my previous post so I’ll just direct you to that, which in turn will aid me in cutting down what is now becoming a gargantuan post. In summary, the ability to draw information directly from web pages, rip data straight from files and databases, and combine and aggregate it all together using minimal code produces a terse, easy to understand and pretty darn good experience in my book. Even the code behind producing visual feedback, in the form of the charting API, is succinct; the bar really isn’t set too high for new starters to get involved.

If you decide to give anything a go in the near future, I would give F# the nod (followed closely, just a hair’s breadth away, by jQuery in my opinion). Certainly check it out if you get the chance.

Final Key Note

Professor Brian Cox. Physicist.
Krysta Svore. Senior Researcher, Microsoft Research.

The day proceeded in fast forward and, before we’d really had the chance to gather our thoughts, we were sitting in the main auditorium again faced by Professor Brian Cox, Krysta Svore and a menagerie of confused attendees staring at mathematical formulas outlining quantum theory.

Into the wonderful world of quantum computers we dance, and in my case, dragging my brain along from somewhere back yonder in a desperate attempt to keep up. Thankfully, I’m an avid TED talk fanatic and had, in the run up to the event, brushed up on a few quantum theory and quantum mechanics videos; lucky I did really. The content was dense but, for the most part, well put together and outlined the amazing (and potentially frightening) world of possibilities that quantum computers could unlock for us all.

Professor Brian Cox cruised through the theories we’d need to be intimate with in order to understand the onslaught of oncoming content surrounding quantum computers. In essence, a traditional ‘bit’, has a defined state (like a switch), on or off. However, and this is the simple essence of what they were trying to get to, traditional bits are reaching limitations that will prevent us from solving more complex problems, in a timely manner (you’ll see what I mean in a second). Therefore, qubits, born from quantum theory, are the answer.

Now, I’m not going to insult your intelligence and go into too much detail on a subject that I am clearly not an expert in. So, just in ‘laymen’s bullet points’, here is what I took from all that was said and done across the Key Note:

  • With bits, you are dealing with entities that can have a fixed state (0 or 1). A deterministic system if you will, that has limitations in its problem crunching power.
  • Qubits, however, take us into the realm of a probabilistic system. The qubit can be in a superposition of all of the allowed states, not just 0 or 1.
  • Therefore, the problem crunching powers of qubits are exponential in nature, but the probabilistic nature makes measuring them (and interactions involving them) difficult to get to grips with.

So is it worth fighting through the technical problems in order to harness qubits? What kind of gains are we talking about here?

Krystra Svore outlined an example that displayed that it would take roughly one billion years for a current super computer to crack (more complex than standard) RSA encryption. How long would it take a quantum computer you may ask? Significantly faster is the answer, estimated at around one hundred seconds in fact. This clearly defines for us the amazing problems we’ll be able to solve, whilst simultaneously illustrating the dangerous times that lay ahead from a security standpoint. Let’s just hope cryptography keeps up (I can see a few sniffs to suggest things are in the pipeline, so I will keep an eye out for news as it pops up).

So you want a quantum computer I hear you say! Hmmm, I wouldn’t put it on the Christmas list anytime soon. Due to the fact current quantum computers need to be super cooled (and from the pictures we got to see, didn’t look like you could hike around with it!), we’re not likely to get our hands directly on them in the near future.

Can you get your mitts on quantum simulators today? Apparently yes in the answer (completed untested links, just for you to peruse on your own, good luck):

QC Simulators
Project Liquid

Taking nothing away from the Key Note though, it was a concrete finish to an excellent event. Would I go again? You bet! Should we get the train next time instead of driving? Taking into account the mountains of free beer and wine on offer, of course! To finish up, before summarising the Expo itself, if you haven’t been and get the opportunity (in fact, actively seek the opportunity, enough said) then definitely book this in your calendar, thoroughly brilliant.

Expo

Very, very quickly, as I am acutely aware that your ability to focus on this post (if not already) must have completely diminished by this point, I wanted to describe what the Expo itself had to offer. If you’re still reading, give yourself a pat on the back!

One of the more compelling items we saw was the use of the new Lumia phone as a (kind of) desktop replacement attempt. Let’s get one thing straight, you’re not going to be doing hardcore software development using Visual Studio or any other intensive task on this device anytime soon. However, there was certainly enough evidence to suggest that basic productivity tasks would be possible using a mobile phone as a back bone to facilitate this.

The Lumia can be hooked up to a dock, akin to the Surface Pro 4 (the docks are subtly different apparently, so are not cross-compatible), and that allows it to be tied to a display device. You can also get a folding mouse and keyboard, for a very lightweight, on-the-go experience. Interesting certainly, but there is a definite horse-power issue that will prevent anyone working on anything remotely intensive from getting on board. Anyway, for those interested the link below will get you started:

Lumia Docking Station

I saw a few Surface Pros, and wondered whether we could potentially smuggle a few out of the Expo! Only kidding, no need to call the Police (or for anyone I work with thinking I am some kind of master criminal in the making) :-).

An Oculus demonstration booth was on the Expo floor, and displays were hooked up to show what the participants were experiencing. It was noted that a few of the people using the Oculus seemed to miss the point a bit, and kept their head completely still as they were transported through the experience. Once the heads started moving (to actually take in the world) you could visibly see people getting incredibly immersed. Alas, the queues were pretty darn large every time I made my way past, so I didn’t get a chance to experience it first-hand. One for the future.

There was also a programmable cocktail maker, an IoT masterpiece I think you’ll agree. A perfect union of hardware, software and alcohol, a visionary piece illustrating the future has arrived!

The next time an event like this comes around I will endeavour to get a post up in a timely fashion (which will vastly improve the content I hope).

Thanks for reading and a high five from me if you made it this far. Back to coding in the upcoming post I promise, until the next time, cheers from me (and would you believe it, it’s now Tuesday)!